Human Generated Data

Title

Untitled (woman with young man holding gun)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8807

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 98.5
Interior Design 94.3
Indoors 94.3
Guitar 90.7
Musical Instrument 90.7
Leisure Activities 90.7
Clothing 88.7
Apparel 88.7
Living Room 73.3
Room 73.3
Furniture 67.5
People 66.9
Play 65.3
Screen 64.4
Electronics 64.4
Monitor 60.9
Display 60.9
Girl 59.6
Female 59.6
Shorts 57.3
Kid 56.3
Child 56.3
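
The labels above are the kind of per-image output returned by AWS Rekognition label detection. As a minimal sketch of how such name/confidence pairs are typically retrieved with boto3 (the file name and the confidence threshold below are illustrative assumptions, not values from this record):

    import boto3

    rekognition = boto3.client("rekognition")

    # Placeholder file name for an exported copy of this photograph.
    with open("steinmetz_8807.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the scores listed above run down to ~56
        )

    # Each label carries a name and a confidence percentage, as listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")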

Clarifai
created on 2023-10-26

people 99.9
group together 98.7
group 98.5
adult 97.4
music 96.4
recreation 95.9
man 95.8
woman 94.7
sitting 93.4
dancing 93.3
furniture 92
many 91.2
monochrome 90.2
education 89.6
child 89.4
musician 89.2
wear 88.4
teacher 87.5
chair 87
guitar 85.7

Imagga
created on 2022-01-09

blackboard 42.5
television 29.1
person 26.4
people 25.1
man 23.5
education 21.6
classroom 18.9
male 18.4
telecommunication system 18.4
case 17.6
teacher 17.3
board 16.4
room 16.2
class 15.4
chair 14.7
college 14.2
indoors 14
student 13.6
school 13.6
business 13.4
businessman 13.2
adult 13.1
indoor 12.8
black 11.4
group 11.3
study 11.2
desk 11
lifestyle 10.8
chalkboard 10.8
university 10.6
job 10.6
men 10.3
science 9.8
women 9.5
happy 9.4
equipment 9.3
communication 9.2
portrait 9.1
teaching 8.8
standing 8.7
exam 8.6
drawing 8.6
learn 8.5
shop 8.4
office 8.3
silhouette 8.3
computer 8.3
music 8.2
interior 8
work 7.8
math 7.8
smile 7.8
high 7.8
lesson 7.8
sitting 7.7
casual 7.6
hand 7.6
meeting 7.5
holding 7.4
design 7.3
cheerful 7.3
exercise 7.3
to 7.1
device 7
modern 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.3
person 93.5
clothing 93.4
man 86.2
black and white 65.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-51
Gender Female, 79.1%
Calm 97.9%
Sad 0.8%
Fear 0.3%
Surprised 0.3%
Confused 0.2%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 24-34
Gender Female, 96.2%
Calm 99.6%
Sad 0.3%
Surprised 0%
Confused 0%
Angry 0%
Happy 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 23-31
Gender Female, 55.5%
Calm 53.9%
Sad 39%
Angry 2.1%
Fear 1.4%
Happy 1.1%
Disgusted 0.9%
Confused 0.9%
Surprised 0.8%
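
The age ranges, gender estimates, and emotion scores above correspond to AWS Rekognition face detection. A minimal sketch of that call, again with a placeholder file name rather than anything taken from this record:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8807.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion scores
    # for every detected face.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # e.g. {'Low': 45, 'High': 51}
        gender = face["Gender"]   # e.g. {'Value': 'Female', 'Confidence': 79.1}
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f"Age {age['Low']}-{age['High']}, "
              f"Gender {gender['Value']} {gender['Confidence']:.1f}%")
        for emotion in emotions:
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")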

Feature analysis

Amazon

Person 99.6%
Guitar 90.7%

Captions

Microsoft
created on 2022-01-09

graphical user interface 27%

Text analysis

Amazon

.74
39474.
M.

Google

39474. •74
39474.
•74
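
The short strings above are OCR detections of markings visible in the photograph (likely negative or proof numbers). As a sketch, the Amazon entries could be produced by Rekognition's text-detection call; the file name is again a placeholder:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8807.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Line-level detections correspond to the short strings listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))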