Human Generated Data

Title

Untitled (three women seated on flowered couch)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10542

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 98.3
Person 96.1
Person 93
Art 86.3
Leisure Activities 79.4
Sitting 79
Musical Instrument 71.6
Guitar 70
Musician 69.6
Furniture 65.8
Female 64.9
Drawing 63.8
Photography 63.2
Photo 63.2
Portrait 62.2
Face 62.2
Girl 58.5
Painting 58.2
Canvas 56.9
Guitarist 56.9
Performer 56.9
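
Label/confidence pairs like the list above are the output shape of the AWS Rekognition DetectLabels API. A minimal sketch of how such tags could be regenerated with boto3 follows; the file name photo.jpg and the confidence floor are illustrative assumptions, not values recorded here.

    import boto3

    # Rekognition client; credentials and region come from standard AWS config.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed floor; the lowest tag above is 56.9
        )

    # Each label carries a name and a percentage confidence, as listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")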

Imagga
created on 2022-01-09

man 34.9
male 29.1
person 27.7
people 27.3
adult 23.8
television 22.2
room 21.4
happy 20
office 19.6
smiling 18.1
portrait 16.8
sitting 16.3
senior 15
smile 15
computer 14.8
business 14.6
looking 14.4
men 13.7
group 13.7
home 13.6
laptop 13.4
couple 13.1
indoor 12.8
old 12.5
telecommunication system 12.4
businessman 12.4
fun 12
dark 11.7
working 11.5
light 11.4
mature 11.2
desk 11.1
lifestyle 10.8
black 10.8
indoors 10.5
love 10.3
casual 10.2
chair 9.8
modern 9.8
together 9.6
career 9.5
pretty 9.1
holding 9.1
fashion 9
classroom 9
one 9
lady 8.9
handsome 8.9
job 8.8
women 8.7
work 8.6
respirator 8.6
happiness 8.6
world 8.5
iron lung 8.3
team 8.1
sexy 8
family 8
case 7.9
patient 7.9
nightlife 7.8
education 7.8
teacher 7.7
attractive 7.7
crowd 7.7
husband 7.6
adults 7.6
device 7.5
human 7.5
board 7.2
music 7.2
hair 7.1
face 7.1
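
Imagga exposes comparable auto-tagging over REST. A sketch assuming the v2 /tags endpoint, placeholder credentials, and the same assumed local file:

    import requests

    # Placeholder credentials; real keys come from an Imagga account.
    AUTH = ("api_key", "api_secret")

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=AUTH,
            files={"image": f},
        )
    resp.raise_for_status()

    # Tags arrive as {"tag": {"en": ...}, "confidence": ...} objects.
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))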

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.9
person 92.2
clothing 83.5
black and white 75.1
human face 73
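
The Microsoft tags match the shape returned by Azure's Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders for an Azure resource:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<subscription-key>"),  # placeholder key
    )

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        result = client.tag_image_in_stream(f)

    # Confidences are 0-1 in the SDK; scaling by 100 gives the figures above.
    for tag in result.tags:
        print(tag.name, f"{tag.confidence * 100:.1f}")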

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 91.4%
Surprised 24.1%
Calm 18.6%
Angry 17.3%
Sad 13.5%
Fear 12.5%
Happy 6.1%
Confused 4.8%
Disgusted 3%

AWS Rekognition

Age 22-30
Gender Male, 98.3%
Calm 86%
Sad 10%
Surprised 1.1%
Confused 0.8%
Happy 0.6%
Disgusted 0.5%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 45-51
Gender Male, 61.9%
Happy 58.6%
Calm 34.3%
Surprised 3%
Sad 2%
Confused 0.8%
Disgusted 0.6%
Angry 0.4%
Fear 0.3%
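
Each AWS block above corresponds to one entry in the FaceDetails array returned by Rekognition's DetectFaces API when full attributes are requested. A sketch, again assuming boto3 and a local photo.jpg:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come unsorted; ordering by confidence matches the lists above.
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        for emotion in emotions:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")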

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
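
The Google Vision blocks report per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client library and the same assumed local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum; .name gives e.g. VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)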

Feature analysis

Amazon

Person 99.8%
Guitar 70%

Captions

Microsoft

a group of people sitting in front of a laptop 59.3%
a group of people looking at a laptop 59.2%
a group of people sitting and looking at the camera 59.1%
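
The ranked captions match Azure Computer Vision's describe operation, which returns several candidate sentences with confidences. A self-contained sketch using the same placeholder Azure resource as in the tagging example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<subscription-key>"),  # placeholder key
    )

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        result = client.describe_image_in_stream(f, max_candidates=3)

    # Candidates arrive ranked; confidences are 0-1, shown above as percentages.
    for caption in result.captions:
        print(caption.text, f"{caption.confidence * 100:.1f}%")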

Text analysis

Amazon

20291.
DC
in

Google

20291. 2024 20291.
20291.
2024
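
Detected strings such as "20291." can be reproduced with Rekognition's DetectText, which returns both LINE groupings and individual WORD fragments. A final sketch under the same assumptions:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        response = client.detect_text(Image={"Bytes": f.read()})

    # WORD entries yield fragments like "20291.", "DC", and "in";
    # LINE entries group adjacent words.
    for det in response["TextDetections"]:
        if det["Type"] == "WORD":
            print(det["DetectedText"], f"{det['Confidence']:.1f}")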