Human Generated Data

Title

Untitled (wedding guests seated at a reception)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10674

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 98.9
Human 98.9
Person 98.5
Person 98.3
Person 95.4
Person 94.3
Person 93.2
Furniture 91.7
Couch 91.7
Person 91
Person 90.1
Person 89.7
Apparel 87.9
Clothing 87.9
Person 84
Person 83.2
Helmet 81.7
Person 80.2
Person 71.4
People 71.3
Sitting 70
Chair 68.2
Indoors 62.4
Room 62.4
Living Room 60.3
Suit 58.8
Overcoat 58.8
Coat 58.8
Photography 57.8
Photo 57.8
Screen 55.4
Electronics 55.4
Monitor 55.4
Display 55.4

Imagga
created on 2022-01-15

man 39.7
room 37.6
person 35
people 34.1
classroom 33.8
male 33.5
indoors 33.4
home 27.9
smiling 26.8
sitting 26.6
adult 26.6
teacher 26.5
professional 25
salon 23.6
hospital 23.6
men 23.2
office 22.2
women 22.2
group 21.8
meeting 21.7
happy 20.7
businessman 20.3
together 20.2
business 20.1
computer 19.3
table 19.1
couple 18.3
work 18.1
senior 17.8
patient 17.5
working 16.8
talking 16.2
indoor 15.5
team 15.2
lifestyle 15.2
educator 14.8
education 14.7
occupation 14.7
cheerful 14.6
colleagues 14.6
businesswoman 14.6
desk 14.2
interior 14.2
teamwork 13.9
smile 13.6
communication 13.4
worker 13.4
executive 13.1
mature 13
barbershop 12.8
happiness 12.5
student 12.3
laptop 12.2
chair 12.2
horizontal 11.7
color 11.7
two people 11.7
couch 11.6
child 11.5
businesspeople 11.4
shop 11.4
hairdresser 11.4
corporate 11.2
clinic 10.9
family 10.7
adults 10.4
school 10
holding 9.9
modern 9.8
nurse 9.8
discussion 9.7
technology 9.7
30s 9.6
females 9.5
togetherness 9.4
day 9.4
enjoyment 9.4
service 9.3
life 9.1
portrait 9.1
board 9.1
suit 9
case 8.9
20 24 years 8.9
living room 8.8
restaurant 8.8
retired 8.7
mid adult 8.7
class 8.7
elderly 8.6
sofa 8.6
friends 8.5
presentation 8.4
mercantile establishment 8.4
back 8.3
girls 8.2
children 8.2
sick person 8.1
looking 8
to 8
70s 7.9
casual clothing 7.8
boy 7.8
students 7.8
40s 7.8
studying 7.7
old 7.7
casual 7.6
two 7.6
college 7.6
enjoying 7.6
learning 7.5
screen 7.5
inside 7.4
20s 7.3
cup 7.2
face 7.1
kid 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 98.8
clothing 93.5
table 87.1
woman 86.2
text 78
house 69.9
group 56.9
wedding dress 54
man 52

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Female, 97.8%
Happy 70.6%
Calm 27.3%
Surprised 1.1%
Disgusted 0.3%
Sad 0.3%
Confused 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 80.3%
Calm 99.8%
Happy 0.1%
Fear 0%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Sad 60.1%
Calm 13.1%
Happy 8.3%
Confused 8.3%
Disgusted 5.9%
Surprised 2.9%
Fear 0.9%
Angry 0.7%

AWS Rekognition

Age 24-34
Gender Male, 99.8%
Calm 44.3%
Disgusted 18.7%
Sad 12.3%
Angry 8.6%
Confused 7.9%
Surprised 4.9%
Happy 1.7%
Fear 1.4%

AWS Rekognition

Age 29-39
Gender Male, 64%
Calm 96%
Sad 2%
Happy 0.7%
Surprised 0.5%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 99.7%
Calm 90.3%
Surprised 5.9%
Sad 1.5%
Happy 1.2%
Confused 0.4%
Disgusted 0.4%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Male, 95%
Happy 51.7%
Sad 30.9%
Confused 8.6%
Calm 4.7%
Disgusted 1.6%
Angry 1.5%
Surprised 0.7%
Fear 0.5%

AWS Rekognition

Age 53-61
Gender Male, 98.4%
Surprised 26.5%
Sad 23.6%
Confused 18%
Happy 10.8%
Angry 7.1%
Fear 6.2%
Disgusted 5.4%
Calm 2.5%

AWS Rekognition

Age 33-41
Gender Male, 98.8%
Calm 91%
Happy 6%
Sad 1.4%
Fear 0.5%
Confused 0.5%
Angry 0.2%
Disgusted 0.2%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Helmet 81.7%
Chair 68.2%

Captions

Microsoft

a group of people sitting at a table 84.3%
a group of people sitting around a table 84.2%
a group of people in a room 84.1%

Text analysis

Amazon

21
21 304.
OUT
304.
DON
CHARTERS
SAINT
DOS
NAOOX
DE BRE
MAN
are

Google

304.
21 304. 21 304• AMTZA3
21
304•
AMTZA3