Human Generated Data

Title

Untitled (two couples seated at table at wedding reception)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8979

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.2
Person 99.1
Person 98.3
Tie 96.1
Accessories 96.1
Accessory 96.1
Clothing 93.9
Apparel 93.9
Clinic 86.8
Sunglasses 78.7
People 69.5
Furniture 61.9
Table 61.9
Coat 61.2
Photography 60
Photo 60
Meal 59.4
Food 59.4
Portrait 58.1
Face 58.1
Scientist 57.8
Suit 55.5
Overcoat 55.5
Hospital 55

Imagga
created on 2022-01-09

man 45
person 43.1
lab coat 41.3
executive 37.8
male 34.8
coat 34.5
senior 31.9
patient 30.8
people 30.7
adult 27.6
sitting 24.9
men 24.1
couple 23.5
home 23.1
elderly 22
smiling 21.7
office 21.7
happy 21.3
indoors 21.1
together 20.2
table 19.9
professional 19.4
specialist 18.9
casual 18.7
mature 18.6
business 18.2
sick person 18.1
doctor 17.9
case 17.8
work 17.3
medical 16.8
worker 16.3
desk 16.1
businessman 15.9
colleagues 15.6
laptop 15.5
group 15.3
meeting 15.1
working 15
women 15
day 14.9
old 14.6
lifestyle 14.5
team 14.3
job 14.2
garment 14
color 13.9
health 13.9
computer 13.6
clothing 13.6
talking 13.3
adults 13.3
smile 12.8
clinic 12.8
businesswoman 12.7
nurse 12.7
teacher 12.4
room 12.1
teamwork 12.1
camera 12
occupation 11.9
casual clothing 11.7
60s 11.7
40s 11.7
older 11.7
portrait 11.7
hospital 11.5
education 11.3
looking 11.2
discussion 10.7
retired 10.7
retirement 10.6
businesspeople 10.4
horizontal 10.1
indoor 10
care 9.9
coworkers 9.8
cheerful 9.8
technology 9.7
busy 9.6
30s 9.6
student 9.4
happiness 9.4
classroom 9.1
suit 9
family 8.9
discussing 8.8
mid adult 8.7
using 8.7
face 8.5
grandfather 8.5
two 8.5
modern 8.4
holding 8.3
practitioner 8.3
school 8.1
to 8
interior 8
medicine 7.9
associates 7.9
70s 7.9
bright 7.9
50s 7.8
two people 7.8
class 7.7
looking camera 7.7
husband 7.6
wife 7.6
writing 7.5
life 7.5
document 7.3
20s 7.3
active 7.3
paper 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.6
text 98.8
clothing 93
man 86.2
woman 81
black and white 59.7
table 53.3

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 98.3%
Calm 57%
Surprised 21.1%
Happy 16.1%
Angry 1.7%
Confused 1.2%
Fear 1.2%
Disgusted 1%
Sad 0.6%

AWS Rekognition

Age 49-57
Gender Male, 99%
Calm 57.2%
Happy 40.7%
Sad 0.8%
Confused 0.5%
Disgusted 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Tie 96.1%
Sunglasses 78.7%

Captions

Microsoft

a group of people sitting at a table 95.8%
a group of people sitting around a table 95.7%
a group of men sitting at a table 94.7%

Text analysis

Amazon

10497.
10499.
AaE
AaE ١٨
١٨

Google

10497. 10497. 10497.
10497.