Human Generated Data

Title

Untitled (children in hunting clothes having tea, Junior Hunt Club, Whitemarch, Pennsylvania)

Date

1940, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.312

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.9
Human 98.9
Person 98.5
Food 98.4
Meal 98.4
Person 98.4
Restaurant 98.2
Person 98.1
Furniture 95.4
Couch 95.4
Person 94.8
Cafeteria 87.7
Dish 84.5
Living Room 80.5
Room 80.5
Indoors 80.5
Sitting 79.7
Lamp 78.3
Table Lamp 78.3
Shelf 74.7
Buffet 73.7
Food Court 72.8
Table 70.6
Home Decor 69
People 64.5
Face 64.5
Dining Table 59.9
Bookcase 58.1
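The tag lists above follow a "Label Score" layout, where the confidence score is the final token and the label may contain spaces (e.g. "Living Room 80.5"). A minimal sketch of parsing such lines into (label, confidence) pairs — the sample strings are copied from the Amazon block above; the parsing approach itself is an assumption about this layout, not part of the source record:

```python
# Sample tag lines copied from the Amazon block of this record.
raw_tags = [
    "Person 98.9",
    "Living Room 80.5",
    "Dining Table 59.9",
    "Home Decor 69",
]

def parse_tag(line: str) -> tuple[str, float]:
    """Split a 'Label Score' line; the score is always the final token,
    so rsplit keeps multi-word labels like 'Living Room' intact."""
    label, score = line.rsplit(" ", 1)
    return label, float(score)

tags = [parse_tag(t) for t in raw_tags]
# Sort highest-confidence tag first.
tags.sort(key=lambda pair: pair[1], reverse=True)
```

Note that some scores appear without a decimal point ("Home Decor 69"); `float()` handles both forms.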

Imagga
created on 2022-01-08

man 32.3
business 28.5
adult 28
office 27.9
people 27.3
businessman 26.5
corporate 25.8
male 25.8
kin 25.5
group 25
sitting 24.9
meeting 24.5
team 24.2
happy 23.8
groom 22.7
laptop 22
couple 21.8
person 21.5
businesswoman 20.9
work 20.4
professional 20.3
smiling 20.3
computer 20.1
job 18.6
adults 18
smile 17.8
indoors 17.6
together 17.5
women 17.4
home 16.7
men 16.3
executive 16
suit 15.3
businesspeople 15.2
communication 15.1
working 15
table 14.9
teamwork 14.8
attractive 14
colleagues 13.6
technology 13.4
happiness 13.3
lifestyle 13
20s 12.8
staff 12.5
room 12.5
desk 12.5
workplace 12.4
friends 12.2
two 11.9
worker 11.7
females 11.4
company 11.2
confident 10.9
elementary age 10.9
holding 10.7
30s 10.6
modern 10.5
success 10.5
talking 10.5
togetherness 10.4
portrait 10.4
friendship 10.3
classroom 10.2
casual 10.2
emotion 10.1
children 10
color 10
beverage 9.9
clothing 9.8
family 9.8
restaurant 9.8
cheerful 9.8
child 9.6
looking 9.6
glass 9.3
presentation 9.3
drink 9.2
indoor 9.1
pretty 9.1
teacher 9.1
handsome 8.9
days 8.8
couch 8.7
daytime 8.7
notebook 8.6
boss 8.6
expression 8.5
clothes 8.4
manager 8.4
food 8.4
daughter 8.3
occupation 8.2
friendly 8.2
dinner 8
interior 8
love 7.9
beverages 7.8
day 7.8
education 7.8
discussion 7.8
two people 7.8
emotions 7.8
partner 7.7
corporation 7.7
busy 7.7
diversity 7.7
youth 7.7
eating 7.6
career 7.6
house 7.5
building 7.4
successful 7.3
buddy 7.3
cup 7.2
dad 7.2
mother 7.1
romantic 7.1
black 7.1

Google
created on 2022-01-08

Food 93.2
Table 93.1
Tableware 89.9
Chair 88.6
Coat 87.2
Picture frame 86.6
Plate 84.4
Black-and-white 83.1
Sharing 82.2
Window 80.1
Dishware 77.2
Event 72.7
Monochrome photography 72
Monochrome 71
Serveware 70.7
Room 70.5
Suit 69.3
Sitting 68
Vintage clothing 66.1
Cooking 65.4

Microsoft
created on 2022-01-08

sofa 97.4
indoor 96.9
table 91.2
person 90.9
text 90.7
window 89.8
living 87
clothing 85.6
black and white 84.3
man 68.7
people 68.2
dining table 7.9

Face analysis


AWS Rekognition

Age 6-16
Gender Male, 99.3%
Calm 83.7%
Sad 9.5%
Fear 2.6%
Confused 2.2%
Angry 0.9%
Disgusted 0.5%
Surprised 0.3%
Happy 0.3%

AWS Rekognition

Age 9-17
Gender Female, 100%
Calm 87.7%
Happy 7.3%
Sad 1.3%
Angry 1.1%
Surprised 1.1%
Confused 0.6%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 18-26
Gender Female, 93.5%
Calm 50.6%
Sad 37.2%
Confused 5.7%
Fear 2.2%
Surprised 1.9%
Angry 1%
Disgusted 0.9%
Happy 0.5%

AWS Rekognition

Age 12-20
Gender Female, 100%
Happy 69%
Calm 11.9%
Angry 9.5%
Surprised 3.4%
Sad 2.1%
Confused 1.5%
Disgusted 1.4%
Fear 1.1%

AWS Rekognition

Age 16-22
Gender Female, 98.1%
Calm 92.2%
Sad 5.5%
Confused 0.7%
Surprised 0.5%
Disgusted 0.4%
Angry 0.3%
Fear 0.3%
Happy 0.2%
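Each AWS Rekognition block above reports a full emotion distribution per detected face. A hedged sketch of reducing one such distribution to its dominant emotion — the scores are copied from the first AWS Rekognition block in this record; the selection logic is illustrative, not part of the source:

```python
# Emotion scores (percent) for one face, copied from the first
# AWS Rekognition face-analysis block above.
face_emotions = {
    "Calm": 83.7, "Sad": 9.5, "Fear": 2.6, "Confused": 2.2,
    "Angry": 0.9, "Disgusted": 0.5, "Surprised": 0.3, "Happy": 0.3,
}

# Dominant emotion = the key with the highest score.
dominant = max(face_emotions, key=face_emotions.get)
```

The same reduction applies to each of the five face blocks; here the dominant emotion is "Calm" for four faces and "Happy" for one.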

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 12
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people sitting around a living room 98.6%
a group of people sitting in a living room 98.3%
a group of people sitting at a table in a living room 97.7%