Human Generated Data

Title

Untitled (employee behind counter of Burger Queen)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8060

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.6
Person 99.6
Person 99.5
Person 97.1
Worker 91.4
Restaurant 91.3
Person 91.2
Meal 89
Food 89
Cafeteria 80.7
Person 78
Hairdresser 75.2
Clothing 71.4
Apparel 71.4
Sleeve 64.5
Long Sleeve 57.6
Cafe 56.3
Indoors 55.1

Imagga
created on 2022-01-15

barbershop 90
shop 82.4
mercantile establishment 55.7
salon 44.1
place of business 37.1
hairdresser 32.2
man 30.2
people 28.4
chair 22.4
adult 21.3
male 20.6
indoors 19.3
person 19.2
interior 18.6
establishment 18.5
restaurant 18.3
room 17.3
women 15.8
hospital 15.3
home 15.1
happy 15
work 14.9
furniture 14.6
lifestyle 14.5
medical 14.1
doctor 14.1
table 13.8
sitting 13.7
smiling 13.7
modern 13.3
barber chair 13.2
holding 13.2
senior 13.1
worker 12.9
men 12.9
looking 12.8
business 12.8
patient 12.7
office 12.4
care 12.3
color 12.2
clinic 12
health 11.8
portrait 11.6
couple 11.3
computer 11.2
day 11
professional 11
indoor 11
seat 10.7
family 10.7
working 10.6
medicine 10.6
mature 10.2
life 10.2
inside 10.1
smile 10
casual clothing 9.8
casual 9.3
two 9.3
mother 9.3
horizontal 9.2
old 9.1
nurse 8.8
standing 8.7
customer 8.6
illness 8.6
talking 8.6
service 8.3
kitchen 8
together 7.9
operation 7.9
food 7.9
surgeon 7.8
surgery 7.8
40s 7.8
two people 7.8
using 7.7
adults 7.6
store 7.6
house 7.5
buy 7.5
human 7.5
technology 7.4
lady 7.3
new 7.3
clothing 7.2
businessman 7.1
glass 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 99
black and white 94.1
clothing 89.8
text 86.8
table 81.1
preparing 72.9
food 70.5
bottle 53.6
working 50.7

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Male, 85.2%
Happy 89.7%
Surprised 6.4%
Calm 2.7%
Disgusted 0.5%
Sad 0.3%
Angry 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 93%
Calm 99.9%
Surprised 0%
Sad 0%
Angry 0%
Happy 0%
Disgusted 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man and woman preparing food in a kitchen 82.3%
a man and a woman standing in a kitchen preparing food 80.6%
a man and woman preparing food while standing in front of a window 64.7%