Human Generated Data

Title

Untitled (woman pouring orange juice for her family)

Date

1952, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.210

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Tie 99.4
Accessories 99.4
Accessory 99.4
Saucer 98.8
Pottery 98.8
Person 98.8
Human 98.8
Person 98.7
Person 98.7
Cup 98.1
Coffee Cup 98.1
Tie 97.3
Cafe 92.4
Restaurant 92.4
Person 79.8
Clothing 66.7
Overcoat 66.7
Coat 66.7
Apparel 66.7
Suit 66.7

Imagga
created on 2022-01-08

businessman 54.8
salon 51.3
office 50.7
business 49.2
man 48.4
male 40.5
computer 40.2
working 38
work 36.1
meeting 34.9
businesswoman 34.6
job 33.6
sitting 33.5
people 33.5
laptop 32.1
desk 31.3
corporate 30.9
businesspeople 30.4
communication 30.2
adult 29.3
person 29.1
group 29
barbershop 29
team 26.9
professional 26.6
suit 26.3
shop 25.7
executive 25.6
happy 24.5
confident 23.7
looking 22.4
men 22.3
table 21.8
smiling 21.7
teamwork 21.3
indoors 21.1
worker 20.9
chair 20.5
employee 20.4
call 20.3
indoor 20.1
career 19.9
colleagues 19.4
successful 19.2
workplace 19.1
handsome 18.7
restaurant 18.6
smile 18.5
success 18.5
technology 17.8
mercantile establishment 17.4
lifestyle 17.4
busy 16.4
hairdresser 15.9
company 15.8
together 15.8
discussing 14.7
barber chair 14.5
portrait 14.2
women 14.2
manager 14
planning 13.5
cheerful 13
discussion 12.7
employment 12.6
corporation 12.5
talking 12.4
mature 12.1
attractive 11.9
two 11.9
coworkers 11.8
happiness 11.8
businessmen 11.7
place of business 11.6
collar 11.5
keyboard 11.3
presentation 11.2
casual 11
occupation 11
cooperation 10.6
partner 10.6
seat 10.4
phone 10.1
building 10
holding 9.9
modern 9.8
conference 9.8
mid adult 9.7
30s 9.6
staff 9.6
formal 9.6
education 9.5
coffee 9.3
20s 9.2
one 9
associates 8.9
interior 8.9
30-35 years 8.8
partners 8.8
conversation 8.7
partnership 8.6
paper 8.6
boss 8.6
serious 8.6
wireless 8.6
reading 8.6
expression 8.5
furniture 8.5
pen 8.5
contemporary 8.5
pretty 8.4
hand 8.4
document 8.4
student 8.2
explaining 7.9
couple 7.8
monitor 7.8
40s 7.8
workers 7.8
secretary 7.8
notebook 7.7
jacket 7.7
four 7.7
customer 7.6
room 7.6
horizontal 7.5
smart 7.5
senior 7.5
service 7.4
friendly 7.3
alone 7.3
color 7.2
day 7.1
look 7

Google
created on 2022-01-08

Tableware 95.3
Table 93
Drinkware 86.5
Dishware 84.5
Serveware 84.4
Black-and-white 84
Style 84
Cup 83.5
Plate 79.6
Lamp 77.9
Hat 77.9
Saucer 75.8
Smile 74.7
Chair 74.4
Monochrome photography 73.2
Monochrome 71.9
Picture frame 70.8
Audio equipment 69.6
Sitting 69.1
Fedora 66

Microsoft
created on 2022-01-08

indoor 96
wall 95.8
person 95.4
text 94.8
clothing 94.2
black and white 81.3
table 69.8
coffee cup 69.3
tableware 65.9
newspaper 53.4
woman 52.6
dining table 8.2
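
The tag lists above all share one flat "label score" line format, where the score is a confidence percentage. A minimal sketch of parsing such lines into structured pairs and filtering by confidence (`parse_tags` and the threshold are hypothetical helpers, not part of this record or any provider's API):

```python
# Parse "Label score" lines (as in the tag lists above) into
# (label, confidence) pairs, keeping only high-confidence tags.
def parse_tags(lines, threshold=90.0):
    tags = []
    for line in lines:
        # Labels may contain spaces ("Coffee Cup 98.1"),
        # so split only on the last space.
        parts = line.rsplit(" ", 1)
        if len(parts) != 2:
            continue
        label, score = parts
        try:
            conf = float(score)
        except ValueError:
            continue  # skip lines with no numeric score
        if conf >= threshold:
            tags.append((label, conf))
    return tags

sample = ["Tie 99.4", "Saucer 98.8", "Person 79.8", "Suit 66.7"]
print(parse_tags(sample))  # → [('Tie', 99.4), ('Saucer', 98.8)]
```
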

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Happy 97.5%
Surprised 0.6%
Calm 0.4%
Disgusted 0.4%
Sad 0.4%
Angry 0.3%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 100%
Sad 53.6%
Happy 13.1%
Confused 11.1%
Calm 8.8%
Fear 4.3%
Angry 3.4%
Disgusted 2.9%
Surprised 2.9%

AWS Rekognition

Age 6-12
Gender Male, 92.7%
Happy 99.2%
Sad 0.4%
Confused 0.1%
Calm 0.1%
Surprised 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 14-22
Gender Male, 99.5%
Calm 49.9%
Fear 19.5%
Sad 10.1%
Angry 9.6%
Happy 5%
Confused 4.3%
Surprised 0.9%
Disgusted 0.7%
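
Each AWS Rekognition block above reports a full emotion distribution rather than a single label. A minimal sketch of reducing such a distribution to its dominant emotion (`dominant_emotion` is a hypothetical helper, not a Rekognition API call):

```python
# Pick the highest-confidence emotion from a distribution like the
# AWS Rekognition blocks above (name -> confidence percentage).
def dominant_emotion(scores):
    return max(scores.items(), key=lambda kv: kv[1])

face = {"Sad": 53.6, "Happy": 13.1, "Confused": 11.1, "Calm": 8.8}
print(dominant_emotion(face))  # → ('Sad', 53.6)
```

Note that for the second face above the top emotion ("Sad", 53.6%) is far less certain than for the first ("Happy", 97.5%), so a consumer may also want to compare the top score against a threshold before trusting the label.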

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
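
Unlike the other providers, Google Vision reports ordered likelihood buckets rather than numeric scores. A minimal sketch of comparing those buckets ordinally (the `LIKELIHOOD` ordering matches the labels used above; `at_least` is a hypothetical helper):

```python
# Google Vision likelihood buckets, ordered from least to most likely.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def at_least(value, threshold):
    # True when `value` is at least as likely as `threshold`.
    return LIKELIHOOD.index(value) >= LIKELIHOOD.index(threshold)

print(at_least("Very likely", "Possible"))  # → True
```

Under this ordering, three of the four faces above register Joy at "Possible" or better.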

Feature analysis

Amazon

Tie 99.4%
Person 98.8%
Suit 66.7%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 60.1%
a person standing in front of a table 60%
a person sitting at a table 59.9%

Text analysis

Google

Www.
Www.