Human Generated Data

Title

Untitled (woman holding flour sifter at kitchen counter)

Date

c. 1944

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7132

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Sitting 81.1
Clothing 79.5
Apparel 79.5
Building 67.4
Female 65.3
People 61.2
Indoors 60.5
Photo 60.3
Photography 60.3
Overcoat 59.2
Suit 59.2
Coat 59.2
Face 57.1

Imagga
created on 2021-12-15

laptop 42.4
computer 41.2
office 33.2
person 32.4
working 31.8
work 31.4
business 29.2
businesswoman 28.2
happy 26.3
people 26.2
professional 26
technology 26
smiling 24.6
sitting 23.2
job 21.2
corporate 20.6
desk 20.5
adult 20.4
executive 19.7
man 19.5
smile 18.5
looking 18.4
attractive 18.2
wireless 18.1
portrait 17.5
cheerful 17.1
male 17
notebook 17
indoors 16.7
workplace 16.2
worker 16
lifestyle 15.2
pretty 14.7
communication 14.3
table 14
phone 13.8
successful 13.7
confident 13.6
student 13.6
home 13.6
manager 13
secretary 13
typing 12.7
education 12.1
success 12.1
women 11.9
handsome 11.6
men 11.2
kitchen 10.9
suit 10.8
cute 10.8
businessman 10.6
lady 10.5
mobile 10.4
device 9.9
holding 9.9
modern 9.8
consultant 9.7
one 9.7
cup 9.6
talking 9.5
businesspeople 9.5
writing 9.4
happiness 9.4
expression 9.4
face 9.2
employee 9.1
standing 8.7
career 8.5
coffee 8.5
shirt 8.5
senior 8.4
mature 8.4
glasses 8.3
clothing 8.2
while 7.8
machine 7.8
lunch 7.7
telephone 7.7
support 7.7
casual 7.6
reading 7.6
study 7.5
director 7.4
equipment 7.4
inside 7.4
occupation 7.3
stylish 7.2
color 7.2
food 7.1
dinner 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 96.8
person 94.4
human face 93.6
tableware 92.9
clothing 89
indoor 88.2
black and white 86.8
table 86.5
coffee cup 78
woman 77.5
saucer 54.2

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 93.3%
Happy 51.6%
Calm 46%
Surprised 1.2%
Sad 0.4%
Confused 0.3%
Disgusted 0.3%
Fear 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man and a woman sitting at a table 67.3%
a man and a woman sitting on a table 47.9%
a person sitting at a table 47.8%

Text analysis

Amazon

IOI
21949.
P.S.

Google

20949.
20949.