Human Generated Data

Title

Untitled (four women gathered around set dining room table with flowers, woman seated)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5318

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Person 99.3
Person 99
Person 98.2
Chair 96.7
Furniture 96.7
Room 95
Indoors 95
Interior Design 92
Person 84.1
People 76.4
Female 69.1
Clinic 68.1
Helmet 63.5
Clothing 63.5
Apparel 63.5
Dining Table 63.3
Table 63.3
Girl 60.7
Dining Room 57.3

Imagga
created on 2022-01-22

sketch 30.5
drawing 28.5
people 27.9
negative 22.1
person 20.6
film 18.7
adult 18.2
representation 17.3
man 16.1
team 14.3
table 14.2
work 13.5
modern 13.3
professional 13.2
life 13.1
photographic paper 12.9
male 12.8
dress 12.6
happy 12.5
interior 12.4
decoration 12.3
lifestyle 12.3
group 12.1
men 12
wedding 11.9
glass 11.7
business 11.5
indoors 11.4
human 11.2
women 11.1
portrait 11
house 10.9
home 10.4
shop 10.3
elegant 10.3
smiling 10.1
fashion 9.8
medical 9.7
worker 9.7
party 9.5
groom 9.3
elegance 9.2
city 9.1
hand 9.1
attractive 9.1
health 9
family 8.9
bride 8.9
working 8.8
celebration 8.8
urban 8.7
love 8.7
laboratory 8.7
test 8.7
photographic equipment 8.6
luxury 8.6
marriage 8.5
design 8.4
black 8.4
clothing 8.1
suit 8.1
romantic 8
science 8
decor 8
medicine 7.9
room 7.9
art 7.9
couple 7.8
smile 7.8
standing 7.8
lab 7.8
chemistry 7.7
setting 7.7
flower 7.7
old 7.7
formal 7.6
restaurant 7.6
two 7.6
office 7.5
leisure 7.5
teamwork 7.4
cheerful 7.3
paint 7.2
activity 7.2
architecture 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.3
person 95
drawing 92.5
table 90.7
sketch 87
furniture 85.8
clothing 73.3
old 64.9
posing 43.8

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 90.5%
Happy 96.3%
Sad 1.9%
Calm 1.3%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 68%
Calm 94.6%
Sad 3.9%
Confused 0.5%
Surprised 0.5%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Chair 96.7%
Helmet 63.5%

Captions

Microsoft

a group of people posing for a photo 80%
a group of people standing in front of a window 71.2%
a group of people standing in a kitchen 71.1%

Text analysis

Amazon

6493

Google

6493
6493