Human Generated Data

Title

Untitled (women gathered for tea in sitting room)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12091

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Chair 99.4
Furniture 99.4
Apparel 97.8
Clothing 97.8
Person 97.1
Human 97.1
Person 96.3
Person 91.5
Person 89.3
Person 88.0
People 82.6
Hat 71.6
Room 65.3
Indoors 65.3
Table 58.8
Chair 58.6
Bonnet 58
Icing 55.9
Food 55.9
Dessert 55.9
Cream 55.9
Creme 55.9
Cake 55.9

Imagga
created on 2022-01-15

room 47.7
table 36.6
classroom 36.5
person 30.7
people 27.3
man 26.9
male 25.5
home 24.7
restaurant 24.6
patient 21.8
interior 21.2
chair 20.9
meeting 20.7
adult 19.5
indoors 19.3
business 18.8
couple 18.3
office 18.3
sitting 18
businessman 17.7
drink 17.6
together 17.5
desk 16.5
dinner 15.9
food 15.4
team 15.2
sick person 14.8
businesswoman 14.5
meal 14.5
case 14.5
happy 14.4
worker 14.4
women 14.2
hospital 13.8
cafeteria 13.4
work 13.4
businesspeople 13.3
glass 13.2
mature 13
men 12.9
lunch 12.6
modern 12.6
family 12.5
talking 12.4
smiling 12.3
furniture 12.1
group 12.1
indoor 11.9
40s 11.7
lifestyle 11.6
30s 11.5
working 11.5
smile 11.4
corporate 11.2
service 11.1
day 11
children 10.9
dining 10.5
child 10.5
kitchen 10.4
professional 10.2
wine 10.2
clinic 10.1
20s 10.1
nurse 10.1
laptop 10
building 9.9
banquet 9.8
conference 9.8
colleagues 9.7
waiter 9.7
computer 9.6
student 9.6
boy 9.6
education 9.5
adults 9.5
teacher 9.4
happiness 9.4
manager 9.3
teamwork 9.3
cheerful 8.9
party 8.6
senior 8.4
eating 8.4
executive 8.4
eat 8.4
presentation 8.4
board 8.1
medical 7.9
dining room 7.9
boardroom 7.9
elementary age 7.9
catering 7.8
couples 7.8
teaching 7.8
portrait 7.8
daytime 7.7
elderly 7.7
health 7.6
formal 7.6
workplace 7.6
newspaper 7.6
enjoying 7.6
communication 7.6
clothing 7.5
house 7.5
inside 7.4
wedding 7.4
hall 7.3
breakfast 7.3
suit 7.2
mother 7.2
father 7.1
decor 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

table 96.1
chair 91.1
window 87.4
text 81.4
black and white 74.1
clothing 65.4
furniture 35.7

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 98.3%
Calm 51.9%
Surprised 42.5%
Happy 4.3%
Disgusted 0.4%
Sad 0.3%
Angry 0.2%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 22-30
Gender Female, 92.5%
Calm 99.3%
Surprised 0.2%
Sad 0.1%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 91.6%
Happy 34.4%
Sad 27.5%
Calm 25.3%
Confused 5.1%
Disgusted 2.4%
Angry 2.1%
Surprised 1.8%
Fear 1.4%

AWS Rekognition

Age 23-33
Gender Female, 57%
Calm 49.2%
Sad 48.7%
Confused 0.9%
Happy 0.3%
Angry 0.2%
Surprised 0.2%
Disgusted 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.4%
Person 97.1%

Captions

Microsoft

a group of people sitting on a bed 70%
a group of people on a bed 69.9%
a group of people in a room 69.8%

Text analysis

Amazon

VEEV
VEEV THE
ct
THE