Human Generated Data

Title

Untitled (children dressed for Halloween seated at table with food, woman standing next to table)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4857

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97
Human 97
Person 95.3
Person 95.1
Person 94
Person 93.5
Indoors 91.9
Room 91.9
Person 91.8
Person 88.1
People 83.7
Person 79.2
Furniture 77.7
Chair 77.2
Living Room 69.8
Classroom 64.8
School 64.8
Suit 59.5
Coat 59.5
Apparel 59.5
Clothing 59.5
Overcoat 59.5
Couch 59.5
Kindergarten 58.6
Dining Room 57.8

Imagga
created on 2022-01-23

people 34
kin 33.8
man 32.2
couple 31.3
smiling 29.6
home 27.9
groom 27.7
happy 27.6
adult 26.4
indoors 26.3
person 26.1
male 24.9
women 21.3
love 21.3
happiness 18.8
lifestyle 18.8
cheerful 18.7
together 18.4
room 18.2
sitting 18
men 17.2
family 16.9
husband 16.2
smile 14.2
portrait 14.2
senior 14
group 13.7
two 13.5
togetherness 13.2
mature 13
indoor 12.8
grandfather 12.3
wife 12.3
bride 12.2
laptop 11.8
business 11.5
interior 11.5
enjoying 11.4
child 10.9
table 10.8
businessman 10.6
color 10.6
flowers 10.4
20s 10.1
face 9.9
mother 9.8
attractive 9.8
two people 9.7
computer 9.6
office 9.4
meeting 9.4
day 9.4
nurse 9.4
wedding 9.2
pretty 9.1
dress 9
new 8.9
casual clothing 8.8
60s 8.8
celebration 8.8
having 8.7
work 8.6
drinking 8.6
businesspeople 8.5
modern 8.4
communication 8.4
old 8.4
drink 8.3
professional 8.3
bouquet 8.3
fun 8.2
care 8.2
window 8.2
businesswoman 8.2
lady 8.1
working 7.9
life 7.9
living room 7.8
mid adult 7.7
clothing 7.7
married 7.7
elderly 7.7
loving 7.6
casual 7.6
talking 7.6
marriage 7.6
bed 7.6
females 7.6
friends 7.5
human 7.5
friendship 7.5
restaurant 7.4
teacher 7.4
coffee 7.4
inside 7.4
hospital 7.3
aged 7.2
holiday 7.2
handsome 7.1
romantic 7.1
job 7.1
desk 7

Google
created on 2022-01-23

Black 89.7
Chair 88.5
Black-and-white 86.5
Style 84.1
Adaptation 79.3
Monochrome 76.7
Monochrome photography 75.2
Snapshot 74.3
Suit 72.2
Event 70.9
Room 68.6
Art 67.9
Table 67.2
Vintage clothing 66.7
Sitting 65.3
Stock photography 62.3
Font 61.1
History 60.2
Visual arts 54.5
Picture frame 54.1

Microsoft
created on 2022-01-23

text 99.2
table 87.4
boy 82.2
furniture 80.9
chair 77.9
house 73.7
clothing 61.5
old 60.1

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Male, 99.6%
Happy 85.4%
Calm 4.7%
Surprised 4.4%
Sad 3.1%
Angry 0.8%
Confused 0.6%
Fear 0.5%
Disgusted 0.5%

AWS Rekognition

Age 36-44
Gender Female, 73.2%
Happy 75.5%
Sad 10.7%
Calm 7.2%
Angry 2.9%
Surprised 1.3%
Confused 1%
Disgusted 0.8%
Fear 0.4%

AWS Rekognition

Age 37-45
Gender Male, 74.7%
Happy 95.5%
Calm 2.1%
Sad 0.7%
Surprised 0.5%
Confused 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 96.2%
Calm 93%
Sad 2.4%
Happy 1.9%
Confused 1.1%
Fear 0.6%
Disgusted 0.6%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 25-35
Gender Female, 96.1%
Calm 56%
Happy 25.9%
Sad 10.3%
Confused 2.1%
Angry 2%
Fear 1.4%
Disgusted 1.2%
Surprised 1.1%

AWS Rekognition

Age 45-51
Gender Male, 80.1%
Calm 57.1%
Confused 31.6%
Sad 4.8%
Happy 2.5%
Angry 1.4%
Disgusted 0.9%
Fear 0.9%
Surprised 0.8%

AWS Rekognition

Age 27-37
Gender Male, 89.7%
Sad 69.4%
Calm 14.9%
Confused 6.3%
Happy 4.2%
Fear 2.1%
Disgusted 1.3%
Surprised 1.1%
Angry 0.7%

AWS Rekognition

Age 26-36
Gender Female, 86.4%
Calm 80%
Happy 14.2%
Sad 2.6%
Surprised 1.2%
Angry 0.8%
Confused 0.6%
Fear 0.3%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97%
Chair 77.2%

Captions

Microsoft

a group of people in an old photo of a boy 48.5%
an old photo of a group of people in a room 48.4%
a group of people in a room 48.3%

Text analysis

Amazon

CHINESE
3
CHINESE ART
58602
ART
20985.

Google

209 85. 20985•
85.
20985•
209