Human Generated Data

Title

Untitled (twins seated on couch with mother for portrait, Haverford, PA)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7368

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Clothing 99
Apparel 99
Person 96.4
Person 94.3
Female 89.1
Shorts 83.4
Face 83.4
Dress 79.8
Sailor Suit 79.5
People 75.2
Furniture 72.7
Chair 72.7
Girl 71.5
Woman 70.8
Accessory 66.2
Sunglasses 66.2
Accessories 66.2
Photography 66
Photo 66
Portrait 66
Text 60
Nurse 58.8
Pants 55.4
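
The Amazon tags above are confidence-scored labels of the kind returned by AWS Rekognition's DetectLabels API. A minimal sketch of how such a list could be produced — the response here is a hard-coded sample in Rekognition's documented shape (a few labels taken from the list above), not a live call; the boto3 invocation in the comment is the assumed real-world entry point:

```python
# Hypothetical sketch: format an AWS Rekognition DetectLabels response into
# the "Label confidence" lines shown above. A real call would look like:
#   boto3.client("rekognition").detect_labels(
#       Image={"S3Object": {"Bucket": "...", "Name": "..."}},
#       MinConfidence=55)
# Here the response is hard-coded so the sketch runs without AWS credentials.

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Clothing", "Confidence": 99.0},
        {"Name": "Sunglasses", "Confidence": 66.2},
        {"Name": "Pants", "Confidence": 55.4},
    ]
}

def format_labels(response, min_confidence=55.0):
    """Return 'Name confidence' lines, highest confidence first."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda nc: nc[1], reverse=True)
    return [f"{name} {conf:g}" for name, conf in labels]

for line in format_labels(sample_response):
    print(line)  # e.g. "Person 99.7"
```

Note that integer-valued confidences print without a trailing ".0" (e.g. "Clothing 99"), matching the formatting of the tag list above.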

Imagga
created on 2022-01-08

swimsuit 47.8
clothing 43.9
garment 39.1
maillot 33.1
bikini 25.4
covering 23.7
people 22.3
sexy 21.7
person 21.4
fashion 20.3
model 20.2
silhouette 19.9
adult 18.8
style 17.8
attractive 17.5
black 17.4
sport 17.4
portrait 16.8
body 16.8
consumer goods 15.8
man 15.6
pose 15.4
grunge 15.3
posing 15.1
human 14.2
sketch 13.8
pretty 13.3
art 12.9
male 12.8
women 12.6
dress 12.6
dance 12.6
lady 12.2
drawing 12
hair 11.9
sensuality 11.8
beachwear 11.2
athlete 10.9
stylish 10.8
face 10.6
design 10.2
figure 10
fitness 9.9
active 9.9
party 9.5
skin 9.3
slim 9.2
modern 9.1
dirty 9
team 9
shape 8.9
brunette 8.7
lifestyle 8.7
play 8.6
men 8.6
elegant 8.6
elegance 8.4
studio 8.4
dark 8.3
vintage 8.3
retro 8.2
exercise 8.2
gorgeous 8.2
graphic 8
standing 7.8
health 7.6
casual 7.6
fashionable 7.6
poster 7.6
happy 7.5
event 7.4
representation 7.3
sensual 7.3
group 7.3

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.1
posing 91.9
sketch 87.2
drawing 80
old 72.5
cartoon 59.9
clothing 59.3
person 56.6

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 98.3%
Surprised 50.1%
Calm 20.8%
Disgusted 17.6%
Sad 2.7%
Angry 2.7%
Confused 2.4%
Happy 2.3%
Fear 1.3%

AWS Rekognition

Age 27-37
Gender Female, 51.8%
Happy 83%
Surprised 13.9%
Calm 1.2%
Sad 0.5%
Fear 0.4%
Confused 0.3%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 16-22
Gender Male, 99.5%
Calm 87.6%
Sad 5.2%
Surprised 1.9%
Fear 1.4%
Angry 1.3%
Confused 1.2%
Happy 0.8%
Disgusted 0.6%
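
The three AWS Rekognition entries above are per-face attribute estimates of the kind returned by the DetectFaces API with `Attributes=["ALL"]`. A minimal sketch of summarizing one such face record — the response below is a hard-coded sample in the documented shape (values taken from the first entry above), not a live API call:

```python
# Hypothetical sketch: summarize one face from an AWS Rekognition DetectFaces
# response (Attributes=["ALL"]). Hard-coded sample data, no AWS call is made.

sample_face = {
    "AgeRange": {"Low": 38, "High": 46},
    "Gender": {"Value": "Male", "Confidence": 98.3},
    "Emotions": [
        {"Type": "SURPRISED", "Confidence": 50.1},
        {"Type": "CALM", "Confidence": 20.8},
        {"Type": "DISGUSTED", "Confidence": 17.6},
    ],
}

def summarize_face(face):
    """Return a one-line summary: age range, gender, and top emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return (f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
            f"{face['Gender']['Value']} {face['Gender']['Confidence']}%, "
            f"{top['Type'].title()} {top['Confidence']}%")

print(summarize_face(sample_face))
# → Age 38-46, Male 98.3%, Surprised 50.1%
```

The emotion percentages are a probability-like distribution over a fixed label set, which is why each face entry above lists all eight emotions summing to roughly 100%.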

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Sunglasses 66.2%

Captions

Microsoft

a group of people posing for a photo 91.9%
an old photo of a group of people posing for the camera 89.8%
an old photo of a group of people posing for a picture 89.7%

Text analysis

Amazon

13294.

Google

13294.
294-
13 294- 13294.
13