Human Generated Data

Title

Untitled (twins seated on couch with mother for portrait, Haverford, PA)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7367

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Apparel 99.8
Clothing 99.8
Person 99.8
Human 99.8
Shorts 97.4
Person 96.5
Chair 95.8
Furniture 95.8
Person 95.7
Female 94.6
Face 94.5
Dress 92.1
Sailor Suit 85.3
Woman 81.7
Smile 81.4
Portrait 73.5
Photo 73.5
Photography 73.5
Girl 72.5
People 67.3
Kid 62.7
Child 62.7
Man 62
Plant 61.6
Text 61
Shirt 58
Pants 57.6
Sea 57.1
Nature 57.1
Water 57.1
Outdoors 57.1
Ocean 57.1

Imagga
created on 2022-01-08

silhouette 29
clothing 23.1
people 22.3
grunge 18.7
swimsuit 18.2
black 18.1
fashion 18.1
person 17
sexy 16.9
garment 16
sport 15.7
posing 15.1
style 14.8
art 14.8
man 13.7
adult 13.6
design 13.5
women 13.4
drawing 13.1
dance 12.8
body 12.8
dress 12.6
model 11.7
maillot 11.3
bikini 11.3
party 11.2
covering 11.1
elegance 10.9
sensuality 10.9
pretty 10.5
athlete 10.5
portrait 10.3
play 10.3
active 9.9
retro 9.8
sketch 9.8
hair 9.5
representation 9.3
holiday 9.3
creation 9.1
stylish 9
team 9
player 8.8
love 8.7
elegant 8.6
male 8.5
poster 8.5
flower 8.5
modern 8.4
attractive 8.4
event 8.3
vintage 8.3
human 8.2
mug shot 8.2
pattern 8.2
lady 8.1
symbol 8.1
activity 8.1
night 8
grass 7.9
summer 7.7
old 7.7
figure 7.6
bride 7.5
dark 7.5
competition 7.3
shape 7.3
decoration 7.2
cartoon 7.1
game 7.1
consumer goods 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.9
drawing 93.2
sketch 88.8
cartoon 80.2
clothing 79.5
posing 75
person 73.9
old 70.6
woman 58.6
smile 58.3
black and white 56.4

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 97.2%
Surprised 58%
Disgusted 14.5%
Happy 10.5%
Calm 7.5%
Confused 3%
Angry 2.7%
Sad 2.3%
Fear 1.4%

AWS Rekognition

Age 16-22
Gender Male, 98.7%
Calm 88.6%
Fear 2.9%
Surprised 2.7%
Sad 1.9%
Confused 1.5%
Angry 0.9%
Happy 0.9%
Disgusted 0.7%

AWS Rekognition

Age 25-35
Gender Female, 99.3%
Calm 69.4%
Surprised 23.5%
Happy 4.4%
Sad 0.8%
Disgusted 0.6%
Fear 0.6%
Angry 0.4%
Confused 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people posing for a photo 86.8%
an old photo of a group of people posing for the camera 85%
an old photo of a group of people posing for a picture 84.9%

Text analysis

Amazon

13298
13298.

Google

298
-
13 298 - NAGON- YT37A2-MAMTZA3 •862E1 13298·
13
•862E1
13298·
NAGON-
YT37A2-MAMTZA3