Human Generated Data

Title

Untitled (family posing for Christmas card with instruments and flag)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8463

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.8
Human 99.8
Clothing 99.4
Apparel 99.4
Person 97.9
Person 97.2
Wheel 96.9
Machine 96.9
Person 96.8
Person 94
Vegetation 85.3
Plant 85.3
Shorts 83.5
Female 76.2
Person 72.7
Land 70.2
Nature 70.2
Outdoors 70.2
Tree 68.9
Woman 61.4
Swimwear 59
People 58.3
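
The Amazon entries above are label/confidence pairs of the kind returned by an image-labeling API. As a rough, non-authoritative sketch, the following shows how such pairs could be requested from Amazon Rekognition with boto3; the image file name is hypothetical, and AWS credentials are assumed to be configured already.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_8463.jpg", "rb") as f:
        image_bytes = f.read()

    # Ask Rekognition for up to 25 labels with at least 55% confidence,
    # roughly matching the range of scores listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=55,
    )

    # Print "Label score" lines in the same style as the tag list.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")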

Clarifai
created on 2023-10-26

people 99.9
group 98.4
adult 98.2
military 98.2
soldier 96.8
war 96.8
man 96.2
uniform 94.7
wear 94.2
child 93.6
woman 92.4
weapon 92.3
gun 92.2
group together 92.1
veil 91.8
rifle 91.5
outfit 91.5
many 88.8
lid 87.5
art 86.6

Imagga
created on 2022-01-15

television 63
telecommunication system 49.3
newspaper 22.9
product 17.6
man 17.5
person 17.4
black 16.8
old 16.7
people 16.2
grunge 16.2
male 15.6
vintage 14.9
creation 14.5
art 14
adult 13.6
silhouette 12.4
one 11.9
dirty 11.8
portrait 11
human 10.5
dark 10
sensuality 10
retro 9.8
decoration 9.6
body 9.6
water 9.3
sport 9.1
posing 8.9
symbol 8.8
antique 8.7
pattern 8.2
style 8.2
paint 8.1
design 7.9
blackboard 7.8
outdoor 7.6
texture 7.6
serene 7.5
frame 7.5
aged 7.2
sexy 7.2
lifestyle 7.2

Microsoft
created on 2022-01-15

text 99.6
tree 98.3
person 95.5
clothing 89.4
man 84.3
black and white 81.3
musical instrument 80
old 59.4
drum 53.9

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 86%
Happy 73.9%
Calm 25%
Surprised 0.3%
Disgusted 0.3%
Sad 0.1%
Confused 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 27-37
Gender Female, 63.5%
Sad 60.6%
Happy 20.3%
Calm 15.6%
Confused 1.4%
Fear 0.8%
Surprised 0.5%
Disgusted 0.5%
Angry 0.2%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 98.1%
Sad 0.6%
Fear 0.5%
Surprised 0.3%
Confused 0.3%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Female, 75.5%
Sad 63.5%
Calm 35.9%
Surprised 0.2%
Happy 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 51-59
Gender Male, 95.3%
Calm 55.9%
Sad 41.2%
Surprised 1%
Confused 0.9%
Angry 0.4%
Happy 0.3%
Disgusted 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
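
The age ranges, gender estimates, and per-emotion percentages listed under AWS Rekognition above are the kind of per-face output its face-detection API returns. A minimal sketch, under the same assumptions as the earlier example (hypothetical file name, configured credentials), of printing one such block per detected face:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_8463.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion scores.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back as e.g. {"Type": "HAPPY", "Confidence": 73.9}.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")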

Feature analysis

Amazon

Person 99.8%
Wheel 96.9%

Text analysis

Amazon

14621.
st

Google

14621,
14621,
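
The strings above are OCR detections of text visible in the photograph. A minimal sketch, again under the same assumptions as the earlier examples, of reading such detections back from Amazon Rekognition's text-detection API:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_8463.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Print each detected line of text; word-level detections are skipped.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])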