Human Generated Data

Title

Untitled (family posing for Christmas card with instruments and flag)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8464

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 98.7
Human 98.7
Wheel 94.4
Machine 94.4
Person 93.4
Person 93
Person 85.2
Musician 83.6
Musical Instrument 83.6
Art 82.9
Clothing 78.6
Apparel 78.6
Person 73.9
Drawing 73.7
Leisure Activities 67.3
Sketch 59.9
Music Band 57.2
Brick 56.8
Person 48.1
Person 41.6
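
For context, the scores above are label-detection confidences on a 0-100 scale. Below is a minimal sketch of the kind of Amazon Rekognition DetectLabels call that produces such tags; the bucket, object key, and thresholds are placeholders, not details from the museum's pipeline.

```python
# Minimal sketch: label tags like those above via Amazon Rekognition DetectLabels.
# Bucket and object key are placeholders, not the museum's actual storage locations.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8464.jpg"}},
    MaxLabels=25,
    MinConfidence=40,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score (0-100),
    # matching the "Person 98.7"-style rows listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```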

Clarifai
created on 2023-10-26

people 99.9
group 98.6
adult 98.5
man 98
illustration 94.5
wear 93.1
art 92.4
leader 92.4
drum 90.3
print 90.2
military 89.8
group together 89.5
outfit 89
several 87.8
war 87.3
woman 87.1
veil 87.1
music 85.8
many 85.7
soldier 85.6
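
A hedged sketch of a Clarifai v2 tagging request that returns concept scores like the list above; the API key, model ID, and image URL are placeholders, and the exact endpoint and auth scheme depend on how the Clarifai account is set up.

```python
# Hedged sketch: concept tags via Clarifai's v2 REST API.
# Key, model ID, and image URL are placeholders/assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed public model
IMAGE_URL = "https://example.org/image.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 probabilities; scale to match the 0-100 scores above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```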

Imagga
created on 2022-01-15

graffito 100
decoration 66.5
grunge 23
old 19.5
dirty 18.1
vintage 15.7
art 15.2
black 15
man 13.4
retro 11.5
dark 10.9
male 10.6
texture 10.4
style 10.4
silhouette 9.9
person 9.7
graphic 9.5
people 9.5
frame 9.2
sport 9.1
adult 9.1
building 8.7
light 8.7
antique 8.7
wall 8.5
space 8.5
portrait 8.4
pattern 8.2
aged 8.1
steel drum 8
body 8
musical instrument 8
design 7.9
urban 7.9
paper 7.8
play 7.8
scary 7.7
empty 7.7
grungy 7.6
power 7.6
percussion instrument 7.4
history 7.2
face 7.1
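
A hedged sketch of an Imagga v2 tagging request that yields scores like the list above; the credentials and image URL are placeholders.

```python
# Hedged sketch: tags via Imagga's v2 /tags endpoint with HTTP basic auth.
# Credentials and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_SECRET"  # placeholder
IMAGE_URL = "https://example.org/image.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Each entry has a 0-100 confidence and a tag name keyed by language.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```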

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.6
outdoor 97.2
drawing 89.6
black and white 89
person 86.4
musical instrument 71.5
man 67
clothing 57.6
painting 52.6
old 51.9
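
A hedged sketch of an Azure Computer Vision Analyze Image request (v3.2 assumed) that returns tag/confidence pairs like the Microsoft list above; the endpoint, key, and image URL are placeholders.

```python
# Hedged sketch: tags via the Azure Computer Vision Analyze Image REST API (v3.2).
# Endpoint, key, and image URL are placeholders.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder
IMAGE_URL = "https://example.org/image.jpg"                           # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is a 0-1 probability; scale to the 0-100 display used above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```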

Color Analysis

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 93.4%
Calm 97.5%
Sad 2.2%
Happy 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 94.7%
Calm 99.9%
Sad 0.1%
Happy 0%
Fear 0%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Male, 93%
Calm 98.6%
Sad 0.4%
Confused 0.3%
Surprised 0.2%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 38-46
Gender Male, 73.7%
Happy 76.5%
Calm 21.6%
Disgusted 0.6%
Sad 0.5%
Surprised 0.3%
Confused 0.3%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 52-60
Gender Male, 91%
Calm 53.7%
Sad 36%
Surprised 4.3%
Confused 2.5%
Angry 1.1%
Fear 1%
Disgusted 0.9%
Happy 0.5%
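
The age ranges, gender estimates, and emotion scores above are the shape of output returned by Amazon Rekognition DetectFaces when all attributes are requested. A minimal sketch, with a placeholder bucket and key:

```python
# Minimal sketch: per-face attributes via Amazon Rekognition DetectFaces.
# Bucket and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8464.jpg"}},
    Attributes=["ALL"],  # request age, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Emotion types such as CALM or HAPPY, with 0-100 confidences.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```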

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
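
The likelihood ratings above match the face annotations returned by the Google Cloud Vision API. A minimal sketch using the google-cloud-vision client, with a placeholder image URI and credentials assumed to be configured in the environment:

```python
# Hedged sketch: face likelihoods via the Google Cloud Vision API.
# Image URI is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY, matching the rows above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```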

Feature analysis

Amazon

Person 98.7%
Wheel 94.4%
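
The Person and Wheel percentages above are consistent with DetectLabels instance detections, which add bounding boxes for labels the service can localize. A minimal sketch, with a placeholder bucket and key:

```python
# Minimal sketch: localized label instances from Amazon Rekognition DetectLabels.
# Bucket and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8464.jpg"}}
)

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # normalized Width/Height/Left/Top
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at {box}')
```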

Categories

Text analysis

Amazon

14623
14623.

Google

1Y523 . 14623.
1Y523
.
14623.
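
The detected strings above are typical OCR output. A minimal sketch of an Amazon Rekognition DetectText call that surfaces such strings, with a placeholder bucket and key; Google's counterpart is the Vision API text_detection method.

```python
# Minimal sketch: text strings via Amazon Rekognition DetectText.
# Bucket and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8464.jpg"}}
)

for detection in response["TextDetections"]:
    # LINE detections group words; WORD detections are individual tokens.
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```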