Human Generated Data

Title

Untitled (four women standing by memorial)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7048

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.3
Human 99.3
Clothing 98.9
Apparel 98.9
Person 96.8
Female 93.1
Person 92.8
Dress 90.9
Shorts 89.9
Person 88.9
Face 85.2
Woman 77
Outdoors 75.1
Wedding Cake 74.5
Food 74.5
Dessert 74.5
Cake 74.5
Girl 73.8
Portrait 70.8
Photography 70.8
Photo 70.8
Helmet 67.1
People 64.8
Chair 63.5
Furniture 63.5
Hat 58.8
Costume 57.7
Nature 56.8
Kid 56.6
Child 56.6

Clarifai
created on 2023-10-15

people 99.7
man 98.8
adult 98.3
veil 98
group 96.1
lid 95.6
wear 91.7
monochrome 89
group together 88.3
illustration 85.3
engraving 84.9
administration 84.8
many 83.5
headscarf 83
etching 82.1
crowd 82
woman 81
actor 80.6
street 79.4
gown (clothing) 77.2

Imagga
created on 2021-12-15

fan 24
man 22.2
person 21
follower 19.4
negative 16.3
black 15.6
male 15.6
blackboard 15.6
newspaper 15.4
people 15.1
old 14.6
film 13.4
musical instrument 12.9
product 12
water 12
business 11.5
creation 11.5
sky 11.5
building 11.4
menorah 11.2
businessman 10.6
men 10.3
outdoor 9.9
silhouette 9.9
memorial 9.7
outdoors 9.7
mask 9.6
couple 9.6
park 9.1
dirty 9
sign 9
candelabrum 9
art 8.7
light 8.7
lifestyle 8.7
grunge 8.5
vintage 8.3
holding 8.2
industrial 8.2
photographic paper 8.2
suit 8.1
success 8
accordion 7.9
world 7.8
travel 7.7
wall 7.7
sport 7.6
dark 7.5
adult 7.5
danger 7.3
aged 7.2
computer 7.2
office 7.2
color 7.2
history 7.1
portrait 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.4
outdoor 98.5
black and white 89.5
person 89
clothing 78.2
posing 57.7
old 46.8

Face analysis

AWS Rekognition

Age 46-64
Gender Male, 92.2%
Calm 71.3%
Surprised 19.9%
Fear 4.2%
Happy 1.7%
Disgusted 0.9%
Angry 0.9%
Confused 0.8%
Sad 0.3%

AWS Rekognition

Age 29-45
Gender Male, 51.1%
Calm 73.7%
Disgusted 16.7%
Sad 2.5%
Angry 2.1%
Happy 1.6%
Confused 1.4%
Fear 1.2%
Surprised 0.8%

AWS Rekognition

Age 43-61
Gender Male, 80.7%
Calm 37.7%
Sad 37.6%
Fear 9.1%
Disgusted 4.8%
Happy 3.9%
Confused 3.5%
Angry 2.1%
Surprised 1.4%

AWS Rekognition

Age 40-58
Gender Male, 98.3%
Calm 96.9%
Sad 1.6%
Angry 0.7%
Happy 0.4%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Wedding Cake 74.5%
Helmet 67.1%

Text analysis

Amazon

2
MEMORY
19610.
30 MEMORY
TOF OUR LOVEDIONES
30

Google

19610.
19610.