Human Generated Data

Title

Untitled (four women standing at floral memorial)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7112

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.1
Human 99.1
Person 98.4
Person 96.7
Person 93.7
Apparel 89
Clothing 89
Helmet 89
People 80.4
Face 79.5
Female 75.7
Photography 65
Portrait 65
Photo 65
Wedding Cake 64.5
Dessert 64.5
Food 64.5
Cake 64.5
Girl 62.4
Outdoors 60.5
Costume 57.9
Art 56.3
Drawing 56.3
Shorts 55

Imagga
created on 2021-12-15

person 14.9
old 13.9
fan 13.9
water 13.3
negative 13.3
man 12.9
people 12.8
black 12.6
seller 12.5
grunge 11.9
follower 11.2
film 10.9
fountain 10.9
outdoor 10.7
outdoors 10.4
menorah 10.3
men 10.3
sky 10.2
silhouette 9.9
male 9.9
travel 9.9
sign 9.8
product 9.6
newspaper 9.5
business 9.1
park 9.1
dirty 9
history 8.9
creation 8.9
architecture 8.6
sport 8.4
city 8.3
leisure 8.3
vintage 8.3
candelabrum 8.3
light 8
structure 7.9
art 7.8
building 7.8
mask 7.7
vacation 7.4
danger 7.3
color 7.2
lifestyle 7.2
river 7.1
summer 7.1
businessman 7.1

Google
created on 2021-12-15

Hat 87.1
Font 80.6
Adaptation 79.4
People 78.7
Vintage clothing 73.7
History 67.1
Monochrome photography 65.3
Monochrome 62.7
Uniform 61.8
Photo caption 57.3
Room 54.5
Retro style 53.7
Art 52.5
Suit 50.7

Microsoft
created on 2021-12-15

text 99.6
outdoor 95.9
grass 95.5
clothing 89.5
person 82.2
black and white 79.6
old 78.3
wedding dress 77.4
dress 67.2
bride 59.8
woman 57.4
posing 49.5

Face analysis

AWS Rekognition

Age 49-67
Gender Male, 75.1%
Calm 86.9%
Fear 6%
Surprised 2.7%
Happy 1.3%
Sad 1.2%
Confused 0.8%
Angry 0.6%
Disgusted 0.6%

AWS Rekognition

Age 26-42
Gender Female, 72.8%
Disgusted 47.4%
Calm 34.1%
Angry 6.3%
Sad 5.6%
Fear 2.5%
Confused 2.3%
Happy 1%
Surprised 0.8%

AWS Rekognition

Age 44-62
Gender Male, 96.6%
Calm 97%
Sad 1.5%
Angry 0.6%
Happy 0.3%
Fear 0.3%
Confused 0.3%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 48-66
Gender Male, 68.8%
Calm 46.3%
Sad 19.5%
Happy 13.6%
Fear 7.8%
Confused 7.4%
Disgusted 2.2%
Surprised 1.9%
Angry 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Helmet 89%
Wedding Cake 64.5%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 84.9%
a vintage photo of a group of people posing for a picture 84.8%
a vintage photo of a man 84.7%

Text analysis

Amazon

MEMORY
19610A.
IN MEMORY
IN
17610A.
OF OUR LOVED.ONES

Google

17610A.
A.
17610A. 19610 A. 1
19610
1