Human Generated Data

Title

Untitled (woman walking near street)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4510

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 94.1
Clothing 91
Apparel 91
Person 88
Female 80.2
Text 68.1
Woman 66.5
Person 63.4
People 61.4
Portrait 60.3
Photography 60.3
Photo 60.3
Face 60.3
Chef 55.5
Crowd 55.2
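
The figure beside each tag is a confidence score on a 0–100 scale. Below is a minimal sketch of how comparable label/confidence pairs can be produced with Amazon Rekognition via the boto3 SDK; the filename is a hypothetical local copy of the photograph, and configured AWS credentials are assumed.

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials/region are configured

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4510.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score listed above is 55.2
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')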

Imagga
created on 2022-01-23

person 33.1
people 23.4
adult 23.2
man 22.8
exercise 20.9
sport 18.6
lifestyle 18.1
male 17.8
portrait 17.5
active 17.1
golfer 16.5
happy 15.7
fitness 15.4
health 15.3
fashion 15.1
fun 15
player 14.8
body 14.4
joy 14.2
cleaner 12.8
pretty 12.6
attractive 12.6
happiness 12.5
model 12.4
summer 12.2
one 11.9
freedom 11.9
dress 11.7
sexy 11.2
casual 11
pose 10.9
sky 10.8
black 10.8
healthy 10.7
posing 10.7
lady 10.5
outdoors 10.5
standing 10.4
style 10.4
clothing 10.4
javelin 10.2
smiling 10.1
smile 10
leisure 10
outdoor 9.9
contestant 9.9
stick 9.6
beach 9.4
cute 9.3
action 9.3
art 9.1
athlete 9
human 9
recreation 9
activity 9
weapon 8.8
dance 8.6
club 8.5
energy 8.4
life 8.3
dancer 8.3
alone 8.2
teenager 8.2
spear 8.2
sports equipment 8
cool 8
love 7.9
bright 7.9
brunette 7.8
face 7.8
sword 7.6
elegance 7.6
enjoy 7.5
professional 7.5
blond 7.5
silhouette 7.4
stylish 7.2
sand 7.2
holiday 7.2
women 7.1
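
Imagga exposes its tagger as a REST endpoint rather than an SDK. A minimal sketch follows, assuming the v2 /tags endpoint and its documented response shape; the API key, secret, and filename are hypothetical placeholders.

import requests

API_KEY = "<imagga-api-key>"        # hypothetical
API_SECRET = "<imagga-api-secret>"  # hypothetical

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4510.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')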

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99
man 90.3
outdoor 89.6
black and white 87.2
person 85.9
standing 83
white 74
clothing 70.5
posing 68
drawing 67.3
old 65.5

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Female, 82.6%
Calm 87.5%
Happy 10.7%
Surprised 0.6%
Sad 0.4%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 24-34
Gender Female, 69.4%
Calm 97.4%
Angry 0.8%
Happy 0.6%
Surprised 0.4%
Fear 0.3%
Sad 0.2%
Confused 0.2%
Disgusted 0.1%
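
A minimal sketch of how age-range, gender, and emotion estimates like those above can be obtained from Amazon Rekognition; the filename is hypothetical, and Attributes=["ALL"] is required to get beyond the default bounding-box output.

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4510.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    low, high = face["AgeRange"]["Low"], face["AgeRange"]["High"]
    print(f"Age {low}-{high}")
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')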

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
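
Google Vision reports facial attributes as likelihood buckets rather than percentages. A minimal sketch using the google-cloud-vision client library; the filename is hypothetical, and application credentials are assumed to be configured.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4510.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)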

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a vintage photo of a man 89.8%
a person standing posing for the camera 89.7%
a person posing for the camera 89.6%
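
A minimal sketch of how captions like these can be requested from the Azure Computer Vision describe operation; the endpoint, key, and filename are hypothetical placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com/"  # hypothetical
KEY = "<azure-key>"                                                # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4510.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")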

Text analysis

Amazon

in
32A8
32A8 YE3
٢٤/٣١٢
...9.H.E.
YE3
saw ...9.H.E.
saw
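
A minimal sketch of how detected strings like those above can be reproduced with Amazon Rekognition text detection; the filename is hypothetical. Rekognition returns both LINE and WORD detections, which is why some strings appear both alone and inside longer lines.

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("steinmetz_4.2002.4510.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    kind = detection["Type"]  # "LINE" or "WORD"
    print(kind, detection["DetectedText"])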

Google

32A8
A3OM3330
32A8 YT3RA2 A3OM3330
YT3RA2