Human Generated Data

Title

Untitled (three men with artificial legs running over a hill)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8251

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 98.7
Person 98
Shorts 96.9
Clothing 96.9
Apparel 96.9
People 94.6
Sport 82.9
Team 82.9
Team Sport 82.9
Sports 82.9
Football 73.6
Road 66.1
Asphalt 59.7
Tarmac 59.7
Text 56.7
Sphere 56.5
Outdoors 55.2

Imagga
created on 2022-01-08

runner 64.5
athlete 58.8
contestant 40.6
person 27.7
sport 22.3
man 19.5
silhouette 19
black 18.2
people 17.9
sunset 17.1
male 15.6
run 15.4
grunge 14.5
art 13.7
exercise 13.6
outdoors 13.4
outdoor 13
men 12.9
fitness 12.6
player 12.6
leisure 11.6
sky 11.5
beach 10.5
couple 10.5
old 10.4
decoration 10.3
action 10.2
lifestyle 10.1
active 9.9
adult 9.8
fun 9.7
dark 9.2
freedom 9.1
portrait 9.1
skateboard 9
design 9
recreation 9
activity 9
color 8.9
dance 8.5
legs 8.5
summer 8.4
frame 8.3
vintage 8.3
park 8.2
retro 8.2
happy 8.1
water 8
body 8
women 7.9
grass 7.9
boy 7.8
athletic 7.7
wheeled vehicle 7.7
graffito 7.6
health 7.6
texture 7.6
foot 7.6
energy 7.6
ballplayer 7.6
field 7.5
pattern 7.5
sand 7.5
billboard 7.4
paint 7.2
copy space 7.2
team 7.2
structure 7.2
love 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.6
outdoor 97.6
drawing 92.3
person 86.3
black and white 85.8
cartoon 74.3
sketch 64.4
old 50.6

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 79.1%
Calm 97.8%
Sad 1.8%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0.1%
Happy 0%
Angry 0%

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Calm 99.4%
Surprised 0.5%
Happy 0%
Angry 0%
Confused 0%
Disgusted 0%
Sad 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

an old photo of a boy 63.3%
a group of baseball players that are standing in the grass 32.4%
an old photo of a person 32.3%

Text analysis

Amazon

7676

Google

76
76 7676 7676
7676