Human Generated Data

Title

Untitled (man pointing in the air with two other men)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8536

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Clothing 99.6
Apparel 99.6
Person 99.5
Person 99.4
Shorts 94.4
Shoe 90.2
Footwear 90.2
Car 89.6
Transportation 89.6
Vehicle 89.6
Automobile 89.6
Sport 85.7
Sports 85.7
Person 82.7
Hat 69.5
Person 65.7
People 65.3
Portrait 64.9
Photography 64.9
Face 64.9
Photo 64.9
Person 64
Sun Hat 63.8
Golf 62.3
Person 60.4
Ground 58.7
Shoe 57.2
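
The Amazon tags above, each paired with a confidence score, are the kind of output Amazon Rekognition's label-detection call returns. A minimal sketch of such a call with boto3 follows; the local filename is a hypothetical placeholder, and AWS credentials are assumed to be configured.

import boto3

rekognition = boto3.client("rekognition")

# "steinmetz_16102.jpg" is a hypothetical placeholder for a local copy of this image
with open("steinmetz_16102.jpg", "rb") as f:
    image_bytes = f.read()

# Ask for labels above a 50% confidence floor, similar to the list above
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))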

Clarifai
created on 2023-10-25

people 99.8
group together 98.3
adult 97.2
man 96
group 93.5
wear 92.9
golfer 92.5
sports equipment 92
woman 91.8
two 91.6
many 90.5
several 89.6
veil 89.1
three 89
lid 88.5
golf club 87.2
child 82.4
outfit 80.2
five 79.2
facial expression 79
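
The Clarifai concepts above look like output from Clarifai's general image-recognition model, which scores each concept between 0 and 1 (shown here as percentages). A rough sketch against Clarifai's v2 REST API follows; the model id, access token, and image URL are all assumptions and may not match the account or API version actually used.

import requests

CLARIFAI_PAT = "YOUR_PAT"                          # placeholder personal access token
MODEL_ID = "general-image-recognition"             # assumed id of the general model
IMAGE_URL = "https://example.org/steinmetz.jpg"    # hypothetical image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Scale the 0-1 concept values to percentages to match the listing above
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))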

Imagga
created on 2022-01-09

crutch 44.4
sport 34.8
staff 34.5
sword 29.6
stick 28.4
man 28.2
weapon 27.7
people 25.7
golf 22
adult 21.4
golfer 21
active 20.8
person 19.9
male 19.1
grass 19
ball 18.6
sky 17.9
exercise 17.2
player 16.9
outdoors 16.4
play 16.4
course 16.2
summer 16.1
outdoor 16.1
club 16
sports equipment 15.6
leisure 14.9
playing 14.6
lifestyle 14.5
silhouette 14.1
couple 13.9
hole 13.4
competition 12.8
beach 12.6
sunset 12.6
happy 12.5
senior 12.2
men 12
fitness 11.7
golfing 11.7
game 11.6
walking 11.4
action 11.1
outside 11.1
activity 10.7
cricket equipment 10.6
fun 10.5
javelin 10.4
happiness 10.2
recreation 9.9
swing 9.8
practice 9.7
sun 9.7
love 9.5
water 9.3
training 9.2
athlete 9.2
field 9.2
black 9
sea 8.6
drive 8.5
spear 8.3
sports 8.3
ocean 8.3
freedom 8.2
equipment 8.2
tee 8
putt 7.9
driver 7.9
standing 7.8
athletic 7.7
two 7.6
dark 7.5
landscape 7.4
life 7.4
vacation 7.4
sand 7.2
romance 7.1
women 7.1
travel 7
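
Imagga exposes its tagger as a REST endpoint that returns tag/confidence pairs like those above. A minimal sketch with the requests library follows; the API key, secret, and image URL are placeholders.

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz.jpg"},   # hypothetical URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),                # placeholder credentials
)

for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))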

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

golf 98
outdoor 96.6
grass 95.1
text 90.8
person 88.5
black and white 76.5
player 64.5
white 61.5
posing 38.2
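
The Microsoft tags above resemble the output of Azure Computer Vision's image-analysis endpoint, which scores each tag between 0 and 1 (shown here as percentages). A hedged sketch against the v3.2 REST API follows; the resource endpoint, subscription key, and image URL are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_KEY"                                                # placeholder key

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz.jpg"},          # hypothetical image URL
)

for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))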

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Confused 50.2%
Disgusted 23.2%
Happy 9.9%
Calm 5.6%
Surprised 5.2%
Angry 4.1%
Sad 1.1%
Fear 0.6%

AWS Rekognition

Age 50-58
Gender Male, 98.3%
Surprised 71.3%
Disgusted 18%
Calm 6.6%
Happy 1.9%
Angry 0.8%
Fear 0.6%
Confused 0.5%
Sad 0.3%
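
The two AWS Rekognition face records above (age range, gender, and emotions ranked by confidence) match the structure Rekognition returns from detect_faces when all attributes are requested. A minimal boto3 sketch follows, again using a hypothetical local filename.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_16102.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to match the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")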

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
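
The Google Vision face records above report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch with the google-cloud-vision client (2.x API) follows, assuming configured credentials and a hypothetical local filename.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("steinmetz_16102.jpg", "rb") as f:  # hypothetical local filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)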

Feature analysis

Amazon

Person 99.6%
Shoe 90.2%
Car 89.6%
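
The feature percentages above correspond to Rekognition labels that also carry per-instance bounding boxes (one per detected person, shoe, or car). A short self-contained sketch, reusing the hypothetical filename from the earlier examples:

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_16102.jpg", "rb") as f:  # hypothetical local filename
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Labels such as Person, Shoe, and Car include instance bounding boxes
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(label["Name"], round(instance["Confidence"], 1), instance["BoundingBox"])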

Categories

Imagga

people portraits 54.9%
paintings art 42%

Text analysis

Amazon

16102.
16102
NAMTSAR

Google

16102. 16102.
16102.
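
Both services also provide text (OCR) detection, which is where the detected strings above come from. A minimal boto3 sketch for the Amazon results follows, using the same hypothetical local filename.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_16102.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns WORD and LINE detections; print the line-level text
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))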