Human Generated Data

Title

Untitled (two men wrestling on mat surrounded by spectators)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12099

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 99.2
Person 98.3
Person 96.7
Person 92.9
Person 90.3
Fitness 86.5
Working Out 86.5
Sport 86.5
Exercise 86.5
Sports 86.5
Person 85
Dance Pose 83.5
Leisure Activities 83.5
Person 76.1
Shorts 70.4
Clothing 70.4
Apparel 70.4
Person 68.2
Person 64.3
Acrobatic 60.2
Shoe 56.5
Footwear 56.5
Person 46
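
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. As a rough sketch only, not a record of how this page was generated, the following boto3 call would produce a comparable list; the file path and confidence threshold are placeholders.

```python
# Sketch: list label/confidence pairs for one image with AWS Rekognition DetectLabels.
# Assumes AWS credentials are configured; "photo.jpg" and MinConfidence=45 are placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # a low threshold keeps weak labels such as "Person 46"
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```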

Clarifai
created on 2023-10-25

dancer 99.5
people 98.9
ballet 97.7
dancing 97.4
ballerina 97
monochrome 96.5
woman 96.2
adult 95.2
motion 94.8
balance 94.1
body 94
skill 94
action energy 93.4
ballet dancer 93.1
athlete 92.9
man 92.4
exercise 92.3
group together 91.9
two 90.9
one 90.7

Imagga
created on 2022-01-15

billboard 21.5
signboard 17.4
man 16.1
car 15.9
adult 15.5
person 15
clothing 13.8
pose 13.6
people 13.4
attractive 13.3
vehicle 13.1
sport 12.9
body 12.8
black 12.7
structure 12.5
portrait 12.3
killer whale 12
exercise 11.8
happy 11.3
pretty 11.2
sitting 11.2
television 10.9
model 10.9
male 10.6
driving 10.6
fashion 10.5
fun 10.5
one 10.4
sexy 10.4
lifestyle 10.1
leisure 10
fitness 9.9
driver 9.8
posing 9.8
summer 9.6
dolphin 9.6
automobile 9.6
healthy 9.4
cute 9.3
relaxation 9.2
style 8.9
looking 8.8
hair 8.7
smile 8.5
mirror 8.4
screen 8.3
sunglass 8.1
equipment 8.1
road 8.1
transportation 8.1
women 7.9
look 7.9
travel 7.7
eyes 7.7
casual 7.6
dark 7.5
human 7.5
covering 7.5
outdoors 7.5
window 7.4
fit 7.4
light 7.3
toothed whale 7.2
activity 7.2
performer 7
sky 7
modern 7
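
The Imagga tags follow the shape of Imagga's v2 tagging endpoint, which reports confidences on a 0-100 scale. A hedged sketch under that assumption; the API key, secret, and image URL are placeholders.

```python
# Sketch: tag an image with Imagga's v2 /tags endpoint (HTTP Basic auth with key/secret).
# The API key, API secret, and image URL are placeholders.
import requests

API_KEY = "<imagga-api-key>"
API_SECRET = "<imagga-api-secret>"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Imagga reports confidence on a 0-100 scale, matching the values listed above.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```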

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 95.6
dance 86
outdoor 85.7
black and white 53.4
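
The Microsoft tags resemble output from Azure Computer Vision's Analyze Image operation, which reports confidences in the range 0-1 (listed here as percentages). A sketch under that assumption; the resource endpoint, API version, key, and file path are placeholders rather than details taken from this page.

```python
# Sketch: image tags from Azure Computer Vision's Analyze Image endpoint (assumed v3.2).
# The resource endpoint, subscription key, and file path are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Confidences come back in [0, 1]; the page lists them as percentages.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```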

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Calm 98.5%
Sad 1.3%
Confused 0.1%
Happy 0.1%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 84.9%
Calm 70.6%
Sad 10.1%
Happy 7%
Confused 3.1%
Angry 2.6%
Surprised 2.4%
Fear 2.4%
Disgusted 1.7%

AWS Rekognition

Age 34-42
Gender Male, 96.4%
Calm 54.5%
Happy 23.5%
Surprised 8%
Sad 5.3%
Confused 3.1%
Angry 2.5%
Disgusted 1.9%
Fear 1.2%
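
The three face records above, each with an age range, a gender call, and ranked emotion scores, match the structure of AWS Rekognition DetectFaces output when all facial attributes are requested. A minimal sketch, with a placeholder file path.

```python
# Sketch: per-face age range, gender, and emotion scores via AWS Rekognition DetectFaces.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotions, not just bounding boxes
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```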

Feature analysis

Amazon

Person 99.2%
Shoe 56.5%

Categories

Imagga

paintings art 98.9%

Text analysis

Amazon

PP
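
The detected string "PP" is the kind of result an OCR pass such as AWS Rekognition DetectText produces. A minimal sketch, again with a placeholder file path.

```python
# Sketch: text detection with AWS Rekognition DetectText.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

detections = rekognition.detect_text(Image={"Bytes": image_bytes})

# "LINE" entries group whole lines; "WORD" entries are single tokens such as "PP".
for text in detections["TextDetections"]:
    if text["Type"] == "LINE":
        print(text["DetectedText"], round(text["Confidence"], 1))
```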