Human Generated Data

Title

Untitled (man playing with paddle ball and hitting ball with his foot)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14503

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 96.9
Human 96.9
Apparel 93.8
Clothing 93.8
Flooring 93.4
Floor 87
Tie 84
Accessories 84
Accessory 84
Pants 78.6
Rug 76.4
Door 75.4
Photography 70.1
Photo 70.1
Portrait 70.1
Face 70.1
Female 65.5
Coat 64.5
Overcoat 64.5
Suit 64.5
Leisure Activities 62.9
Kicking 60.2
Dance Pose 59.8
Indoors 58.3
Finger 57.9
Sports 56.5
Sport 56.5
Martial Arts 55.3

Imagga
created on 2022-01-29

adult 25.1
people 23.4
man 22.8
cleaner 22.4
sport 21.9
person 21.4
exercise 20.9
body 18.4
silhouette 18.2
fitness 18.1
male 17.7
crutch 17.5
sunset 17.1
active 15.4
posing 15.1
staff 14.8
stick 14.6
dancer 13.9
lifestyle 13.7
beach 13.6
attractive 13.3
portrait 12.9
black 12.6
fashion 12.1
sexy 12
women 11.9
health 11.8
model 11.7
performer 11.6
athlete 11.3
human 11.2
action 11.1
summer 10.9
dance 10.8
outdoor 10.7
brass 10.5
sunrise 10.3
men 10.3
ocean 10.2
energy 10.1
sand 10.1
water 10
leisure 10
pose 10
dress 9.9
pretty 9.8
outdoors 9.7
sea 9.4
leg 9.1
wind instrument 9.1
recreation 9
sky 8.9
lady 8.9
style 8.9
sun 8.9
standing 8.7
running 8.6
legs 8.5
elegance 8.4
dark 8.4
training 8.3
vacation 8.2
sensuality 8.2
musical instrument 8.1
run 7.7
motion 7.7
skill 7.7
trombone 7.7
walking 7.6
healthy 7.6
landscape 7.4
sword 7.4
art 7.4
danger 7.3
happiness 7.1
modern 7

Microsoft
created on 2022-01-29

dance 97.3
wall 95.1
man 91.8
holding 90.3
text 85.6
clothing 84.4
person 75.8
black and white 66.4

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 100%
Happy 75.1%
Calm 21.7%
Disgusted 0.8%
Surprised 0.7%
Confused 0.6%
Sad 0.5%
Fear 0.3%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.9%
Tie 84%
Rug 76.4%

Captions

Microsoft

a man holding a kite 31.2%
a man holding a gun 31.1%
a man holding a kite while standing in a room 31%

Text analysis

Amazon

MJI7
MJI7 YT37AS
YT37AS