Human Generated Data

Title

Untitled (couple dancing the jitterbug and holding hands)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14644

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.8
Human 98.8
Person 98.6
Leisure Activities 94.1
Dance Pose 94.1
Sport 80.7
Sports 80.7
Sleeve 75.1
Clothing 75.1
Apparel 75.1
Stage 68.5
Shoe 64.3
Footwear 64.3
Long Sleeve 55.4
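
The label/score pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns, with scores given as confidence percentages. A minimal sketch of how such tags could be reproduced, assuming boto3 credentials are configured and photo.jpg is a local copy of the image (both assumptions, not part of the record):

import boto3

rekognition = boto3.client("rekognition")

# Read the image bytes and request labels at or above 55% confidence,
# roughly the lowest score that appears in the list above.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')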

Imagga
created on 2022-01-29

dancer 100
performer 100
entertainer 76.5
person 56.2
sport 34.9
dance 31.1
people 30.7
male 29.8
man 29.6
sword 26.4
active 25.2
exercise 24.5
adult 23.5
fitness 23.5
action 22.3
motion 21.4
weapon 21.3
lifestyle 21
body 20
athlete 19.5
silhouette 19
dancing 18.3
black 18
fun 18
fashion 16.6
style 16.3
group 15.3
posing 15.1
cool 14.2
men 13.7
team 13.4
performance 13.4
happy 13.2
training 12.9
teenager 12.8
player 12.7
pose 12.7
boy 12.2
fit 12
modern 11.9
attractive 11.9
elegance 11.8
hip 11.7
leisure 11.6
running 11.5
studio 11.4
human 11.3
sports 11.1
women 11.1
portrait 11
art 11
happiness 11
joy 10.9
healthy 10.7
ball 10.7
run 10.6
move 10.6
athletic 10.5
one 10.5
casual 10.2
street 10.1
model 10.1
energy 10.1
skill 9.6
jump 9.6
play 9.5
movement 9.4
youth 9.4
guy 9.2
competition 9.2
health 9
recreation 9
activity 9
acrobat 8.9
hop 8.9
sexy 8.8
aerobics 8.8
together 8.8
urban 8.7
couple 8.7
exercising 8.7
crowd 8.6
muscular 8.6
legs 8.5
pretty 8.4
dark 8.4
outdoors 8.2
looking 8
arts 7.8
fight 7.7
summer 7.7
outdoor 7.6
enjoying 7.6
runner 7.5
dress 7.2
music 7.2
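
The Imagga tags above come from Imagga's auto-tagging REST API. A rough sketch using the requests library, assuming an Imagga key/secret pair (placeholders below) and that the /v2/tags endpoint and response layout match Imagga's published documentation:

import requests

API_KEY = "your_imagga_key"        # placeholder credentials
API_SECRET = "your_imagga_secret"  # placeholder credentials

# Upload the image to Imagga's v2 tagging endpoint using HTTP Basic auth.
with open("photo.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries a confidence score and an English tag name.
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))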

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

dance 99.7
text 96.3
person 87.3

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 89.5%
Calm 99.9%
Surprised 0%
Happy 0%
Sad 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Female, 98.4%
Surprised 62.5%
Happy 30.5%
Calm 4.8%
Fear 0.8%
Sad 0.5%
Angry 0.4%
Disgusted 0.4%
Confused 0.2%
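
The two face records above (age range, gender, and per-emotion confidences) match what Rekognition's DetectFaces operation returns when full attributes are requested. A minimal sketch, again assuming a configured boto3 client and a local photo.jpg:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # {"Low": ..., "High": ...}
    gender = face["Gender"]     # {"Value": ..., "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as a list; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')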

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
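
The Google Vision rows report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision client library, assuming application default credentials are set up and photo.jpg is the image:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value such as VERY_UNLIKELY or UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)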

Feature analysis

Amazon

Person 98.8%
Shoe 64.3%

Captions

Microsoft

a man jumping in the air 69.3%
a close up of a man jumping in the air 59.9%
a man that is jumping in the air 59.8%
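
The candidate captions above, each with a confidence score, are the kind of description Azure Computer Vision's Describe Image operation produces. A rough sketch with the azure-cognitiveservices-vision-computervision SDK, assuming a Cognitive Services endpoint and key (both placeholders here):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"                                                   # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Ask for several candidate captions; each comes back with a 0-1 confidence.
with open("photo.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")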

Text analysis

Amazon

MJI7
MJI7 YT3RAS ACHA
YT3RAS
ACHA
80

Google

MJI7 YT3RA2 A
A
MJI7
YT3RA2
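
The strings above are OCR hits on writing visible in the image. A minimal sketch of the Amazon side using Rekognition's DetectText, with boto3 assumed configured as before:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both full LINE detections and the individual WORDs,
# which is why a string such as "MJI7 YT3RAS ACHA" appears alongside its parts.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))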