Human Generated Data

Title

Untitled (young couple dancing the Jitterbug)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14389

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99
Person 99
Person 98
Dance Pose 96.8
Leisure Activities 96.8
Sport 82.5
Sports 82.5
Clothing 77.9
Shorts 77.9
Apparel 77.9
Stage 74.7
Dance 73.8
Footwear 69.9
Shoe 69.9
Sleeve 58.3
Fencing 56
Shoe 53.3
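
These label/confidence pairs are the kind of output Amazon Rekognition's label detection returns. Below is a minimal sketch of how such tags could be reproduced; the image file name and the confidence threshold are placeholders, not part of this record.

import boto3

rekognition = boto3.client("rekognition")

# Read the photograph from disk; the file name is a placeholder.
with open("jitterbug_1946.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed threshold, not documented in this record
    )

# Print label/confidence pairs in the same shape as the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))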

Imagga
created on 2022-01-29

dancer 92
performer 70.8
entertainer 50
person 44.1
sport 38.3
man 35.6
male 34.1
exercise 31.8
active 29
people 28.5
fitness 27.1
sword 25.3
ball 24.7
adult 24.3
player 23.7
lifestyle 21.7
athlete 21.1
fun 21
weapon 20.3
dance 20.1
action 18.6
competition 18.3
play 16.4
motion 16.3
body 16
court 15.6
men 15.5
health 15.3
activity 15.2
professional 15
boy 14.8
athletic 14.4
recreation 14.3
racket 13.8
fit 13.8
happy 13.8
tennis 13.6
one 13.4
outdoors 13.4
game 13.4
leisure 13.3
healthy 13.2
black 13.2
outdoor 13
teacher 12.8
style 12.6
attractive 12.6
dancing 12.5
silhouette 12.4
summer 12.2
sports 12
team 11.6
run 11.6
posing 11.6
running 11.5
women 11.1
casual 11
playing 10.9
joy 10.9
exercising 10.6
fashion 10.6
performance 10.5
outside 10.3
happiness 10.2
street 10.1
teen 10.1
teenager 10
pose 10
cool 9.8
human 9.8
group 9.7
couple 9.6
basketball 9
educator 8.9
businessman 8.8
urban 8.7
full length 8.7
match 8.7
equipment 8.6
move 8.6
model 8.6
youth 8.5
business 8.5
portrait 8.4
pretty 8.4
training 8.3
together 7.9
tournament 7.8
serve 7.8
practice 7.7
modern 7.7
studio 7.6
net 7.6
balance 7.6
runner 7.6
energy 7.6
elegance 7.6
vacation 7.4
looking 7.2
sky 7
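
The Imagga tags above follow the shape of the service's /v2/tags response. A minimal sketch, assuming placeholder credentials and file name:

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

# Upload the image to the tagging endpoint using HTTP basic auth.
with open("jitterbug_1946.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries an English tag name and a confidence score.
for entry in resp.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))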

Microsoft
created on 2022-01-29

dance 99.1
text 94.4
ice skating 84
footwear 60.6
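
The Microsoft tags above resemble output from the Azure Computer Vision tagging operation. A minimal sketch, assuming a placeholder endpoint, key, and file name:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),          # placeholder key
)

# Tag the image and print name/confidence pairs.
with open("jitterbug_1946.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))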

Feature analysis

Amazon

Person 99%
Shoe 69.9%

Captions

Microsoft

a man standing next to a window 53.7%
a man standing in front of a window 53.6%
an old photo of a man 53.5%
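
The three caption candidates above match what the Azure Computer Vision describe operation returns. A minimal sketch, again with placeholder endpoint, key, and file name:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),          # placeholder key
)

# Request several caption candidates and print each with its confidence.
with open("jitterbug_1946.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(caption.text, round(caption.confidence * 100, 1))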

Text analysis

Google

The
The
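
The detected strings above are consistent with Google Cloud Vision text detection. A minimal sketch with a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Load the photograph and run text detection (OCR).
with open("jitterbug_1946.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected block; later ones are individual words.
for annotation in response.text_annotations:
    print(annotation.description)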