Human Generated Data

Title

Untitled (young couple dancing the Jitterbug)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14386

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Dance Pose 99.9
Leisure Activities 99.9
Human 98.4
Person 98.4
Person 97.7
Dance 90
Stage 86.9
Clothing 62
Apparel 62
Ballet 56.7
Female 55.5
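
Each Amazon tag above is paired with a Rekognition confidence score, expressed as a percent. Below is a minimal sketch, assuming boto3 and a local copy of the image, of how such labels can be requested; the file name, label limit, and confidence threshold are illustrative placeholders, not the values behind this record.

# Sketch: request image labels from Amazon Rekognition via boto3.
# File name, MaxLabels, and MinConfidence are placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("jitterbug_1946.jpg", "rb") as f:   # hypothetical local copy of the photograph
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')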

Imagga
created on 2022-01-29

sword 94.3
weapon 74.6
dancer 47.1
person 37.9
sport 32.5
performer 29.9
man 29.6
male 26.3
exercise 23.6
people 23.4
active 22.7
action 22.3
style 22.3
fitness 21.7
lifestyle 21.7
adult 20.5
fashion 18.9
entertainer 18.2
motion 18
dance 18
fun 17.2
attractive 16.8
posing 15.1
teenager 14.6
pretty 14
black 13.8
athlete 13.8
pose 13.6
portrait 13.6
dancing 13.5
women 13.5
one 13.4
cool 13.3
modern 13.3
happy 13.2
player 12.9
sexy 12.9
model 12.5
studio 12.2
men 12
street 12
youth 11.9
ball 11.8
happiness 11.8
dress 11.8
hip 11.7
court 11.7
leisure 11.6
performance 11.5
lady 11.4
standing 11.3
outdoors 11.2
body 11.2
casual 11
professional 11
freestyle 10.9
hop 10.8
healthy 10.7
move 10.6
couple 10.5
boy 10.4
sports 10.2
teen 10.1
playing 10
stylish 10
outdoor 9.9
activity 9.9
hip hop 9.9
aerobics 9.8
tennis 9.7
summer 9.7
jump 9.6
looking 9.6
love 9.5
guy 9.4
joy 9.2
health 9
human 9
urban 8.7
play 8.6
smile 8.6
balance 8.5
energy 8.4
elegance 8.4
competition 8.2
recreation 8.1
game 8
smiling 8
rap 7.9
together 7.9
artist 7.7
break 7.6
racket 7.6
dark 7.5
boxer 7.5
suit 7.5
life 7.3
cheerful 7.3
business 7.3
cute 7.2
bright 7.2
day 7.1
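
The Imagga list likewise pairs each tag with a confidence score. A minimal sketch of a request to the Imagga v2 tagging API follows; the API credentials and image URL are placeholders.

# Sketch: request auto-tags from the Imagga v2 REST API (basic auth).
# Credentials and image URL are placeholders.
import requests

api_key = "YOUR_IMAGGA_API_KEY"
api_secret = "YOUR_IMAGGA_API_SECRET"
image_url = "https://example.org/jitterbug_1946.jpg"   # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')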

Microsoft
created on 2022-01-29

dance 99.7
wall 96.4
text 92.2
clothing 80.8
ballet 80.3
person 74.9
woman 65.8
dress 63.5
dancing 51.1
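
The Microsoft tags also carry confidence scores (the service reports values between 0 and 1, shown above on a 0-100 scale). A minimal sketch using the Azure Computer Vision tag endpoint follows; the endpoint, key, and image URL are placeholders.

# Sketch: image tags with confidences from Azure Computer Vision (tag endpoint).
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),  # placeholder key
)

result = client.tag_image("https://example.org/jitterbug_1946.jpg")  # hypothetical image URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")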

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 78.8%
Surprised 41.9%
Happy 22.7%
Calm 15.3%
Confused 11.9%
Sad 2.6%
Disgusted 2.2%
Fear 1.7%
Angry 1.7%

AWS Rekognition

Age 24-34
Gender Female, 54.4%
Calm 99.7%
Sad 0.2%
Surprised 0%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%
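
The two Rekognition blocks above correspond to two detected faces, each with an estimated age range, a gender guess, and per-emotion confidences. A minimal boto3 sketch that reports the same fields follows; the file path is a hypothetical stand-in for the actual image.

# Sketch: per-face age range, gender, and emotion confidences from Amazon Rekognition.
# The file path is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("jitterbug_1946.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],      # needed to get age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')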

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
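
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages, one block per detected face. A minimal sketch with the google-cloud-vision client (2.x) follows; the file path is a placeholder and credentials are assumed to come from the standard application-default setup.

# Sketch: per-face likelihoods from the Google Cloud Vision API (google-cloud-vision >= 2.0).
# The file path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("jitterbug_1946.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)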

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a man standing next to a window 48.9%
a man standing in front of a window 48.8%
a man posing for a picture 48.7%
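
The Microsoft captions are alternative one-line descriptions ranked by confidence. A minimal sketch of the Azure Computer Vision describe endpoint follows; the endpoint, key, and image URL are placeholders.

# Sketch: candidate captions from Azure Computer Vision (describe endpoint).
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),  # placeholder key
)

description = client.describe_image(
    "https://example.org/jitterbug_1946.jpg",   # hypothetical image URL
    max_candidates=3,
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")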

Text analysis

Amazon

MJI7
MJI7 YT3RAS
M
1 M
1
YT3RAS
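
The Amazon entries above are OCR detections of text visible in the image. A minimal boto3 sketch of the Rekognition text-detection call follows; the file path is a placeholder.

# Sketch: detect text (lines and words) in the image with Amazon Rekognition.
# The file path is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("jitterbug_1946.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # "LINE" entries are whole lines; "WORD" entries are the individual tokens.
    print(detection["Type"], detection["DetectedText"])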