Human Generated Data

Title

Untitled (two couples dancing in room with flag)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14288

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Leisure Activities 99.5
Dance Pose 99.5
Person 98.7
Human 98.7
Person 98.1
Clothing 96.5
Apparel 96.5
Person 92.7
Dance 91.1
Tango 85.2
Fashion 69.3
Gown 69.3
Robe 67.6
Female 64.9
Wedding 62.7
Bridegroom 58.2
Hug 57.7
Photography 56.9
Portrait 56.9
Face 56.9
Photo 56.9
Wedding Gown 56.8
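The Amazon tags above pair each label with a confidence score, in the style of Amazon Rekognition's DetectLabels output. A minimal sketch of filtering such a response by a confidence threshold (the sample values are copied from the list above; the response shape and the 80.0 threshold are illustrative assumptions, not part of the record):

```python
# Sketch: filtering Rekognition-style label output by confidence.
# The dict shape mirrors a DetectLabels response ({"Labels": [{"Name", "Confidence"}]});
# the sample values are taken from the tag list above.
response = {
    "Labels": [
        {"Name": "Leisure Activities", "Confidence": 99.5},
        {"Name": "Dance Pose", "Confidence": 99.5},
        {"Name": "Tango", "Confidence": 85.2},
        {"Name": "Wedding Gown", "Confidence": 56.8},
    ]
}

def confident_labels(resp, threshold=80.0):
    """Return label names at or above the given confidence threshold."""
    return [label["Name"] for label in resp["Labels"]
            if label["Confidence"] >= threshold]

print(confident_labels(response))  # keeps only the high-confidence labels
```

Lower-confidence labels such as "Wedding Gown" (56.8) drop out at this threshold, which is why tag lists like the one above are often truncated before display.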

Imagga
created on 2022-01-29

dancer 34.6
person 34.1
adult 31.3
people 29
fashion 27.9
model 24.9
attractive 22.4
dance 21.8
performer 21.5
portrait 20.7
posing 20.5
pretty 20.3
dress 19.9
lady 19.5
elegance 19.3
body 17.6
black 17.5
hair 17.4
women 17.4
human 16.5
lifestyle 15.9
man 15.6
elegant 15.4
sexy 14.5
legs 14.2
style 14.1
brunette 13.9
teacher 13.8
male 13.7
sensuality 13.6
exercise 13.6
fitness 13.6
professional 13.5
indoors 13.2
look 13.1
wall 12.8
active 12.6
sport 12.4
entertainer 12.1
face 12.1
domestic 12
motion 12
passion 11.3
looking 11.2
street 11.1
cute 10.8
happy 10.7
dancing 10.6
educator 10.6
performance 10.5
modern 10.5
urban 10.5
fashionable 10.4
clothing 10.4
sitting 10.3
men 10.3
action 10.2
casual 10.2
pose 10
gorgeous 10
studio 9.9
fun 9.7
one 9.7
shoes 9.6
standing 9.6
city 9.2
teenager 9.1
sensual 9.1
make 9.1
life 8.8
full length 8.7
couple 8.7
love 8.7
moving 8.6
walking 8.5
health 8.3
makeup 8.2
alone 8.2
indoor 8.2
leg 8.2
lovely 8
home 8
interior 8
smiling 8
business 7.9
smile 7.8
child 7.8
run 7.7
silhouette 7.5
room 7.4
fit 7.4
light 7.4
girls 7.3
group 7.3
romantic 7.1
cool 7.1
art 7.1
happiness 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 94.4
footwear 86.3
clothing 82.5
window 80.2
person 73.4
woman 70.3
black and white 54.7

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
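The two face-analysis blocks above report Google Vision likelihood ratings, which the API expresses as ordered enum names (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores; the two detected faces differ only in the "Blurred" field. A small sketch, assuming that enum ordering, of comparing likelihood strings against a floor:

```python
# Sketch: working with Google Vision face-annotation likelihood strings.
# The ordering below follows the Vision API's Likelihood enum; the two
# face dicts restate the ratings listed above (only "blurred" differs).
LIKELIHOOD_ORDER = [
    "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

faces = [
    {"joy": "VERY_UNLIKELY", "blurred": "VERY_LIKELY"},
    {"joy": "VERY_UNLIKELY", "blurred": "VERY_UNLIKELY"},
]

def is_at_least(likelihood, floor):
    """True when a likelihood string is at or above the given floor."""
    return LIKELIHOOD_ORDER.index(likelihood) >= LIKELIHOOD_ORDER.index(floor)

blurred_faces = [f for f in faces if is_at_least(f["blurred"], "LIKELY")]
print(len(blurred_faces))  # counts faces rated LIKELY-or-above for blur
```

Treating the enum as an ordered scale like this is a common way to threshold face attributes when the API gives categories instead of percentages.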

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a person standing in front of a window 74%
a man and a woman standing in front of a window 42.4%
a person standing next to a window 42.3%
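The Microsoft captions above come ranked by confidence, in the style of an Azure image-description result, and a display page typically shows only the top one. A minimal sketch of selecting the best caption (the caption/score pairs are copied from the list above; the tuple layout is an illustrative assumption):

```python
# Sketch: choosing the highest-confidence caption from an
# Azure-style describe-image result. The sample pairs restate
# the caption list above, with percentages rewritten as fractions.
captions = [
    ("a person standing in front of a window", 0.740),
    ("a man and a woman standing in front of a window", 0.424),
    ("a person standing next to a window", 0.423),
]

best_text, best_conf = max(captions, key=lambda c: c[1])
print(best_text)  # the caption with the highest confidence
```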

Text analysis

Amazon

MJI7
MJI7 YESTAL A70A
A70A
YESTAL

Google

MJ17
YT37A2
A73
A
MJ17 YT37A2 A73 A