Human Generated Data

Title

Untitled (semi-nude dancer on stage)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20217

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.4
Apparel 99.4
Person 97.9
Human 97.9
Dress 96.9
Female 96.5
Chair 90.1
Furniture 90.1
Woman 88.9
Floor 87
Dance Pose 86
Leisure Activities 86
Wheel 72.7
Machine 72.7
Face 71.6
Portrait 71.6
Photography 71.6
Photo 71.6
Indoors 71.4
Girl 67.8
Door 61.5
Flooring 60.9
Home Decor 59.6
Suit 59.4
Coat 59.4
Overcoat 59.4
Animal 58.9
Mammal 58.9
Canine 58.9
Costume 55.9
Living Room 55.5
Room 55.5

Imagga
created on 2022-03-05

harp 24.5
dress 23.5
people 19
statue 18.2
person 17.3
weapon 15.9
old 15.3
portrait 14.9
sculpture 14.9
bride 14.4
religion 14.3
man 14.1
building 13.9
adult 13.9
wedding 13.8
lady 13.8
fashion 13.6
device 13.6
male 13.5
architecture 13.4
stringed instrument 13.4
pretty 13.3
church 12.9
art 12.7
day 12.5
happy 12.5
city 12.5
outdoors 11.9
couple 11.3
looking 11.2
women 11.1
musical instrument 11
model 10.9
clothing 10.3
love 10.3
holy 9.6
god 9.6
ancient 9.5
happiness 9.4
house 9.4
religious 9.4
monument 9.3
column 9.3
two 9.3
smile 9.3
elegance 9.2
face 9.2
posing 8.9
sacred 8.8
sword 8.7
temple 8.7
faith 8.6
bow 8.5
support 8.2
one 8.2
catholic 8.1
family 8
interior 8
smiling 8
attractive 7.7
gown 7.7
married 7.7
life 7.6
marriage 7.6
human 7.5
traditional 7.5
antique 7.4
style 7.4
groom 7.3
makeup 7.3
cheerful 7.3
hair 7.1
travel 7
wall 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

black and white 90.1
dress 85.8
text 72.2
wedding dress 64.9
woman 55.5

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 99.1%
Happy 73.4%
Sad 16.3%
Calm 6.1%
Confused 1.4%
Surprised 1.1%
Disgusted 0.8%
Fear 0.4%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%
Wheel 72.7%

Captions

Microsoft

a person doing a trick on a skateboard 26.2%
a person standing in a room 26.1%

Text analysis

Amazon

SAA

Google

YT33A°2-
YT33A°2- XAGON
XAGON