Human Generated Data

Title

Ladies Room

Date

1979

People

Artist: Sharon Smith, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.1669

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.6
Human 98.6
Person 96.6
Smoking 87.6
Smoke 87.6
Text 56.6
Skin 55.6
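
The figures after each tag are confidence scores on a 0-100 scale. As a minimal sketch, tags like these could be reproduced with AWS Rekognition's detect_labels call via boto3; the file name and region below are placeholders, not part of the original record:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # "ladies_room.jpg" is a hypothetical local scan of the print.
    with open("ladies_room.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Each label carries a name and a 0-100 confidence, matching the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")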

Clarifai
created on 2023-10-26

people 99.3
portrait 98.7
woman 98.4
adult 97.2
monochrome 95.9
one 94
man 93.5
retro 93.1
two 91
fashion 89.2
actor 88.3
girl 88.2
music 87.3
wear 87
screen 86.7
mirror 85.3
model 82
business 80.1
technology 79.5
couple 79.4
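
Clarifai's concept scores (0-1 internally, shown here as percentages) could be fetched with a plain HTTP call to its v2 predict endpoint. A sketch, assuming a general-recognition model; the key, model ID, and file name are placeholders to check against Clarifai's current docs:

    import base64

    import requests

    # Placeholder credentials and model ID.
    CLARIFAI_KEY = "YOUR_API_KEY"
    URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        payload = {
            "inputs": [
                {"data": {"image": {"base64": base64.b64encode(f.read()).decode()}}}
            ]
        }

    resp = requests.post(URL, json=payload,
                         headers={"Authorization": f"Key {CLARIFAI_KEY}"})
    resp.raise_for_status()

    # Concepts come back with 0-1 values; scale to match the percentages above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")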

Imagga
created on 2022-01-09

adult 37
portrait 29.8
person 28.4
people 25.1
attractive 23.8
black 23.2
love 22.9
sexy 22.5
man 22.2
pretty 20.3
happy 20.1
couple 20
male 19.3
face 19.2
model 18.7
fashion 15.8
brunette 15.7
smile 15.7
microphone 15.2
hair 15.1
bride 14.4
style 13.4
holding 13.2
smiling 13
lifestyle 13
women 12.7
sitting 12
close 12
sensual 11.8
groom 11.1
two 11
happiness 11
dress 10.8
cheerful 10.6
looking 10.4
skin 10.3
car 10.2
wedding 10.1
dark 10
cover girl 10
sensuality 10
gorgeous 10
office 9.9
bow tie 9.9
kiss 9.8
lady 9.7
loving 9.5
eyes 9.5
erotic 9.4
passion 9.4
youth 9.4
business 9.1
lingerie 9.1
make 9.1
human 9
necktie 9
romance 8.9
handsome 8.9
intimacy 8.9
husband 8.9
passionate 8.8
smasher 8.8
look 8.8
professional 8.7
hug 8.7
cute 8.6
corporate 8.6
wife 8.5
head 8.4
computer 8.3
businesswoman 8.2
posing 8
working 8
kissing 7.9
hands 7.8
embrace 7.8
lovers 7.7
boyfriend 7.7
elegant 7.7
girlfriend 7.7
expression 7.7
talking 7.6
studio 7.6
elegance 7.6
device 7.3
girls 7.3
confident 7.3
body 7.2
romantic 7.1
lovely 7.1
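
Imagga's tagging endpoint accepts a direct image upload with HTTP basic auth and already reports confidences on a 0-100 scale. A minimal sketch with placeholder credentials and file name:

    import requests

    # Placeholder key/secret pair; Imagga uses HTTP basic auth.
    AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        resp = requests.post("https://api.imagga.com/v2/tags",
                             auth=AUTH, files={"image": f})
    resp.raise_for_status()

    # Tags arrive sorted by confidence, like the list above.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")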

Google
created on 2022-01-09

(no tags returned)

Microsoft
created on 2022-01-09

human face 96.9
text 88.9
person 87.9
woman 79.6
black and white 74.7
portrait 52.5
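
Tags of this shape could come from Azure Computer Vision's image-tagging endpoint. A sketch, assuming the v3.2 REST API; the resource endpoint, key, and file name are placeholders:

    import requests

    # Placeholder Azure resource endpoint and key.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_KEY"

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/tag",
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )
    resp.raise_for_status()

    # Confidences are 0-1; scale to match the percentages above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")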

Color Analysis

(no color data recorded)

Face analysis

AWS Rekognition (face 1)

Age 27-37
Gender Female, 100%
Surprised 79.2%
Calm 9.1%
Sad 3.7%
Fear 3.3%
Confused 2%
Angry 1.5%
Happy 0.8%
Disgusted 0.4%

AWS Rekognition (face 2)

Age 24-34
Gender Female, 99.9%
Calm 73%
Sad 19.4%
Angry 4.3%
Fear 0.9%
Disgusted 0.8%
Confused 0.6%
Surprised 0.6%
Happy 0.5%
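
Each face record combines an estimated age range, a gender guess, and a ranked list of emotions, which is the shape of AWS Rekognition's detect_faces output when all attributes are requested. A minimal sketch (file name and region are placeholders):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )

    # One FaceDetail per detected face; this record lists two.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"],
                              key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")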

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
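
Unlike Rekognition, Google Vision reports each facial attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage. A sketch using the google-cloud-vision client; the file name is a placeholder and credentials are taken from the environment:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # One annotation per face, each attribute a Likelihood enum.
    for face in response.face_annotations:
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            print(name, vision.Likelihood(value).name)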

Feature analysis

Amazon

Person 98.6%

Categories

Imagga

pets animals 99.9%

Captions

Microsoft
created on 2022-01-09

graphical user interface 35.9%
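
A single caption with a modest confidence is the shape of output Azure Computer Vision's describe endpoint returns; the low score here reflects how unsure the model is about this image. A sketch with the same placeholder endpoint and key as above:

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"  # placeholder

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/describe",
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )
    resp.raise_for_status()

    # Captions carry a text string and a 0-1 confidence.
    for caption in resp.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")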

Text analysis

Amazon

Ladies
Room
The Ladies Room
The
197
Smith
the Smith
the
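
Rekognition's detect_text returns whole LINE detections as well as the individual WORDs inside them, which is why "The Ladies Room" appears here alongside its component words. A minimal sketch (file name and region are placeholders):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Type is LINE or WORD; lines and their words are listed separately.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")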

Google

The hadies Laem
hadies
The
Laem
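
Google Vision's text detection behaves similarly: the first annotation is the full detected string ("The hadies Laem", apparently a misread of the handwritten title), followed by one annotation per word. A sketch with a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

    with open("ladies_room.jpg", "rb") as f:  # hypothetical local scan
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First entry is the full string; the rest are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)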