Human Generated Data

Title

Untitled (two women, seated, half-length, striped background)

Date

c. 1940

People

Artist: Michael Disfarmer, American, 1884 - 1959

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2007.287

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.5
Person 99.5
Person 99.4
Face 97.7
Accessories 95.5
Glasses 95.5
Accessory 95.5
Clothing 88.8
Apparel 88.8
Text 86.8
Female 79.8
Smile 69.5
Dating 64.4
Photo 64.3
Photography 64.3
Portrait 64.3
Girl 60.8
Woman 60.5
Hair 60.3
Man 56.7

Imagga
created on 2022-03-05

mug shot 100
photograph 80.8
representation 62.5
creation 45.8
attractive 31.5
portrait 30.4
people 27.4
adult 27.2
pretty 26.6
person 26.6
model 21.8
hair 21.4
sexy 20.9
fashion 20.4
happy 20.1
face 19.9
man 17.5
women 17.4
two 17
lifestyle 16.6
love 16.6
couple 16.6
cute 16.5
business 16.4
body 16
brunette 14.8
smiling 14.5
male 14.2
computer 13.6
lady 13
smile 12.8
skin 12.8
laptop 12.8
sensual 12.7
erotic 12.3
posing 11.6
black 11.4
human 11.3
sitting 11.2
youth 11.1
sensuality 10.9
star 10.8
cheerful 10.6
style 10.4
expression 10.2
indoor 10
sibling 9.9
girlfriend 9.6
looking 9.6
home 9.6
actor 9.5
mother 9.3
old 9.1
blond 8.9
romantic 8.9
office 8.8
husband 8.6
happiness 8.6
professional 8.5
passion 8.5
dark 8.4
makeup 8.2
fun 8.2
alone 8.2
technology 8.2
student 8.2
performer 8
daughter 7.9
boy 7.8
eyes 7.8
corporate 7.7
naked 7.7
boyfriend 7.7
sexual 7.7
money 7.7
wife 7.6
elegance 7.6
executive 7.6
one 7.5
vintage 7.4
room 7.4
care 7.4
girls 7.3
confident 7.3
businesswoman 7.3
make 7.3
pose 7.3
currency 7.2
handsome 7.1
family 7.1
working 7.1
modern 7
together 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 99.9
text 99.7
human face 99.1
smile 98.5
person 97.4
indoor 96.6
clothing 95.8
woman 84.9
handwriting 84.7
glasses 69.4
posing 69.3
picture frame 9.7

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 100%
Happy 94.6%
Disgusted 1.6%
Surprised 1.3%
Fear 1%
Confused 0.6%
Angry 0.5%
Sad 0.3%
Calm 0.2%

AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 96.6%
Calm 1.6%
Surprised 0.8%
Fear 0.3%
Angry 0.3%
Disgusted 0.2%
Confused 0.1%
Sad 0.1%

Microsoft Cognitive Services

Age 48
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Glasses 95.5%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 84.9%
a man and woman standing in front of a mirror posing for the camera 58.8%
a couple of people standing in front of a mirror posing for the camera 58.7%

Text analysis

Amazon

Pouline
Pouline Verser
anna
Verser
anna Lee Versex
Versex
Lee

Google

Lie.
Verser
Pouline
Viseu
Pouline Viseu Anna Lie. Verser
Anna