Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1066

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label and confidence score)

Amazon
created on 2019-04-07

Apparel 99.8
Clothing 99.8
Person 97.6
Human 97.6
Skin 96
Person 95.2
Coat 84.1
Hat 71.8
Face 69
Cap 68.2
Jacket 67.3
People 62.8
Overcoat 60.3
Scarf 57
Feather Boa 57
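
The label/confidence pairs above match the output format of Amazon Rekognition's DetectLabels call. A minimal sketch with boto3; the file name and region are placeholders, not taken from this record, and AWS credentials are assumed to be configured:

# Sketch: fetching label/confidence pairs like the Amazon tags above
# with Rekognition's DetectLabels. File name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,         # cap the number of labels returned
    MinConfidence=50.0,   # drop labels below 50% confidence
)

# Print "Label confidence" pairs, e.g. "Apparel 99.8"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")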

Clarifai
created on 2018-03-23

people 99.9
adult 99
woman 97.8
two 97.4
one 97.3
portrait 97.3
wear 96
street 95.2
man 94.1
administration 92.6
music 92.3
monochrome 89.8
movie 88.7
musician 88.3
veil 87.5
outerwear 87.5
actress 86.2
actor 85.9
child 85.5
outfit 85.2
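
The Clarifai concepts above can be requested in a similar way. A rough sketch against Clarifai's v2 REST predict endpoint; the API key, model ID, and image URL are placeholders, and the exact model name, auth scheme, and response shape vary by account and API version:

# Sketch: requesting concept tags like the Clarifai list above.
# API key, model ID, and image URL are placeholders / assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed public general model
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 value; scale to match the
# percentages shown above, e.g. "people 99.9"
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")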

Imagga
created on 2018-03-23

adult 30.4
portrait 29.1
sexy 28.9
fashion 28.7
face 28.4
person 28
people 27.3
pretty 23.1
hair 23
man 22.9
model 22.6
attractive 22.4
dark 20.9
hair spray 19.4
black 19.3
male 18.6
sensual 18.2
sensuality 16.4
human 15.8
hairdresser 15.6
toiletry 15.5
lady 15.4
posing 15.1
love 15
one 14.9
body 14.4
style 14.1
couple 13.9
expression 13.7
dress 13.6
women 13.4
studio 12.9
elegant 12.9
skin 12.7
clothing 12.4
looking 12
makeup 12
salon 11.8
happy 11.3
lips 11.1
make 10.9
sexual 10.6
erotic 10.6
passion 10.3
room 10.3
happiness 10.2
lifestyle 10.1
elegance 10.1
romantic 9.8
together 9.6
mystery 9.6
brunette 9.6
20s 9.2
fantasy 9
vogue 8.7
desire 8.7
cute 8.6
eyes 8.6
hairstyle 8.6
simpleton 8.6
hand blower 8.6
wall 8.6
two 8.5
world 8.4
costume 8.4
mother 8.3
vintage 8.3
mask 8.1
device 8.1
sadness 7.8
luxury 7.7
coat 7.7
relaxation 7.5
relationship 7.5
emotion 7.4
retro 7.4
indoor 7.3
girls 7.3
parent 7.3
gorgeous 7.2
smiling 7.2
smile 7.1
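
The Imagga tags follow the same pattern. A sketch against Imagga's v2 tagging endpoint; the key, secret, and image URL are placeholders, and the endpoint details are an assumption based on Imagga's public REST API:

# Sketch: requesting tags like the Imagga list above from the v2 /tags
# endpoint. API key/secret and image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key and secret
)
response.raise_for_status()

# Imagga reports one confidence per tag, e.g. "adult 30.4"
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")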

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 100
woman 95.9
outdoor 95.8
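
The Microsoft tags are the kind of result the Azure Computer Vision analyze endpoint returns. A sketch using the REST API; the resource endpoint, key, image URL, and API version are placeholders and should be checked against the current Azure documentation:

# Sketch: requesting tags like the Microsoft list above from the Azure
# Computer Vision analyze endpoint. Endpoint, key, and version are
# placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Tag confidences come back as 0-1 values, e.g. "person 100"
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")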

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 99.2%
Happy 3.6%
Disgusted 7.6%
Surprised 3.7%
Calm 51.1%
Angry 6.6%
Sad 22.3%
Confused 5.2%

AWS Rekognition

Age 17-27
Gender Male, 61.7%
Confused 6.7%
Surprised 2.5%
Happy 0.8%
Angry 5.3%
Disgusted 2.3%
Sad 11.6%
Calm 70.8%
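
The two AWS Rekognition entries above (age range, gender, and per-emotion confidences, one block per detected face) correspond to Rekognition's DetectFaces call with full attributes requested. A minimal boto3 sketch; file name and region are placeholders:

# Sketch: face attributes like the AWS Rekognition entries above come
# from DetectFaces with Attributes=["ALL"]. File name and region are
# placeholders; boto3 credentials are assumed to be configured.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # One confidence per emotion, e.g. "Calm 51.1%"
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")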

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 36
Gender Female
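
The Microsoft Cognitive Services estimates above (a single age and gender per face) are the kind of output the Azure Face API detect call produced when the age and gender attributes were requested. A sketch; endpoint, key, and image URL are placeholders, and Microsoft has since restricted access to these attributes, so the call may be rejected on newer resources:

# Sketch: age/gender estimates like the Microsoft entries above from the
# Azure Face API. Endpoint, key, and image URL are placeholders; the
# age/gender attributes are no longer generally available.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_FACE_KEY"                                     # placeholder
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# One entry per detected face, e.g. "Age 29" / "Gender Female"
for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")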

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
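
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages; this is the shape of Cloud Vision's face detection output. A sketch with the google-cloud-vision client library; the file name is a placeholder and a recent release of the library with application credentials configured is assumed:

# Sketch: likelihood buckets like the Google Vision rows above come from
# Cloud Vision face detection. File name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY or POSSIBLE
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)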

Feature analysis

Amazon

Person 97.6%

Text analysis

Amazon

AMB
WRIOLIY
LIGS
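
The text fragments above are the kind of output Rekognition's DetectText returns for signage in a photograph. A minimal boto3 sketch; file name and region are placeholders:

# Sketch: detected text like the Amazon fragments above from
# Rekognition's DetectText. File name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD, with its own confidence
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")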

Google

AMB WRIO
AMB
WRIO
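
The Google results above (a combined string followed by individual words) match the shape of Cloud Vision text_detection output, where the first annotation is the full detected text and the rest are the individual words. A sketch with the google-cloud-vision client; the file name is a placeholder:

# Sketch: detected text like the Google results above from Cloud Vision
# text detection. File name is a placeholder; credentials are assumed.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# First annotation is the whole text block, the rest are single words
for annotation in response.text_annotations:
    print(annotation.description)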