Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2838

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2023-10-05

Body Part 100
Finger 100
Hand 100
Clothing 100
Face 99.7
Head 99.7
Photography 99.7
Portrait 99.7
Person 99.4
Adult 99.4
Female 99.4
Woman 99.4
Person 99.1
Adult 99.1
Female 99.1
Woman 99.1
Hat 98.1
Person 97.2
Accessories 96.9
Bag 96.9
Handbag 96.9
Glasses 93
Person 86.5
Sun Hat 79.5
Person 79.5
Wallet 79.3
Lady 75.2
Hat 71.9
Cap 57.6
Electrical Device 57.1
Microphone 57.1
Purse 56.4
Crowd 55.7
Baseball Cap 55.7
Blouse 55.2
Sunglasses 55.2
Electronics 55.1
Phone 55.1
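
The labels above follow the output shape of AWS Rekognition's DetectLabels operation. As a rough illustration only, a minimal sketch in Python with boto3 might look like the following; the file name and the 55-point confidence floor are assumptions, not values taken from the museum record.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of the museum record.
with open("untitled_new_york_city.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the listed tags bottom out around 55, so a similar floor is assumed
    )

# Print each label with its confidence score, highest first, roughly as shown above.
for label in sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True):
    print(f'{label["Name"]} {label["Confidence"]:.1f}')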

Clarifai
created on 2018-05-10

people 99.9
adult 99.4
woman 98.2
portrait 97.9
wear 97
one 95.3
actress 94.9
two 93.7
monochrome 93.5
facial expression 93.1
man 92.9
veil 91.2
administration 90.4
musician 89.9
street 88.1
outfit 88.1
music 87.5
group 86.6
singer 83.9
indoors 81.4

Imagga
created on 2023-10-05

person 28.9
portrait 28.5
people 27.3
adult 26.8
man 24.9
face 22.7
sexy 22.5
model 21.8
fashion 21.1
male 20.9
black 18.6
attractive 17.5
chain mail 17.1
happy 16.9
dress 16.3
pretty 16.1
smiling 15.9
style 15.6
smile 15
lady 14.6
clothing 14.3
armor 14.1
body armor 13.6
women 13.4
couple 13.1
blond 13
hat 12.9
elegant 12.8
hair 12.7
love 12.6
lifestyle 12.3
one 11.9
vintage 11.6
spectator 11.4
human 11.2
body 11.2
expression 11.1
sensual 10.9
elegance 10.9
posing 10.7
child 10.6
looking 10.4
passion 10.3
work 10.3
youth 10.2
indoors 9.7
performer 9.7
together 9.6
brunette 9.6
men 9.4
happiness 9.4
worker 9.4
cute 9.3
casual 9.3
bow tie 9.2
mother 9.2
traditional 9.1
old 9.1
fun 9
luxury 8.6
necktie 8.3
leisure 8.3
inside 8.3
retro 8.2
cheerful 8.1
family 8
boy 7.8
vogue 7.7
two 7.6
wearing 7.6
studio 7.6
hot 7.5
relationship 7.5
shop 7.5
world 7.5
holding 7.4
light 7.3
makeup 7.3
indoor 7.3
sensuality 7.3
make 7.3
stylish 7.2
home 7.2
handsome 7.1
romantic 7.1
protective covering 7.1
kid 7.1
job 7.1
interior 7.1
modern 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.9
outdoor 94.3
people 67.1
crowd 1.6

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 100%
Calm 86.4%
Surprised 8.1%
Confused 6.3%
Fear 6.3%
Sad 2.3%
Happy 1%
Angry 0.8%
Disgusted 0.5%

AWS Rekognition

Age 45-51
Gender Female, 100%
Confused 95.7%
Surprised 6.4%
Fear 6.1%
Sad 2.2%
Calm 1.6%
Angry 1.1%
Disgusted 0.4%
Happy 0.1%

AWS Rekognition

Age 10-18
Gender Male, 89%
Calm 70.2%
Surprised 12.8%
Fear 6.6%
Sad 5.1%
Happy 4.2%
Disgusted 3.8%
Confused 3%
Angry 1.4%

AWS Rekognition

Age 4-10
Gender Male, 77.2%
Calm 98.4%
Surprised 6.3%
Fear 6.1%
Sad 2.3%
Confused 0.1%
Disgusted 0.1%
Happy 0%
Angry 0%
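
The age-range, gender, and emotion estimates above match the output shape of AWS Rekognition's DetectFaces operation. A minimal sketch with boto3, again assuming a hypothetical local file name, could look like this.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of the museum record.
with open("untitled_new_york_city.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
    )

# One block per detected face, mirroring the listing above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')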

Microsoft Cognitive Services

Age 68
Gender Male

Microsoft Cognitive Services

Age 43
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
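
The likelihood ratings above correspond to fields of the face annotations returned by the Google Cloud Vision API. A minimal sketch with the google-cloud-vision client library, assuming a hypothetical local file name, could be:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph; not part of the museum record.
with open("untitled_new_york_city.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def show(label, value):
    # Render the enum name (e.g. VERY_UNLIKELY) the way the page above does ("Very unlikely").
    print(label, vision.Likelihood(value).name.replace("_", " ").capitalize())

# One block per detected face, mirroring the listing above.
for face in response.face_annotations:
    show("Surprise", face.surprise_likelihood)
    show("Anger", face.anger_likelihood)
    show("Sorrow", face.sorrow_likelihood)
    show("Joy", face.joy_likelihood)
    show("Headwear", face.headwear_likelihood)
    show("Blurred", face.blurred_likelihood)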

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Female 99.4%
Woman 99.4%
Hat 98.1%
Glasses 93%
Wallet 79.3%

Categories