Human Generated Data

Title

Untitled (Bethune Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2931

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 99.5
Person 99.5
Person 99.2
Shorts 89.7
Apparel 89.7
Clothing 89.7
Bag 64.9
Barefoot 57.2
Sack 56.1

Clarifai
created on 2018-03-23

people 100
adult 99.1
one 98.4
two 98
portrait 95
administration 93.6
woman 93.5
man 93.4
three 91.6
wear 89.5
group 86.5
group together 85.1
actress 84.5
furniture 83.6
leader 83
music 82.9
facial expression 80.7
seat 79.2
boxer 77.2
child 76.7

Imagga
created on 2018-03-23

adult 33.7
person 30.8
people 27.9
attractive 26.6
child 25.4
man 24.9
male 22.1
happy 21.9
portrait 21.3
family 20.5
lifestyle 19.5
love 18.9
pretty 18.9
interior 18.6
fashion 18.1
casual 17.8
home 17.5
couple 16.5
parent 16.1
sexy 16.1
together 15.8
holding 15.7
couch 15.5
sofa 15.3
mother 15.1
hair 15.1
sibling 14.5
smiling 13.7
sitting 13.7
happiness 13.3
model 13.2
lady 13
smile 12.8
one 12.7
women 12.6
father 12.1
black 12.1
human 12
neck brace 12
style 11.9
call 11.7
blond 11.6
cute 11.5
looking 11.2
20s 11
sensual 10.9
dress 10.8
world 10.8
face 10.6
old 10.4
body 10.4
relationship 10.3
two 10.2
room 10.1
elegance 10.1
dad 10
romantic 9.8
brace 9.7
adults 9.5
elegant 9.4
care 9
daughter 9
life 9
fun 9
romance 8.9
handsome 8.9
kid 8.9
brunette 8.7
boy 8.7
support 8.6
jeans 8.6
expression 8.5
passion 8.5
kin 8.4
playing 8.2
hugging 7.8
sepia 7.8
hug 7.7
modern 7.7
married 7.7
youth 7.7
relax 7.6
females 7.6
ball 7.6
joy 7.5
chair 7.5
leisure 7.5
emotion 7.4
retro 7.4
cheerful 7.3
strengthener 7.3
sensuality 7.3
son 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99.7
sitting 98.4

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-38
Gender Female, 95.1%
Disgusted 1.4%
Surprised 0.9%
Angry 1.1%
Sad 2.8%
Confused 1.9%
Happy 0.6%
Calm 91.3%

AWS Rekognition

Age 4-9
Gender Female, 98%
Angry 1.8%
Disgusted 1%
Calm 12.5%
Happy 0.3%
Sad 81.8%
Surprised 1.1%
Confused 1.5%

Microsoft Cognitive Services

Age 38
Gender Female

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a person sitting on a bench 85.9%
a man and a woman sitting on a bench 75.9%
a person sitting on a bench talking on a cell phone 61.2%