Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2873

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 99.5
Person 99.5
Person 99
Footwear 93.7
Apparel 93.7
Shoe 93.7
Clothing 93.7
Accessories 90.8
Accessory 90.8
Tie 90.8
Sitting 84.9
Furniture 56
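
Tags like those listed above could be produced with the AWS Rekognition DetectLabels API, which returns label names with confidence scores from 0 to 100. A minimal sketch using boto3 follows; the image file name and region are illustrative assumptions, not part of this record.

    import boto3

    # Hypothetical client and image; region and file name are assumptions.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_new_york_city.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # comparable to the tag/confidence pairs listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")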

Clarifai
created on 2018-03-23

people 99.9
adult 99.3
group 97.5
man 96.9
two 96.1
one 95.3
furniture 94
war 93.6
military 93.5
seat 93.2
group together 92.7
administration 92.5
sit 91.5
three 90.7
music 89.9
leader 89.5
room 89.5
woman 89
wear 87.8
chair 86.9

Imagga
created on 2018-03-23

blackboard 45.5
grunge 39.2
old 35.5
vintage 29.8
wall 28.2
texture 25.7
aged 24.4
antique 24.2
grungy 22.8
dirty 21.7
retro 21.3
ancient 20.7
space 20.2
musical instrument 19.4
empty 18.9
frame 17.5
pattern 16.4
decay 15.4
rusty 15.2
art 15
textured 14.9
border 14.5
old fashioned 14.3
dark 14.2
structure 13.8
paper 13.3
black 13.2
design 12.9
paint 12.7
material 12.5
stained 12.5
percussion instrument 12.1
keyboard instrument 12.1
upright 12.1
car 11.8
rough 11.8
industrial 11.8
wheeled vehicle 11.6
wallpaper 11.5
dirt 11.5
damaged 11.4
weathered 11.4
blank 11.1
vehicle 11.1
snow 11
abandoned 10.7
room 10.6
concrete 10.5
canvas 10.4
graphic 10.2
grain 10.1
piano 10
backdrop 9.9
building 9.8
parchment 9.6
freight car 9.6
brown 9.6
worn 9.5
danger 9.1
style 8.9
color 8.9
faded 8.8
stains 8.7
stringed instrument 8.7
text 8.7
torn 8.7
decorative 8.3
historic 8.2
device 8.2
transportation 8.1
detail 8
decoration 8
destruction 7.8
grime 7.8
fracture 7.8
crumpled 7.8
stain 7.7
aging 7.7
spot 7.7
barrow 7.7
wind instrument 7.4
light 7.4

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 98.5
man 92.8
outdoor 86.3
old 80.8
black 66.6
white 60.9

Color Analysis

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 93.2%
Confused 0.8%
Happy 4.5%
Sad 2.1%
Angry 1.2%
Disgusted 1.3%
Surprised 1.4%
Calm 88.9%

AWS Rekognition

Age 57-77
Gender Male, 98.4%
Disgusted 1.1%
Surprised 0.9%
Angry 1.9%
Sad 6.4%
Confused 0.9%
Happy 1.2%
Calm 87.6%
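
The two AWS Rekognition face records above (age range, gender, and per-emotion confidences) match the shape of the DetectFaces response. A minimal sketch with boto3 is shown below, assuming a local copy of the photograph; the file name and region are placeholders.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions for each detected face.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")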

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Shoe 93.7%
Tie 90.8%