Human Generated Data

Title

Untitled (Japan)

Date

March 14, 1960-April 22, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3258

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.6
Human 99.6
Transportation 95.8
Vehicle 95.8
Bicycle 95.8
Bike 95.8
Shop 90.5
Person 86.1
Newsstand 82.3
Text 79.5
Bicycle 71.7
Wheel 63
Machine 63
Silhouette 57.7

Clarifai
created on 2018-03-23

people 99.8
adult 98.9
one 98.3
group 98.1
vehicle 97.9
war 96.6
two 96.4
administration 95.6
man 95.2
commerce 93.5
room 92.2
group together 92
three 90.6
wear 89
many 88.3
military 87.7
several 86.7
education 86.4
transportation system 86.1
home 83.5

Imagga
created on 2018-03-23

shop 22.3
old 20.9
wall 20.5
newspaper 20.5
building 19.7
person 18.2
warehouse 18
mercantile establishment 17.6
city 17.4
business 16.4
man 16.1
product 15.9
urban 15.7
construction 15.4
aged 15.4
tobacco shop 14.6
locker 14
industry 13.6
industrial 13.6
work 13.3
texture 13.2
adult 13.1
people 12.8
grunge 12.8
dirty 12.6
creation 12.6
brick 12.5
vintage 12.4
male 12.1
place of business 11.6
door 11.4
fastener 11.2
worker 10.9
office 10.9
outdoors 10.4
antique 10.4
black 10.3
pattern 10.2
device 10.2
stall 9.5
room 9.2
street 9.2
safety 9.2
house 9.2
wood 9.2
window 9.2
box 9.1
interior 8.8
working 8.8
wooden 8.8
restraint 8.8
home 8.8
equipment 8.2
retro 8.2
music 8.1
metal 8
structure 7.9
design 7.9
gate 7.9
paper 7.8
architecture 7.8
ancient 7.8
travel 7.7
entrance 7.7
money 7.6
storage 7.6
finance 7.6
sign 7.5
file 7.5
close 7.4
world 7.4
exterior 7.4
teen 7.3
financial 7.1
job 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 93.1
outdoor 85.1

Color Analysis

Face analysis
AWS Rekognition

Age 23-38
Gender Female, 54.5%
Happy 50.1%
Surprised 47.9%
Disgusted 45.2%
Confused 45.4%
Angry 45.5%
Calm 45.1%
Sad 45.8%

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Bicycle 95.8%
Wheel 63%

Categories

Text analysis

Amazon

e.