Human Generated Data

Title

Untitled (Artists' Union demonstration?, New York City)

Date

1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5589

Machine Generated Data

Tags

Amazon
created on 2019-11-11

Musical Instrument 99.7
Tuba 99.7
Brass Section 99.7
Euphonium 99.7
Horn 99.7
Human 97.8
Person 97.8
Person 92.4
Person 44.5
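
The machine-generated tags in these sections are plain `label score` lines. A minimal sketch of turning them into (label, confidence) pairs and keeping only high-confidence tags — the sample text and the 90.0 cutoff are illustrative assumptions, not part of the record:

```python
# Sample "label score" lines copied from the Amazon tags above.
raw = """Musical Instrument 99.7
Tuba 99.7
Brass Section 99.7
Person 44.5"""

def parse_tags(text):
    """Split each line into (label, confidence); the score is the last token,
    so multi-word labels like 'Brass Section' are preserved."""
    tags = []
    for line in text.strip().splitlines():
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

tags = parse_tags(raw)
confident = [t for t in tags if t[1] >= 90.0]
print(confident)  # keeps the three tags scored 99.7
```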

Clarifai
created on 2019-11-11

people 99.4
vehicle 98.9
adult 98.1
man 96.9
one 95.9
monochrome 95.4
transportation system 95.1
military 90.3
music 90.3
war 88.1
group together 87.7
uniform 87.6
outfit 86.9
group 86.8
wear 85.7
portrait 85.4
street 82.8
aircraft 82.7
two 82.4
musician 81.6

Imagga
created on 2019-11-11

bass 73.7
brass 53.5
wind instrument 42.7
sax 36.1
musical instrument 25.9
baritone 23.9
metal 20.1
machine 19.8
industrial 19
music 18.6
man 16.8
steel 16.8
gramophone 16.3
industry 16.2
device 15.7
technology 15.6
factory 15.4
black 13.8
protection 13.6
military 13.5
record player 13.1
cornet 12.9
safety 12.9
soldier 12.7
mask 12.6
worker 12.4
male 12
work 11.8
manufacturing 11.7
war 10.6
smoke 10.2
power 10.1
light 10
danger 10
welding 9.9
equipment 9.8
gun 9.8
uniform 9.7
gas 9.6
welder 8.9
musician 8.8
job 8.8
helmet 8.8
concert 8.7
rock 8.7
skill 8.7
engine 8.7
protect 8.7
musical 8.6
modern 8.4
future 8.4
hot 8.4
hand 8.4
person 8.3
occupation 8.2
horn 8.2
retro 8.2
transportation 8.1
weld 7.9
weapon 7.8
pipe 7.8
3d 7.7
engineering 7.6
sound 7.5
instrument 7.5
trombone 7.1

Google
created on 2019-11-11

Microsoft
created on 2019-11-11

text 95.6
musical instrument 92.7
brass 90.2
music 84.1
trumpet 59.5
person 56

Color Analysis

Face analysis

AWS Rekognition

Age 39-57
Gender Male, 54%
Sad 49.5%
Calm 47.3%
Surprised 45.6%
Confused 45.3%
Fear 45.5%
Happy 46%
Disgusted 45.1%
Angry 45.6%

AWS Rekognition

Age 22-34
Gender Male, 50.8%
Angry 45.2%
Confused 45.1%
Fear 45.1%
Disgusted 45.2%
Calm 51.1%
Happy 45.3%
Surprised 45.1%
Sad 48%

AWS Rekognition

Age 29-45
Gender Female, 52.3%
Disgusted 45%
Sad 45%
Fear 45.1%
Surprised 52%
Happy 45%
Calm 47.8%
Confused 45.1%
Angry 45%

AWS Rekognition

Age 23-35
Gender Female, 53.5%
Angry 6.2%
Sad 18%
Disgusted 1.7%
Surprised 12.5%
Happy 7.4%
Fear 21.5%
Calm 30.7%
Confused 2%
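
Each AWS Rekognition face record above reports a confidence score per emotion. A small sketch of selecting the dominant emotion, using the values from the fourth face record above (the selection rule — take the highest-scoring emotion — is an illustrative assumption):

```python
# Emotion scores copied from the fourth AWS Rekognition face record.
emotions = {
    "Angry": 6.2, "Sad": 18.0, "Disgusted": 1.7, "Surprised": 12.5,
    "Happy": 7.4, "Fear": 21.5, "Calm": 30.7, "Confused": 2.0,
}

# Take the highest-scoring emotion as the dominant one.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```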

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.8%

Categories

Text analysis

Google

SP
SP