Human Generated Data

Title

Untitled (New York City)

Date

1932–1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2920

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Smoke Pipe 98.7
Clothing 98.7
Coat 98.7
Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Captain 95.6
Officer 95.6
Adult 95.1
Male 95.1
Man 95.1
Person 95.1
Face 77.1
Head 77.1
Accessories 65.5
Bag 65.5
Handbag 65.5
Hat 56.8
Overcoat 56.6
Military 55.4
Military Uniform 55.1
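
The Amazon tags above are the kind of output Amazon Rekognition's label-detection API returns. As a hedged sketch only (the museum's actual pipeline is not documented here), labels like these can be requested with boto3; the file name and the MinConfidence threshold are assumptions.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local copy of the photograph.
    with open("untitled_new_york_city.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed cutoff; the lowest score shown above is 55.1
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

The repeated Adult/Male/Man/Person rows above most likely reflect per-instance confidences for the several figures detected in the photograph.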

Clarifai
created on 2018-05-10

people 99.9
administration 98.9
adult 98.5
group together 98.1
military 97.8
leader 97.2
uniform 96.8
group 96.6
man 96.1
soldier 95.9
war 95.2
outfit 93.6
wear 93.4
military uniform 93.1
several 90.7
portrait 89
two 88.1
one 87.6
police 87.5
three 86.7
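
The Clarifai tags follow the same pattern. A minimal sketch against Clarifai's v2 predict REST endpoint, assuming the public general model and an app-scoped API key; the key, model ID, and file name are placeholders, and Clarifai reports concept values on a 0-1 scale, scaled to percentages here to match the list above.

    import base64
    import requests

    with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical local copy
        image_b64 = base64.b64encode(f.read()).decode()

    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},  # placeholder key
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    )

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")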

Imagga
created on 2023-10-06

prison 99.7
correctional institution 80.2
penal institution 60.2
institution 40
cell 28.2
building 26.1
architecture 22.7
establishment 20.5
industry 20.5
man 19.5
construction 18
adult 17.5
urban 16.6
people 16.2
city 15.8
travel 15.5
industrial 15.4
business 14.6
person 14.5
work 14.1
old 13.9
worker 13.3
men 12.9
house 12.5
wall 12
structure 11.1
safety 11
black 10.9
helmet 10.7
male 10.6
steel 10.6
modern 10.5
portrait 10.3
window 10.2
street 10.1
uniform 9.9
labor 9.7
factory 9.6
hat 9.1
tourism 9.1
transportation 9
builder 8.8
job 8.8
interior 8.8
concrete 8.6
engineering 8.6
clothing 8.4
iron 8.4
inside 8.3
outdoors 8.2
metal 8
women 7.9
engineer 7.7
office 7.6
stone 7.6
power 7.6
brick 7.5
site 7.5
equipment 7.5
manager 7.4
town 7.4
new 7.3
home 7.2
balcony 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99
person 96.9
man 93.4
old 71.9
white 65.3

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 100%
Confused 54.1%
Calm 18%
Surprised 12.2%
Fear 7.2%
Angry 6.6%
Sad 3.9%
Disgusted 3.2%
Happy 2.2%

AWS Rekognition

Age 31-41
Gender Female, 60.5%
Calm 90.8%
Surprised 6.6%
Fear 6%
Confused 5.2%
Sad 2.4%
Happy 1.1%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 56-64
Gender Male, 99.8%
Confused 56.3%
Calm 20.1%
Sad 15.2%
Surprised 7%
Fear 6.3%
Happy 3.4%
Disgusted 1.6%
Angry 1.5%
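
The three AWS Rekognition blocks above correspond to three detected faces. As a hedged boto3 sketch of the face-detection call that yields age ranges, gender estimates, and emotion scores like these (the file name is an assumption):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")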

Microsoft Cognitive Services

Age 63
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
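
Google Vision, by contrast, reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch with the google-cloud-vision client library; the file name is an assumption.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes GCP credentials are configured

    with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)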

Feature analysis

Amazon

Smoke Pipe 98.7%
Adult 97.6%
Male 97.6%
Man 97.6%
Person 97.6%
Handbag 65.5%
Hat 56.8%

Text analysis

Amazon

MEMBER
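
The single detected word ("MEMBER", presumably from signage in the scene) is the kind of result Rekognition's text-detection API returns. A hedged boto3 sketch; the file name is an assumption.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Print each detected line of text, e.g. "MEMBER" above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])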