Human Generated Data

Title

Untitled (New York City)

Date

1932–1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2981

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 99.1
Person 99
Adult 99
Male 99
Man 99
Person 98.8
Person 98.4
Person 97.5
Adult 97.5
Male 97.5
Man 97.5
Person 97.4
Person 96.9
Adult 96.9
Male 96.9
Man 96.9
Machine 90.3
Wheel 90.3
People 90
Person 89.3
Person 88.9
Outdoors 72
Clothing 65.4
Footwear 65.4
Shoe 65.4
Funeral 64.5
Hat 62.4
Worker 57.9
Shoe 56.6
Architecture 55.8
Building 55.8
Factory 55.8
Coat 55.3
Wood 55

Clarifai
created on 2018-05-10

people 100
group 99.7
group together 99.6
adult 99.4
many 99.2
military 97.5
two 96.9
several 96.8
vehicle 95.9
man 95.8
three 95.7
administration 95.2
uniform 94.9
war 93.1
wear 91.8
four 91.4
soldier 91.1
five 91.1
one 90.4
leader 89.9

Imagga
created on 2023-10-06

vehicle 92.2
half track 86.1
military vehicle 75.9
tracked vehicle 74.2
conveyance 44.9
wheeled vehicle 40.9
old 23.7
bench 18.9
transportation 15.2
grass 15
military 14.5
car 14.4
tree 13.8
war 13.6
wheel 13.2
industrial 12.7
landscape 12.6
stretcher 12.4
tank 12.4
rural 12.3
gun 12.3
work 11.8
dirty 11.7
architecture 11.7
wood 11.7
city 11.6
park 11.5
park bench 11.5
industry 11.1
transport 11
road 10.8
farm 10.7
travel 10.6
seat 10.5
building 10.5
truck 10.5
tractor 10.4
litter 10.4
weapon 10.3
uniform 10.2
field 10
soldier 9.8
army 9.7
wagon 9.7
cannon 9.6
dirt 9.5
snow 9.5
machine 9.4
power 9.2
outdoor 9.2
artillery 9.1
history 8.9
sky 8.9
camouflage 8.8
hay 8.8
military uniform 8.3
danger 8.2
clothing 8.1
yellow 7.9
day 7.8
scene 7.8
antique 7.8
construction 7.7
winter 7.7
house 7.5
iron 7.5
street 7.4
protection 7.3
color 7.2
tire 7.1
male 7.1
working 7.1
country 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 94
old 56.7

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.7%
Calm 59.9%
Happy 20.7%
Surprised 18.9%
Fear 5.9%
Confused 3.2%
Sad 2.2%
Angry 0.7%
Disgusted 0.5%

AWS Rekognition

Age 28-38
Gender Female, 93.4%
Happy 99.1%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.2%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 55.9%
Confused 18.1%
Happy 13.2%
Surprised 9.1%
Fear 6.2%
Sad 3.6%
Angry 2.5%
Disgusted 1.2%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Happy 63.6%
Calm 28.6%
Surprised 6.6%
Fear 6%
Sad 4%
Angry 1.1%
Disgusted 0.9%
Confused 0.4%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Happy 97.5%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Calm 0.6%
Disgusted 0.6%
Angry 0.6%
Confused 0.1%

AWS Rekognition

Age 31-41
Gender Male, 98.4%
Happy 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Calm 0.4%
Disgusted 0%
Confused 0%
Angry 0%

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 52
Gender Male

Feature analysis

Amazon

Person 99.1%
Adult 99%
Male 99%
Man 99%
Wheel 90.3%
Shoe 65.4%
Hat 62.4%