Human Generated Data

Title

Untitled (Jersey Homesteads, New Jersey)

Date

1939

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3572

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Architecture 98.5
Building 98.5
Outdoors 98.5
Shelter 98.5
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Person 98.1
Person 98.1
Adult 97.9
Person 97.9
Female 97.9
Woman 97.9
Person 97.9
Person 97.8
Adult 96.8
Male 96.8
Man 96.8
Person 96.8
Person 96.6
People 96
Person 95.6
Person 94.2
Person 92.9
Person 87
Person 80
Bus Stop 69.6
Boarding 67.6
Head 67.4
Person 64.4
Nature 60.2
Machine 55.8
Wheel 55.8
Hospital 55.2
Urban 55.2
Back 55.1
Body Part 55.1
Clothing 55
Coat 55
Monastery 55

Clarifai
created on 2018-05-10

people 100
many 99.7
group 99.4
group together 99.3
adult 99.2
vehicle 97.6
military 96.8
war 96.2
administration 95.6
man 94.8
several 92.4
wear 91.5
transportation system 91.3
woman 90.4
leader 88.7
child 87.6
soldier 86.7
aircraft 85.3
police 84.1
offense 82.5

Imagga
created on 2023-10-07

stretcher 46.9
litter 37.4
uniform 35
military uniform 29.5
conveyance 28.9
clothing 24.7
man 20.1
danger 20
people 19.5
male 18.4
military 17.4
street 16.6
war 16.4
person 15.1
soldier 14.7
helmet 14.4
city 13.3
gun 12.9
army 12.7
consumer goods 12.2
covering 12.1
weapon 11.7
urban 11.4
building 11.4
men 11.2
safety 11
industrial 10.9
transportation 10.8
destruction 10.7
adult 10.6
protection 10
camouflage 9.8
disaster 9.8
old 9.7
work 9.4
fire 9.4
vehicle 9.2
dirty 9
road 9
scene 8.7
architecture 8.6
dangerous 8.6
business 8.5
travel 8.4
room 8.3
sport 8.1
activity 8.1
car 7.9
black 7.8
accident 7.8
protective 7.8
emergency 7.7
power 7.6
smoke 7.4
equipment 7.4
group 7.3
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

group 96
person 87.5
outdoor 86.5
standing 84.7
people 79

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 68.6%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 5.8%
Confused 2.9%
Angry 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 10-18
Gender Male, 86.1%
Calm 65.4%
Sad 17.3%
Confused 11.3%
Surprised 7.6%
Fear 6%
Disgusted 2.1%
Angry 1.5%
Happy 1.2%

AWS Rekognition

Age 18-26
Gender Female, 68%
Sad 99.8%
Calm 21%
Fear 7.2%
Surprised 6.6%
Angry 2.2%
Confused 2%
Happy 1.1%
Disgusted 0.7%

AWS Rekognition

Age 21-29
Gender Female, 95.8%
Fear 52.2%
Calm 27.7%
Sad 18.5%
Surprised 10.5%
Happy 3%
Disgusted 1.7%
Angry 1.6%
Confused 0.5%

AWS Rekognition

Age 6-12
Gender Female, 50.9%
Calm 67.4%
Confused 12.6%
Fear 10.8%
Surprised 7%
Sad 3.2%
Happy 2.3%
Disgusted 2%
Angry 1.5%

Feature analysis

Amazon

Adult 98.2%
Male 98.2%
Man 98.2%
Person 98.2%
Female 97.9%
Woman 97.9%
Wheel 55.8%

Categories