Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1895

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Worker 99.7
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 98.3
Person 97.7
Adult 97.7
Male 97.7
Man 97.7
War 96.8
Person 96.4
Person 94.9
Clothing 91.8
Glove 91.8
Hat 86.2
Machine 82.9
Wheel 82.9
Person 77.8
Person 74.8
Person 70.8
Person 69.4
Person 65.4
Face 64
Head 64
Construction 57.4
Oilfield 56.8
Outdoors 56.8

Clarifai
created on 2018-05-11

people 99.8
group together 98.9
man 97.9
adult 97.8
group 97.6
military 94.1
vehicle 93.8
many 93.6
war 90.4
transportation system 90.2
two 88.5
three 87.6
watercraft 86.3
woman 84.1
five 82.7
four 82.1
soldier 81.3
several 80.6
street 77.7
wear 76.6

Imagga
created on 2023-10-06

marimba 100
percussion instrument 100
musical instrument 89.2
man 25.5
male 24.8
people 24
silhouette 19
vibraphone 17.7
person 17.6
work 17.2
device 16.6
men 16.3
construction 12.8
industry 12.8
industrial 12.7
adult 12.3
worker 11.5
black 11.4
sitting 11.2
sport 10.9
lifestyle 10.8
outdoor 10.7
stretcher 10.6
equipment 10.4
team 9.8
fun 9.7
working 9.7
gas 9.6
outdoors 8.9
sky 8.9
uniform 8.7
helmet 8.7
winter 8.5
business 8.5
smoke 8.4
playing 8.2
protection 8.2
danger 8.2
technology 8.2
recreation 8.1
group 8.1
guitar 7.9
women 7.9
day 7.8
machine 7.8
portrait 7.8
outside 7.7
litter 7.7
two 7.6
hand 7.6
power 7.5
tool 7.5
platform 7.4
color 7.2
sunset 7.2
job 7.1
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

man 94
person 92.7
outdoor 85.1
old 68.1

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 81.4%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 14-22
Gender Male, 99.9%
Sad 92.9%
Calm 37.6%
Angry 10.2%
Fear 7%
Surprised 6.6%
Disgusted 2%
Confused 1.6%
Happy 0.3%

AWS Rekognition

Age 14-22
Gender Female, 97.9%
Calm 82.4%
Surprised 6.8%
Fear 6.7%
Angry 6.3%
Sad 3.5%
Disgusted 2.2%
Happy 1.7%
Confused 0.7%

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Glove 91.8%
Wheel 82.9%