Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1669

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.9
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Fence 99
Outdoors 98.4
Person 97.9
Nature 97.8
Person 97.2
Person 97.1
Yard 94.2
Coat 90.6
Face 70.2
Head 70.2
Hat 69.3
Picket Fence 67.1
Garden 57.7
Picket 57.3
Gardener 57.2
Gardening 57.2
Sun Hat 57
Handrail 56.7
Overcoat 55.7
Formal Wear 55.2
Suit 55.2
Cap 55

Clarifai
created on 2018-05-11

people 99.9
adult 98.9
one 98.9
group together 98.2
man 97.8
two 96.3
group 96.2
home 95.3
wear 94.8
administration 94.2
street 93.2
veil 92.8
outfit 92.4
vehicle 92.3
leader 91
war 90.4
military 89.1
many 88.9
recreation 85.9
soldier 82.7

Imagga
created on 2023-10-06

marimba 100
percussion instrument 100
musical instrument 89.5
snow 29.6
winter 27.2
cold 22.4
man 21.5
trees 17.8
people 17.3
landscape 17.1
outdoor 16.8
building 16
male 15.6
house 15
park 14.8
season 14.8
outdoors 13.4
sky 13.4
old 13.2
person 13.2
tree 13.1
balcony 12.5
city 12.5
forest 12.2
worker 12.1
street 12
architecture 11.7
snowy 10.7
couple 10.4
home 10.4
men 10.3
business 9.7
adult 9.7
frost 9.6
frozen 9.5
construction 9.4
happy 9.4
weather 9.3
fence 9.2
road 9
fun 9
history 8.9
work 8.6
black 8.4
sport 8.2
light 8
river 8
smiling 8
scenic 7.9
holiday 7.9
urban 7.9
day 7.8
wood 7.5
town 7.4
new 7.3
active 7.2
happiness 7
rural 7
travel 7
steel drum 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99
black 65.3
white 65.2
old 45.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 97%
Surprised 6.7%
Fear 5.9%
Sad 2.4%
Confused 0.5%
Angry 0.5%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 14-22
Gender Male, 99.6%
Calm 62%
Angry 34.1%
Surprised 6.3%
Fear 6.1%
Sad 2.9%
Happy 0.4%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 11-19
Gender Male, 98.6%
Calm 68.9%
Angry 22.9%
Surprised 7.1%
Fear 6.1%
Sad 3.1%
Confused 1.5%
Disgusted 1.1%
Happy 0.7%

AWS Rekognition

Age 21-29
Gender Male, 81.1%
Angry 70.8%
Calm 20.6%
Surprised 7%
Fear 6%
Happy 3.4%
Sad 2.9%
Confused 0.8%
Disgusted 0.5%

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Coat 90.6%
Hat 69.3%

Categories