Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1880

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Worker 100
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 97.9
Adult 97.9
Male 97.9
Man 97.9
Person 96.8
Machine 92.6
Wheel 92.6
Person 90.7
Person 90.7
Person 64.9
Face 64.6
Head 64.6
Construction 55.9
Clothing 55.3
Coat 55.3
Hat 55.3

Clarifai
created on 2018-05-11

people 100
group together 99.6
adult 99.4
group 99.3
man 98.2
two 98
vehicle 97.1
many 95.6
one 95.2
military 94.6
three 93.8
war 93.5
four 93.3
five 90.9
woman 89.9
recreation 89.9
watercraft 89.8
transportation system 89.8
soldier 88.9
wear 87.6

Imagga
created on 2023-10-06

marimba 100
percussion instrument 100
musical instrument 100
man 30.9
male 29.1
people 28.4
silhouette 23.2
person 19.1
adult 17
lifestyle 15.9
sitting 15.5
sunset 15.3
couple 14.8
men 13.7
outdoors 11.9
women 11.9
device 11.8
vibraphone 11.6
business 11.5
hand 11.4
together 11.4
happy 11.3
happiness 11
group 10.5
boy 10.4
portrait 10.3
evening 10.3
love 10.3
work 10.2
day 10.2
human 9.7
outdoor 9.2
music 9
black 9
musician 9
sky 8.9
technology 8.9
family 8.9
smiling 8.7
two 8.5
summer 8.4
fun 8.2
laptop 8.2
cheerful 8.1
job 8
businessman 7.9
life 7.9
table 7.8
construction 7.7
casual 7.6
relax 7.6
togetherness 7.5
relaxation 7.5
friendship 7.5
leisure 7.5
teen 7.3
teenager 7.3
playing 7.3
office 7.2
home 7.2
worker 7.1
working 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 95.6
wall 95.5
man 95.4
black 69.3
old 64.5
white 63.5

Face analysis

Amazon

AWS Rekognition

Age 48-54
Gender Male, 99.1%
Calm 98%
Surprised 6.3%
Fear 6%
Sad 2.2%
Disgusted 0.7%
Angry 0.3%
Confused 0.2%
Happy 0.1%

AWS Rekognition

Age 33-41
Gender Male, 95.2%
Sad 64.1%
Angry 43.6%
Calm 12.6%
Surprised 9%
Fear 6.6%
Happy 3.5%
Disgusted 1.8%
Confused 1.3%

AWS Rekognition

Age 26-36
Gender Male, 98.8%
Calm 63.7%
Disgusted 12.4%
Surprised 6.9%
Fear 6.3%
Confused 5.8%
Angry 5.6%
Sad 5.2%
Happy 3.6%

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Wheel 92.6%

Categories