Human Generated Data

Title

Untitled (relief station, Urbana, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.101

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 99.9
Coat 99.5
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
People 98.9
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Person 98.2
Person 97.9
Person 97.7
Person 97.4
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Person 96.9
Adult 96.6
Adult 96.6
Person 96.6
Bride 96.6
Female 96.6
Female 96.6
Wedding 96.6
Woman 96.6
Person 94.5
Baby 94.5
Person 92.7
Baby 92.7
Art 91.6
Painting 91.6
Face 87.9
Head 87.9
Adult 87.8
Person 87.8
Female 87.8
Woman 87.8
Person 87.4
Person 87.3
Photography 80.9
Portrait 80.9
Person 79.1
Person 77.6
Jeans 72.1
Pants 72.1
Outdoors 69.1
Hat 60.8
Nature 57.4
Bonnet 56.6
Architecture 56.6
Building 56.6
Hospital 56.6
Cap 55.7
Wall 55.5
Dress 55.1

Clarifai
created on 2018-05-11

people 99.9
group 99.1
many 97.8
group together 96.8
adult 95.6
administration 94.9
man 94.4
child 93.6
several 89.8
woman 89.8
leader 88.9
chair 85
education 84.9
room 82.1
wear 80.7
home 78.9
war 78.7
family 77.9
boy 77.5
sit 76.7

Imagga
created on 2023-10-05

man 27.5
classroom 24.8
room 24.6
person 23.7
people 22.9
male 20.7
old 17.4
adult 16.2
couple 15.7
religion 13.4
family 13.3
hospital 12
home 12
happy 11.9
teacher 11.3
religious 11.2
men 11.2
patient 11.1
happiness 11
history 10.7
businessman 10.6
nurse 10.4
mother 10.4
portrait 10.3
child 10.1
20s 10.1
tourism 9.9
statue 9.9
team 9.8
business 9.7
medical 9.7
group 9.7
30s 9.6
standing 9.6
catholic 9.5
day 9.4
smiling 9.4
architecture 9.4
senior 9.4
church 9.2
clothing 9.1
40s 8.8
ancient 8.6
god 8.6
building 8.5
mature 8.4
father 8.3
camera 8.3
uniform 8.3
care 8.2
aged 8.1
new 8.1
worker 8
indoors 7.9
boy 7.8
black 7.8
half length 7.8
school 7.8
white 7.8
monk 7.8
colleagues 7.8
worship 7.7
student 7.7
faith 7.7
two 7.6
temple 7.6
monument 7.5
historic 7.3
women 7.1
love 7.1
to 7.1
working 7.1
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.7
people 87.9
outdoor 87
group 85.4
old 48.5
crowd 0.6

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Calm 87.2%
Surprised 7%
Fear 6.1%
Sad 4.6%
Happy 2.1%
Angry 1.8%
Confused 0.8%
Disgusted 0.8%

AWS Rekognition

Age 4-12
Gender Female, 100%
Calm 72.3%
Sad 40.5%
Surprised 6.4%
Fear 6%
Confused 2.6%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 40-48
Gender Male, 95.4%
Calm 79.4%
Happy 17.9%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Calm 94%
Surprised 7.1%
Fear 6%
Sad 3%
Angry 0.7%
Disgusted 0.5%
Confused 0.3%
Happy 0.1%

AWS Rekognition

Age 13-21
Gender Male, 85.3%
Calm 81.9%
Sad 10.9%
Fear 6.7%
Surprised 6.5%
Happy 2.3%
Angry 0.6%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 25-35
Gender Female, 53%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Disgusted 0%
Happy 0%
Confused 0%

Microsoft Cognitive Services

Age 14
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Bride 96.6%
Female 96.6%
Woman 96.6%
Baby 94.5%
Jeans 72.1%
Hat 60.8%

Categories