Human Generated Data

Title

Untitled (relief station, Urbana, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.97

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Adult 99.7
Female 99.7
Person 99.7
Woman 99.7
Adult 99.4
Female 99.4
Person 99.4
Woman 99.4
Adult 98.6
Person 98.6
Male 98.6
Man 98.6
Adult 98.3
Person 98.3
Male 98.3
Man 98.3
Adult 98.1
Person 98.1
Male 98.1
Man 98.1
Adult 98.1
Person 98.1
Male 98.1
Man 98.1
Adult 98
Female 98
Person 98
Woman 98
Adult 97.9
Female 97.9
Person 97.9
Woman 97.9
Person 97
Adult 96.7
Person 96.7
Male 96.7
Man 96.7
Person 95.3
Person 92.7
Person 92
Footwear 89.7
Shoe 89.7
Person 89.2
Hat 83.1
Shoe 82.8
Hat 82.8
Face 82.4
Head 82.4
Lady 81.4
Coat 81.2
Machine 79.7
Wheel 79.7
Dress 79.1
Shoe 78.7
People 71
Shoe 67.1
Hat 66.5
Hat 66.4
Hat 64.2
Sun Hat 64.1
Person 63.7
Shoe 63.2
Shoe 61.9
Shoe 61.9
Hat 61.5
Shoe 58.3
Boarding 57.6
Bus Stop 56.2
Outdoors 56.2
Architecture 55.9
Building 55.9
Hospital 55.9
Accessories 55.7
Bag 55.7
Handbag 55.7
Bonnet 55.7
Hat 55.2

Clarifai
created on 2018-05-11

people 100
group 99.7
many 99.1
adult 98.9
group together 98.2
several 97
woman 96.8
wear 96.6
man 95.8
administration 95.4
child 93.5
leader 92.6
war 92.1
veil 91.8
military 89.8
boy 83.2
five 82.8
vehicle 81.5
outfit 81.4
recreation 80.7

Imagga
created on 2023-10-07

uniform 68.8
military uniform 68.5
clothing 50.5
consumer goods 30.3
covering 30
man 26.2
people 25.1
person 21.8
military 17.4
male 17.1
old 16.7
adult 16.3
soldier 15.6
war 14.4
commodity 13.7
senior 13.1
religious 12.2
tradition 12
weapon 11.7
kin 11.6
religion 11.6
kimono 11.4
travel 11.3
hat 11.1
protection 10.9
army 10.7
tourism 10.7
statue 10.5
couple 10.4
outdoors 10.4
culture 10.2
danger 10
leisure 10
gun 9.9
family 9.8
portrait 9.7
men 9.4
two 9.3
garment 9.3
traditional 9.1
city 9.1
robe 9.1
engineer 9
equipment 9
to 8.8
camouflage 8.8
together 8.8
elderly 8.6
private 8.3
historic 8.2
activity 8.1
history 8
home 8
ancient 7.8
mask 7.7
faith 7.6
happy 7.5
nurse 7.5
mature 7.4
sport 7.4
vacation 7.4
guy 7.3
lady 7.3
group 7.2
aged 7.2
recreation 7.2
holiday 7.2
women 7.1
worker 7.1
day 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 100
group 91.1
standing 88.4
people 63.1
dressed 42.5

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 89.5%
Sad 99.7%
Calm 21.4%
Surprised 8.5%
Fear 6.7%
Happy 2.4%
Confused 1.7%
Angry 0.7%
Disgusted 0.5%

AWS Rekognition

Age 58-66
Gender Female, 89.5%
Calm 99.2%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 53-61
Gender Male, 98.9%
Happy 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0%
Calm 0%
Disgusted 0%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 79.3%
Sad 25.9%
Surprised 6.3%
Fear 5.9%
Confused 0.5%
Disgusted 0.2%
Happy 0.2%
Angry 0.1%

AWS Rekognition

Age 41-49
Gender Male, 99.4%
Calm 92.7%
Surprised 6.3%
Fear 5.9%
Happy 4.4%
Sad 2.8%
Angry 0.4%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 4.8%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%

Microsoft Cognitive Services

Age 71
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.7%
Female 99.7%
Person 99.7%
Woman 99.7%
Male 98.6%
Man 98.6%
Shoe 89.7%
Hat 83.1%
Wheel 79.7%