Human Generated Data

Title

Untitled (relief station, Urbana, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.93

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence %)

Amazon
created on 2021-12-15

Clothing 99.8
Apparel 99.8
Hat 99.8
Person 99.6
Human 99.6
Person 99.5
Person 99
Person 95.8
Glasses 69.4
Accessories 69.4
Accessory 69.4
Headband 67.3
Sun Hat 60
Turban 59.6
Icing 59.6
Food 59.6
Dessert 59.6
Cake 59.6
Cream 59.6
Creme 59.6
Helmet 58.5
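
These label/confidence pairs have the shape of Amazon Rekognition's DetectLabels output. Purely as an illustration (not necessarily the pipeline actually used), a minimal boto3 sketch could produce a comparable list; the file name, region, and MinConfidence cutoff are placeholder assumptions:

```python
import boto3

# Placeholder region and credentials; the file name is illustrative only.
client = boto3.client("rekognition", region_name="us-east-1")
with open("shahn_urbana_1938.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# the same shape as the "Clothing 99.8", "Hat 99.8", ... pairs above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; the list above bottoms out near 58
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```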

Clarifai
created on 2023-10-15

people 99.8
group 98.9
man 98.2
woman 97.8
adult 97.8
lid 97.3
portrait 97.3
group together 95.4
veil 94.9
three 94.3
leader 90
family 88.6
four 86.6
two 86.5
wear 83.5
child 83.2
retro 81.6
elderly 81.4
recreation 78.4
religion 77.5
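
The Clarifai concepts have the same tag/score shape, with scores reported natively in 0–1. A rough REST sketch against Clarifai's public general-image-recognition model; the personal access token and image URL are placeholders, and the exact model path may differ across Clarifai API versions:

```python
import requests

# Placeholder token and image URL; the model path follows Clarifai's
# public "general-image-recognition" model and may vary by API version.
resp = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main"
    "/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Native scores are 0-1; x100 matches the "people 99.8", ... figures above.
    print(concept["name"], round(concept["value"] * 100, 1))
```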

Imagga
created on 2021-12-15

person 43.1
man 37.6
senior 31.9
people 31.8
male 27.7
white 27
nurse 24.3
happy 21.3
elderly 20.1
adult 20.1
couple 19.2
old 17.4
groom 16.2
portrait 15.5
retired 15.5
home 15.2
smiling 14.5
retirement 14.4
looking 14.4
love 14.2
mature 13.9
professional 13.7
husband 13.5
medical 13.2
health 13.2
sitting 12.9
smile 12.8
hospital 12.6
happiness 12.5
bride 12.5
together 12.3
doctor 12.2
men 12
women 11.9
worker 11.7
lifestyle 11.6
working 11.5
room 11.5
wife 11.4
patient 10.8
surgeon 10.7
care 10.7
married 10.5
marriage 10.4
uniform 10.4
clothing 10.3
work 10.2
wedding 10.1
face 9.9
family 9.8
business 9.7
medicine 9.7
hat 9.5
glasses 9.3
camera 9.2
attractive 9.1
cap 8.9
life 8.6
illness 8.6
two 8.5
suit 8.1
computer 8
office 8
handsome 8
70s 7.9
surgery 7.8
teacher 7.7
emergency 7.7
casual 7.6
reading 7.6
enjoying 7.6
meeting 7.5
outdoors 7.5
occupation 7.3
book 7.3
indoor 7.3
aged 7.2
hair 7.1
indoors 7
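
Imagga's tagging endpoint yields the same kind of list. A minimal sketch against its documented v2 REST API; the key, secret, and image URL are placeholders:

```python
import requests

# Placeholder credentials and image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
# Each entry pairs an English tag with a 0-100 confidence,
# the same shape as the "person 43.1", "man 37.6", ... list above.
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```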

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 98.4
human face 97.7
clothing 93.2
text 92.6
smile 86
window 84.4
fashion accessory 76.5
people 75.8
woman 63.4
man 60.3
hat 54.3
black and white 51.5
crowd 0.6
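
The Microsoft tags resemble output from Azure Computer Vision's Analyze Image operation, which reports confidence in 0–1. A rough sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

```python
import requests

# Placeholder endpoint, key, and image URL.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
# Native scores are 0-1; x100 lines up with "person 98.4", ... above.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```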

Color Analysis

Face analysis

AWS Rekognition

Age 48-66
Gender Female, 98.4%
Calm 50.9%
Happy 33.4%
Confused 9%
Sad 3.9%
Surprised 1.1%
Disgusted 0.8%
Angry 0.6%
Fear 0.3%
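
The age range, gender call, and emotion scores above follow the shape of Rekognition's DetectFaces response when all attributes are requested. A minimal boto3 sketch; the file name is a placeholder:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("shahn_urbana_1938.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 48, "High": 66}
    gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 98.4}
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'{gender["Value"]} {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # Calm, Happy, Confused, ... (0-100)
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```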

Microsoft Cognitive Services

Age 67
Gender Female
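
These two figures match the age and gender attributes once returned by Azure Face API's Detect operation (Microsoft has since restricted those attributes). A rough sketch of the v1.0 REST call as it worked around the time of this record; endpoint, key, and image URL are placeholders:

```python
import requests

# Placeholder endpoint, key, and image URL.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
for face in resp.json():
    attrs = face["faceAttributes"]  # e.g. {"age": 67.0, "gender": "female"}
    print(f'Age {attrs["age"]:.0f}, Gender {attrs["gender"].title()}')
```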

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
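
The Very unlikely ... Very likely ratings are Google Cloud Vision's face-detection likelihood buckets. A short sketch with the official Python client; the file name is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("shahn_urbana_1938.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # matching the "Joy Unlikely", "Headwear Very likely" ratings above.
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)
```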

Feature analysis

Amazon

Hat 99.8%
Person 99.6%
Glasses 69.4%
Helmet 58.5%

Categories

Imagga

paintings art 98.2%
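
"paintings art" is a category label rather than a tag, matching the shape of Imagga's categorization endpoint. A hedged sketch; the "personal_photos" categorizer is an assumption on my part (it is one of Imagga's public categorizers), as are the credentials and image URL:

```python
import requests

# Placeholder credentials and image URL; "personal_photos" is an
# assumed choice of Imagga categorizer for this record.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
# Each entry pairs a category name with a 0-100 confidence,
# the same shape as the "paintings art 98.2%" line above.
for cat in resp.json()["result"]["categories"]:
    print(cat["name"]["en"], round(cat["confidence"], 1))
```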