Human Generated Data

Title

Untitled (relief station, Urbana, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1822

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Sun Hat 100
Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 98.6
Child 98.6
Female 98.6
Girl 98.6
Person 98.2
Baby 98.2
Footwear 89.7
Shoe 89.7
Hat 88.9
Shoe 87.1
Hat 86
Reading 57.2
Coat 56.5
Bonnet 56
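
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. The sketch below is illustrative only, assuming AWS credentials are configured and a local copy of the photograph; the file name and thresholds are placeholders, not values taken from this record.

import boto3

# Sketch only: "relief_station.jpg" and the thresholds are illustrative
# assumptions, not part of this record.
client = boto3.client("rekognition")

with open("relief_station.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

# Print label/confidence pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")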

Clarifai
created on 2018-05-11

people 100
child 99.5
two 98.7
lid 97.6
group 97.6
adult 97.3
man 97.2
three 97
woman 96.9
boy 96.1
wear 95.2
sit 95
veil 94.7
recreation 94.1
four 94
group together 91.8
son 91.4
sibling 90.8
offspring 90.7
portrait 89.2

Imagga
created on 2023-10-05

child 36.1
man 30.9
people 26.2
male 23.3
outdoors 21.1
person 21
sport 21
outdoor 17.6
parent 16.8
sky 16.6
adult 16.3
father 15.8
happy 15
dad 14.5
lifestyle 14.5
playing 13.7
summer 13.5
fun 12.7
two 12.7
love 12.6
active 12.6
day 12.6
happiness 12.5
leisure 12.5
joy 11.7
portrait 11.6
boy 11.3
outside 11.1
mother 11
holding 10.7
park 10.7
juvenile 10.6
grass 10.3
smiling 10.1
beach 10.1
athlete 9.9
vacation 9.8
family 9.8
couple 9.6
action 9.5
smile 9.3
field 9.2
freedom 9.1
hand 9.1
hat 9.1
exercise 9.1
together 8.8
water 8.7
world 8.6
clouds 8.5
black 8.4
attractive 8.4
sports 8.3
danger 8.2
spring 7.8
standing 7.8
play 7.8
travel 7.7
casual 7.6
equipment 7.6
relationship 7.5
oriental 7.4
dress 7.2
activity 7.2
holiday 7.2
women 7.1
son 7.1
clothing 7.1
work 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.6
outdoor 95.1
old 65.4

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 99.2%
Calm 93.4%
Surprised 6.3%
Fear 5.9%
Confused 4.3%
Sad 2.8%
Angry 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 49-57
Gender Male, 71.6%
Sad 100%
Angry 9.7%
Surprised 6.3%
Fear 6%
Calm 3.3%
Confused 1.7%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 1-7
Gender Male, 98.4%
Sad 98.1%
Calm 44.9%
Surprised 6.3%
Fear 5.9%
Disgusted 0.4%
Angry 0.1%
Happy 0%
Confused 0%
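
The age range, gender, and emotion percentages above resemble the per-face details returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming configured AWS credentials and a local copy of the photograph; the file name is a placeholder.

import boto3

# Sketch only: "relief_station.jpg" is an illustrative placeholder.
client = boto3.client("rekognition")

with open("relief_station.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types come back uppercase (e.g. "CALM"); sort by confidence
    # so the strongest emotion is listed first, as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")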

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
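
The Surprise, Anger, Sorrow, Joy, Headwear, and Blurred ratings above match the per-face likelihood fields in a Google Cloud Vision face-detection response. A minimal sketch, assuming the google-cloud-vision client library, configured credentials, and a local copy of the photograph; the file name is a placeholder.

from google.cloud import vision

# Sketch only: "relief_station.jpg" is an illustrative placeholder.
client = vision.ImageAnnotatorClient()

with open("relief_station.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihood values are enums 0-5; map them to the wording used above.
likelihood_names = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
)

for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])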

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%
Child 98.6%
Female 98.6%
Girl 98.6%
Baby 98.2%
Shoe 89.7%
Hat 88.9%
