Human Generated Data

Title

Untitled (relief station, Urbana, Ohio)

Date

August 1938, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3339

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Sun Hat 100
Adult 99.4
Female 99.4
Person 99.4
Woman 99.4
Adult 99.4
Person 99.4
Male 99.4
Man 99.4
Person 98.5
Person 98.3
Baby 98.3
Face 95.3
Head 95.3
Hat 92.7
Photography 88.5
Portrait 88.5
Footwear 84
Shoe 84
Hat 82.4
Shoe 73.5
Bonnet 56
Cowboy Hat 55.7
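Lists like the Amazon one above pair each label with a confidence score out of 100. A minimal sketch of how such (label, confidence) pairs might be filtered in post-processing, using a subset of the values shown; the 80.0 cutoff is an arbitrary illustrative assumption, not part of this record:

```python
# Sketch: filter machine-generated (label, confidence) pairs by a
# minimum confidence, as one might when cleaning tag lists like the
# Amazon output above. The 80.0 threshold is an assumption.
tags = [
    ("Clothing", 100.0), ("Sun Hat", 100.0), ("Adult", 99.4),
    ("Baby", 98.3), ("Hat", 92.7), ("Photography", 88.5),
    ("Shoe", 84.0), ("Bonnet", 56.0), ("Cowboy Hat", 55.7),
]

def confident_tags(pairs, min_confidence=80.0):
    """Return tag names whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= min_confidence]

print(confident_tags(tags))
```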

Clarifai
created on 2018-05-10

people 100
child 99.7
two 99.1
sit 98.2
three 98
lid 98
group 98
boy 96.8
adult 96.4
veil 96.1
woman 96
man 95.9
wear 95.4
four 95.4
recreation 94.9
group together 93.5
five 91.8
portrait 91.4
elderly 90.3
facial expression 89.8

Imagga
created on 2023-10-06

statue 34.3
sculpture 26.9
architecture 20.3
monument 19.6
man 19.5
child 18.8
stone 16.9
male 16.6
old 16
person 15.7
history 15.2
people 15.1
ancient 14.7
building 14.4
art 14.4
city 14.1
culture 13.7
landmark 13.5
religion 13.4
mother 12.7
travel 12.7
love 12.6
tourism 12.4
parent 12.3
portrait 11.6
marble 11.6
monk 11.4
face 11.4
religious 11.2
outdoors 10.6
world 10.6
detail 10.5
historical 10.4
historic 10.1
adult 9.9
antique 9.5
black 9
sepia 8.7
decoration 8.7
god 8.6
fountain 8.6
two 8.5
outdoor 8.4
famous 8.4
memorial 8.4
juvenile 8.3
sky 8.3
father 8.2
tourist 7.7
palace 7.7
dad 7.5
kin 7.4
lifestyle 7.2
smile 7.1
family 7.1
life 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.7
outdoor 91.9
old 83.1

Face analysis

AWS Rekognition

Age 16-22
Gender Male, 99.5%
Calm 98%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Confused 1.2%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 49-57
Gender Male, 98.1%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.3%
Confused 0.3%
Angry 0.2%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 1-7
Gender Male, 95.4%
Calm 91.3%
Sad 6.4%
Surprised 6.3%
Fear 5.9%
Disgusted 0.3%
Angry 0.1%
Happy 0.1%
Confused 0.1%

Microsoft Cognitive Services

Age 22
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.4%
Female 99.4%
Person 99.4%
Woman 99.4%
Male 99.4%
Man 99.4%
Baby 98.3%
Hat 92.7%
Shoe 84%

Captions

Microsoft
created on 2018-05-10

a vintage photo of a person 94.4%
an old photo of a person 94.3%
old photo of a person 93.5%