Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.409

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Fence 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Cap 99
Clothing 99
Hat 99
Baseball Cap 98.7
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Outdoors 97.6
Male 97.2
Person 97.2
Boy 97.2
Child 97.2
Nature 95
Yard 95
Face 93.5
Head 93.5
Picket 82.9
Car 82.2
Transportation 82.2
Vehicle 82.2
Machine 80
Wheel 80
Person 78.7
Wheel 73.1
Photography 57.5
Portrait 57.5
People 55.4
Body Part 55.3
Finger 55.3
Hand 55.3
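
The Amazon tags above pair a detected label with a confidence score. A minimal sketch of how such label/confidence pairs can be requested from the Amazon Rekognition API, assuming boto3 is installed, AWS credentials are configured, and a local copy of the photograph is saved under the hypothetical file name county_fair.jpg:

    import boto3

    # Create a Rekognition client (the region is an assumption; any supported region works).
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the image bytes from a local file (hypothetical file name).
    with open("county_fair.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels above a 55% confidence floor, roughly matching the lowest
    # scores shown in the tag list above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55.0,
    )

    # Print "Label Confidence" pairs, e.g. "Fence 100.0", "Person 99.3".
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))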

Clarifai
created on 2018-05-11

people 100
adult 99.2
group 98
group together 97.8
man 96.8
administration 93.9
woman 93.9
several 93.2
many 91.2
leader 87.8
war 84.8
four 84.2
three 84
military 84
portrait 83.8
wear 81.3
child 81.2
two 81
street 78.3
facial expression 78.1

Imagga
created on 2023-10-06

spectator 30.1
marimba 29.5
percussion instrument 28.1
musical instrument 25.2
people 22.9
man 22.2
male 15.8
city 15.8
men 13.7
dad 13.4
love 13.4
urban 13.1
couple 13.1
outdoors 12.7
adult 12.3
park 11.9
father 11.7
house 11.7
outdoor 10.7
happy 10.6
together 10.5
child 10.2
silhouette 9.9
old 9.7
person 9.7
portrait 9.7
parent 9.7
street 9.2
travel 9.1
holding 9.1
black 9
fun 9
home 8.8
women 8.7
smiling 8.7
day 8.6
architecture 8.6
friends 8.4
pretty 8.4
life 8
lifestyle 7.9
world 7.9
glass 7.9
youth 7.7
two 7.6
friendship 7.5
building 7.5
leisure 7.5
vacation 7.4
handsome 7.1
family 7.1
interior 7.1
summer 7.1
sky 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.5
person 97.1
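
The Microsoft tags above are image tags with confidence scores of the kind returned by the Azure Computer Vision image-tagging endpoint. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package, a placeholder endpoint and key, and the hypothetical file name county_fair.jpg (the SDK reports confidence on a 0-1 scale, so it is scaled to a percentage here):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Endpoint and key are placeholders; real values come from an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    # Tag a locally stored copy of the photograph (hypothetical file name).
    with open("county_fair.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Print "tag confidence" pairs, e.g. "outdoor 99.5", "person 97.1".
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))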

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99%
Calm 96.9%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Angry 1.1%
Confused 1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 25-35
Gender Female, 93.2%
Calm 99.1%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-33
Gender Male, 84.4%
Sad 90.1%
Calm 55.1%
Surprised 6.3%
Fear 6%
Confused 0.8%
Happy 0.6%
Disgusted 0.4%
Angry 0.4%

AWS Rekognition

Age 24-34
Gender Male, 91.1%
Sad 87.9%
Calm 52%
Surprised 6.7%
Fear 6.5%
Confused 2.9%
Disgusted 0.8%
Angry 0.6%
Happy 0.6%
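
The age range, gender, and per-emotion confidences in the AWS Rekognition entries above are the kind of per-face attributes Rekognition's face detection returns. A minimal sketch, again assuming boto3, configured AWS credentials, and the hypothetical file name county_fair.jpg:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("county_fair.jpg", "rb") as f:
        image_bytes = f.read()

    # Request the full attribute set so age range, gender, and emotions are included.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 36, "High": 44}
        gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 99.0}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions carry independent confidences; sort them from high to low for display.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")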

Microsoft Cognitive Services

Age 56
Gender Male

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Boy 97.2%
Child 97.2%
Car 82.2%
Wheel 80%