Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.446

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.1
Female 99.1
Person 99.1
Woman 99.1
Machine 99
Wheel 99
Female 98.9
Person 98.9
Child 98.9
Girl 98.9
Adult 98.5
Female 98.5
Person 98.5
Woman 98.5
Person 98.2
Baby 98.2
Person 97.9
Clothing 96.8
Face 96.8
Head 96.8
Person 94.2
Wheel 93.6
Amusement Park 91.4
Fun 90.6
Theme Park 90.6
Person 88.4
Person 77
Photography 73.5
Portrait 73.5
Car 71.3
Transportation 71.3
Vehicle 71.3
Art 68.1
Painting 68.1
Outdoors 63.9
Spoke 57.6
Plant 56
Tree 56
Hat 56
Sun Hat 55.6

Clarifai
created on 2018-05-11

people 99.9
adult 98.8
vehicle 98.7
group 98.6
group together 97.2
two 96.8
transportation system 96.3
man 94.7
three 94.4
wear 93.2
woman 93.1
one 93
child 92.4
many 89
several 88.3
four 87.7
administration 86.5
military 85.9
war 85.8
leader 85.2

Imagga
created on 2023-10-05

vehicle 67.1
military vehicle 42.3
tracked vehicle 40.7
half track 40.3
wheeled vehicle 30.3
machine 28.2
conveyance 22.6
industry 19.6
tank 19.1
industrial 19.1
device 18.2
old 18.1
transportation 17.9
tractor 17.6
bulldozer 17.1
metal 16.9
wheel 16
building 15.2
car 14.5
transport 13.7
construction 13.7
steel 13.7
power 13.4
track 13.3
iron 13.1
machinery 12.7
work 12.6
military 12.6
war 12.5
equipment 11.7
architecture 11.7
heavy 11.4
dirt 10.5
outside 10.3
dirty 9.9
destruction 9.8
outdoors 9.7
factory 9.6
rusty 9.5
engine 9.4
gun 9
machinist 8.9
army 8.8
truck 8.7
train 8.7
tie 8.5
armored vehicle 8.4
city 8.3
part 8.3
steam shovel 7.8
power shovel 7.7
earth 7.7
structure 7.7
outdoor 7.6
stone 7.6
land 7.4
self-propelled vehicle 7.3
danger 7.3
brace 7.1
weapon 7.1
travel 7
sky 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 97

Color Analysis

Face Analysis

AWS Rekognition

Age 6-14
Gender Female, 62.6%
Sad 100%
Confused 12.6%
Surprised 6.4%
Fear 6%
Angry 1.9%
Calm 1.2%
Disgusted 0.5%
Happy 0.1%

AWS Rekognition

Age 57-65
Gender Male, 93.1%
Sad 100%
Calm 13.2%
Surprised 6.4%
Fear 6%
Happy 0.9%
Angry 0.6%
Confused 0.4%
Disgusted 0.3%

AWS Rekognition

Age 20-28
Gender Male, 89.2%
Sad 99.7%
Calm 28.7%
Surprised 7.4%
Fear 6.1%
Confused 0.5%
Disgusted 0.4%
Angry 0.3%
Happy 0.2%

AWS Rekognition

Age 54-64
Gender Male, 99.8%
Angry 45%
Sad 42.7%
Calm 20.7%
Surprised 6.8%
Fear 6.6%
Disgusted 4.3%
Happy 1.7%
Confused 0.8%

AWS Rekognition

Age 23-31
Gender Male, 99.4%
Calm 99.3%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 19-27
Gender Male, 83.3%
Calm 84.7%
Fear 13.1%
Surprised 6.3%
Sad 2.5%
Happy 0.3%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 0-4
Gender Female, 94.3%
Sad 100%
Calm 13.3%
Surprised 6.4%
Fear 6%
Confused 4.1%
Disgusted 0.6%
Angry 0.5%
Happy 0.2%

Microsoft Cognitive Services

Age 64
Gender Male

Microsoft Cognitive Services

Age 77
Gender Female

Microsoft Cognitive Services

Age 7
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.1%
Female 99.1%
Person 99.1%
Woman 99.1%
Wheel 99%
Child 98.9%
Girl 98.9%
Baby 98.2%
Car 71.3%
Hat 56%