Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.693

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 99.6
Head 99.6
Photography 99.6
Portrait 99.6
Person 99.3
Adult 99.3
Female 99.3
Woman 99.3
Person 98.6
Female 98.6
Child 98.6
Girl 98.6
Machine 98.2
Wheel 98.2
Person 97.8
Person 97.2
Person 96.2
Baby 96.2
Person 95.5
Adult 95.5
Female 95.5
Woman 95.5
Amusement Park 90.2
Person 88.2
Plant 83.6
Tree 83.6
Outdoors 81
Wheel 80.3
Person 77.1
Car 76.5
Transportation 76.5
Vehicle 76.5
Fun 68.6
Wheel 61.3
Theme Park 57.9
Animal 57.7
Canine 57.7
Mammal 57.7
Carousel 57.7
Play 57.7
Grass 57.6
Nature 57.6
Park 57.6
Pet 57.5
Spoke 57.1
Clothing 56.8
Hat 56.8
Blouse 56.5
Dress 56.4
Play Area 56
Lady 55.3
Bicycle 55

Clarifai
created on 2018-05-11

people 99.9
vehicle 99.4
adult 98.8
group 98.8
transportation system 98.7
two 97.5
group together 95.6
one 95.2
child 95
man 94.1
woman 94
three 93.3
watercraft 92.7
wear 90.1
war 87.3
many 87.2
train 86.9
furniture 85.8
actress 83.3
four 82.3

Imagga
created on 2023-10-06

old 23
architecture 21.2
sculpture 19
ancient 17.3
statue 17.1
building 16.7
decoration 15.9
city 15.8
history 14.3
structure 13.1
art 12.7
religion 12.5
monument 11.2
culture 11.1
stone 11
historic 11
device 10.6
grunge 10.2
travel 9.9
scene 9.5
wall 9.3
people 8.9
metal 8.8
fountain 8.7
shop 8.7
brick 8.6
vehicle 8.5
famous 8.4
house 8.4
town 8.3
dirty 8.1
landmark 8.1
antique 8
balcony 7.9
temple 7.7
religious 7.5
vintage 7.4
toyshop 7.4
street 7.4
facade 7.2
machine 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 93.1

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 100%
Calm 81.7%
Angry 11.8%
Surprised 6.5%
Fear 6.2%
Sad 3.1%
Disgusted 1.2%
Happy 0.8%
Confused 0.4%

AWS Rekognition

Age 60-70
Gender Male, 75.4%
Sad 99.4%
Calm 32%
Surprised 6.4%
Fear 6.2%
Angry 1.8%
Happy 1.7%
Disgusted 0.8%
Confused 0.5%

AWS Rekognition

Age 27-37
Gender Male, 99.7%
Calm 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Disgusted 0%
Angry 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 14-22
Gender Female, 98.3%
Sad 100%
Confused 12%
Surprised 6.3%
Fear 5.9%
Calm 0.5%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 19-27
Gender Female, 79.8%
Calm 95.2%
Surprised 6.7%
Fear 6.3%
Sad 3%
Angry 0.3%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 1-7
Gender Female, 88.1%
Sad 100%
Surprised 6.3%
Fear 5.9%
Confused 3.9%
Calm 1.9%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%

Microsoft Cognitive Services

Age 64
Gender Male

Microsoft Cognitive Services

Age 62
Gender Female

Microsoft Cognitive Services

Age 5
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Female 99.3%
Woman 99.3%
Child 98.6%
Girl 98.6%
Wheel 98.2%
Baby 96.2%
Car 76.5%

Categories

Imagga

paintings art 99.6%