Human Generated Data

Title

Untitled (horse races, Lancaster, Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.675

Copyright

© President and Fellows of Harvard College
Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 100
Clothing 99.8
Hat 99.8
Person 99.1
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 98
Adult 98
Male 98
Man 98
Person 97.7
Person 97.3
Adult 97.3
Male 97.3
Man 97.3
Person 97.2
Person 97
Person 96.9
Person 96.7
Adult 96.7
Male 96.7
Man 96.7
Crowd 96.5
Person 96.3
Person 95.1
Adult 95.1
Male 95.1
Man 95.1
Person 94.2
Person 93.6
Person 91.6
Person 91.6
Person 91.3
Person 89.6
Outdoors 89.1
Person 83.9
Person 82.2
Nature 79.3
Person 76
Person 75.1
Person 71
Person 68
Person 67.1
Person 65.9
Shorts 62.8
Head 62.3
Field 57.8
Grass 57.5
Plant 57.5
Cap 57.5
Huddle 57.4
T-Shirt 56.7
Airport 56.5
Architecture 56.5
Building 56.5
Monastery 56.5
Back 56.3
Body Part 56.3
Garden 56.1
Baseball Cap 55.7
Countryside 55.7
Photography 55

Clarifai
created on 2018-05-11

people 99.9
group together 99.6
many 99.4
group 98.9
adult 96
man 96
administration 93.1
leader 92.1
crowd 92.1
child 86.7
woman 86.5
spectator 86.2
several 85.8
military 85
war 83.3
boy 80
recreation 78
chair 77.4
wear 75.5
school 74.2

Imagga
created on 2023-10-06

spectator 40.9
pedestrian 25.5
uniform 21.9
landscape 20.1
military uniform 18.6
outdoor 18.3
sky 17.8
summer 17.3
travel 16.9
grass 16.6
man 16.1
people 15.6
sport 14.3
male 14.2
city 14.1
park 14
outdoors 13.6
architecture 13.3
clothing 12.7
day 12.5
field 12.5
old 12.5
person 12.5
clouds 11.8
world 11.1
tourism 10.7
trees 10.7
military 10.6
horse 10.4
walking 10.4
monument 10.3
spring 10.2
countryside 10
history 9.8
adult 9.8
war 9.7
country 9.7
sunny 9.5
outside 9.4
cloudy 9.4
natural 9.4
tree 9.2
street 9.2
building 8.9
tourist 8.9
covering 8.8
together 8.8
couple 8.7
active 8.7
men 8.6
leisure 8.3
mountain 8
home 8
consumer goods 7.9
season 7.8
gun 7.7
house 7.5
vacation 7.4
historic 7.3
road 7.2
weapon 7.2
landmark 7.2
rural 7
passenger 7

Google
created on 2018-05-11

photograph 95.2
black and white 86.7
vehicle 72.4
monochrome photography 70.3
history 65.8
tree 64.9
monochrome 64.7
troop 61.7
recreation 56.1
car 55.7
crowd 55.6
crew 53.4
team 51.4

Microsoft
created on 2018-05-11

outdoor 99.2
person 99.1
tree 98.9
sky 98.3
standing 84.9
group 82.9
people 66.1
posing 60
crowd 3.5

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Calm 48.5%
Angry 29.2%
Fear 9.9%
Surprised 8.4%
Confused 5.2%
Sad 2.9%
Disgusted 2.6%
Happy 0.3%

AWS Rekognition

Age 43-51
Gender Male, 99.6%
Calm 71.3%
Sad 35.1%
Surprised 6.8%
Fear 6.1%
Confused 3%
Happy 0.7%
Disgusted 0.5%
Angry 0.4%

AWS Rekognition

Age 30-40
Gender Male, 63.9%
Surprised 98.3%
Fear 9%
Happy 7.8%
Sad 3.6%
Angry 2.1%
Disgusted 0.5%
Calm 0.4%
Confused 0.3%

AWS Rekognition

Age 42-50
Gender Male, 97.3%
Calm 69.4%
Confused 11.6%
Surprised 8%
Fear 6.3%
Disgusted 5.9%
Sad 4.2%
Happy 3.3%
Angry 0.8%

AWS Rekognition

Age 6-16
Gender Female, 94.4%
Calm 64.6%
Fear 15%
Sad 14.4%
Surprised 6.6%
Confused 1.9%
Happy 1.2%
Disgusted 1%
Angry 1%

Microsoft Cognitive Services

Age 36
Gender Male

Feature analysis

Amazon

Person 99.1%
Adult 98.2%
Male 98.2%
Man 98.2%