Human Generated Data

Title

Untitled (Fourth of July Carnival and Fishfry, Ashville, Ohio)

Date

July 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.704

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
Person 97.6
Person 97.6
Back 96.9
Body Part 96.9
Clothing 96.7
Stage 95.6
Person 94.8
Person 94.3
Male 94.2
Person 94.2
Boy 94.2
Child 94.2
Electrical Device 90.8
Microphone 90.8
Person 90.2
Person 88.8
Adult 88.2
Person 88.2
Female 88.2
Woman 88.2
Outdoors 85.9
People 82.9
Person 77
Person 76.7
Person 73.2
Head 71.5
Crowd 70.1
Face 69.8
Hat 68.1
Person 61.3
Baby 61.3
Architecture 58
Building 58
Shelter 58
Person 57.1
Shorts 55.8
Tent 55.8
Circus 55.7
Leisure Activities 55.7
Countryside 55.2
Hut 55.2
Nature 55.2
Rural 55.2
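
The label/confidence pairs above are the kind of output produced by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be generated with boto3; the file name, region, and thresholds are illustrative assumptions, not part of the museum record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; the file name is an assumption.
with open("shahn_ashville_1938.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on how many labels come back
    MinConfidence=55.0,  # roughly the floor of the scores listed above
)

# Print "Label confidence" pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")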

Clarifai
created on 2018-05-11

people 100
group 99.5
group together 99.1
many 99
adult 99
several 97
military 96.8
vehicle 96.6
war 96
administration 95.3
man 94.5
leader 92.6
tent 91
soldier 88.6
woman 86.1
wear 85
child 83.9
transportation system 82
watercraft 81.4
four 81.3

Imagga
created on 2023-10-06

stage 91.6
platform 72.6
sky 18.5
building 17.4
industry 17.1
city 16.6
architecture 16.4
old 14.6
industrial 12.7
house 12.5
smoke 12.1
construction 12
destruction 11.7
power 10.9
danger 10.9
transportation 10.8
travel 10.6
urban 10.5
truck 10.4
bridge 10.4
structure 10.1
transport 10
steel 9.7
factory 9.6
water 9.3
dirty 9
tower 8.9
landscape 8.9
night 8.9
vehicle 8.7
light 8.7
roof 8.6
street 8.3
landmark 8.1
black 7.8
disaster 7.8
track 7.8
steam 7.8
military 7.7
war 7.7
garbage truck 7.6
vintage 7.4
historic 7.3
protection 7.3
people 7.2
drum 7.2
road 7.2
history 7.2
river 7.1
working 7.1
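
Tags like the Imagga list above could be requested over HTTP from Imagga's v2 tagging endpoint. The sketch below assumes that endpoint, placeholder credentials, a hypothetical image URL, and the usual shape of the JSON response; none of these come from the museum record.

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_API_SECRET"
# Hypothetical URL of the photograph; not part of the record.
image_url = "https://example.org/shahn_ashville_1938.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",   # assumed v2 tagging endpoint
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),   # HTTP Basic authentication
    timeout=30,
)
resp.raise_for_status()

# Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")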

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.2
person 95.9
old 67.6
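
Because each service tags the same photograph independently, one simple way to compare their output is to normalize tag names and keep the highest confidence reported for each. A minimal sketch, assuming the lists above have already been parsed into (tag, confidence) pairs; only a small hand-copied subset of the tags is shown.

from collections import defaultdict

# A small illustrative subset of the tags listed above, parsed by hand.
service_tags = {
    "amazon":    [("Person", 98.5), ("Stage", 95.6), ("Tent", 55.8)],
    "clarifai":  [("people", 100.0), ("adult", 99.0), ("tent", 91.0)],
    "imagga":    [("stage", 91.6), ("platform", 72.6), ("people", 7.2)],
    "microsoft": [("outdoor", 99.2), ("person", 95.9), ("old", 67.6)],
}

best = defaultdict(float)      # highest confidence seen per normalized tag
sources = defaultdict(set)     # which services reported the tag
for service, tags in service_tags.items():
    for name, confidence in tags:
        key = name.strip().lower()
        best[key] = max(best[key], confidence)
        sources[key].add(service)

for key in sorted(best, key=best.get, reverse=True):
    print(f"{key:10s} {best[key]:5.1f}  ({', '.join(sorted(sources[key]))})")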

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 88.4%
Sad 87.2%
Calm 26.6%
Surprised 10.4%
Fear 8.7%
Happy 8.2%
Confused 5.2%
Disgusted 4%
Angry 3.1%
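
The age range, gender, and emotion scores above correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch follows, reusing the same hypothetical local image file as in the earlier label sketch.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_ashville_1938.jpg", "rb") as f:   # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, and emotion estimates.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; list them from most to least confident.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")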

Feature analysis

Amazon

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%
Boy 94.2%
Child 94.2%
Female 88.2%
Woman 88.2%
Hat 68.1%
Baby 61.3%
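
These feature percentages mirror Rekognition labels that carry located object instances (bounding boxes). A minimal sketch of listing those instances from a DetectLabels response, under the same illustrative file-name and threshold assumptions as before.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_ashville_1938.jpg", "rb") as f:   # hypothetical local copy
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55.0)

# Only labels with Instances have bounding boxes; coordinates are ratios
# of the image width and height.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {label['Confidence']:.1f}%  "
              f"left={box['Left']:.2f} top={box['Top']:.2f} "
              f"width={box['Width']:.2f} height={box['Height']:.2f}")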

Text analysis

Amazon

SHOW
FREE SHOW
FREE
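
The detected strings ("FREE SHOW", "FREE", "SHOW") match the kind of output returned by AWS Rekognition's DetectText operation, which reports both whole lines and the individual words inside them. A minimal sketch, again assuming a hypothetical local copy of the photograph.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_ashville_1938.jpg", "rb") as f:   # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns LINE and WORD detections; a painted sign reading
# "FREE SHOW" would appear once as a line and twice as separate words.
for detection in response["TextDetections"]:
    print(f"{detection['DetectedText']}  ({detection['Type']}, "
          f"{detection['Confidence']:.1f}%)")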