Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.795

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Architecture 99.9
Building 99.9
Outdoors 99.9
Shelter 99.9
Neighborhood 99.7
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99
Male 99
Man 99
Person 99
Person 97.5
Clothing 96.7
Hat 96.7
Nature 95.5
People 91.9
Coat 85
Yard 75.7
Housing 73.9
Backyard 72.1
Countryside 58
House 57.6
Grass 57.2
Plant 57.2
Garden 56.9
Gardener 56.9
Gardening 56.9
Barrel 56.9
Rain Barrel 56.9
Face 56.1
Head 56.1
Rural 55.5
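
The Amazon tags above are label names paired with confidence scores (0-100), the kind of output produced by AWS Rekognition's DetectLabels operation. The sketch below shows how comparable labels could be requested with boto3; it is a minimal example under assumed conditions (a local image file with a hypothetical name and configured AWS credentials), not a description of the pipeline actually used for this record.

```python
import boto3

# Minimal sketch: label detection with AWS Rekognition (DetectLabels) via boto3.
# Assumes AWS credentials are already configured; the filename is hypothetical.
client = boto3.client("rekognition")

with open("shahn_buchwalter_auction.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest confidence shown in the list above
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```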

Clarifai
created on 2018-05-11

people 100
group together 99.5
adult 99.3
group 99.3
man 96.8
home 95.6
several 95.2
administration 94.9
many 94.8
leader 94.4
military 93
soldier 91.7
child 91
four 90.9
three 90.1
war 89.8
five 89
education 88.8
uniform 87.9
two 87.5

Imagga
created on 2023-10-07

uniform 51.6
military uniform 48.6
clothing 34.7
man 27.5
percussion instrument 22.4
people 21.7
covering 19.8
consumer goods 19.6
military 18.3
musical instrument 18.1
steel drum 18.1
male 17.7
war 17.3
person 16.5
street 15.6
adult 13.8
soldier 13.7
ashcan 12.8
weapon 12.4
outdoor 12.2
protection 11.8
danger 11.8
city 11.6
home 11.2
old 11.1
gun 11
travel 10.6
outdoors 10.4
men 10.3
bin 10.2
mask 10.1
helmet 10
world 9.9
army 9.7
commodity 9.7
spectator 9.6
container 9.3
industrial 9.1
portrait 9.1
sport 9
accident 8.8
disaster 8.8
urban 8.7
pedestrian 8.6
active 8.4
house 8.3
activity 8.1
to 8
together 7.9
camouflage 7.8
destruction 7.8
building 7.7
walk 7.6
fire 7.5
historic 7.3
dirty 7.2
transportation 7.2
women 7.1
family 7.1
architecture 7
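
Imagga's list likewise pairs English tag names with confidence scores, as returned by its REST tagging endpoint. A minimal sketch is below, assuming a publicly reachable image URL and placeholder API credentials (all hypothetical); the actual request used to generate these tags is not documented here.

```python
import requests

# Minimal sketch: image tagging via Imagga's v2 /tags REST endpoint.
# The credentials and image URL below are placeholders, not real values.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn_buchwalter_auction.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```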

Google
created on 2018-05-11

photograph 94.9
black and white 85.6
standing 79.1
car 77.7
house 74.8
vehicle 74.3
monochrome photography 69.5
monochrome 63.9
tree 61.2
vintage clothing 58.3
street 57.5
history 55.5
staff 52.3
troop 50.2
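
The Google tags are label descriptions with scores, of the sort returned by Google Cloud Vision label detection. A minimal sketch using the google-cloud-vision client library (version 2.x or later) follows, assuming application default credentials and a hypothetical local filename; note that the API reports scores in the 0-1 range, so they are scaled to match the percentages listed above.

```python
from google.cloud import vision

# Minimal sketch: label detection with the Google Cloud Vision client library.
# Assumes application default credentials; the filename is hypothetical.
client = vision.ImageAnnotatorClient()

with open("shahn_buchwalter_auction.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are reported in 0-1; scale to percentages for comparison.
    print(f"{label.description} {label.score * 100:.1f}")
```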

Microsoft
created on 2018-05-11

building 100
outdoor 99.9
person 96.7
standing 80.5
group 77.7
people 59.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 56-64
Gender Male, 94%
Calm 76.8%
Happy 9.9%
Surprised 9%
Fear 6.3%
Sad 3.1%
Angry 2.4%
Disgusted 1.5%
Confused 1%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 97.7%
Surprised 6.5%
Fear 6%
Sad 2.3%
Confused 0.9%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 27-37
Gender Female, 90.9%
Calm 77.9%
Sad 9.2%
Surprised 8.4%
Fear 6.1%
Angry 2.5%
Happy 2.3%
Disgusted 1.5%
Confused 0.7%

AWS Rekognition

Age 35-43
Gender Female, 66.8%
Sad 78.6%
Calm 53%
Surprised 8.5%
Fear 6%
Angry 4.2%
Confused 1.8%
Disgusted 0.9%
Happy 0.3%
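
Each block above reports an estimated age range, a gender estimate with confidence, and emotion scores, matching the shape of AWS Rekognition's DetectFaces output. The sketch below shows how such per-face attributes could be retrieved with boto3, again under assumed conditions (hypothetical filename, configured credentials).

```python
import boto3

# Minimal sketch: face analysis with AWS Rekognition DetectFaces via boto3.
# Assumes AWS credentials are configured; the filename is hypothetical.
client = boto3.client("rekognition")

with open("shahn_buchwalter_auction.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```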

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%

Categories