Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.239

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Person 99.1
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Person 98.2
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 98.1
Person 97.9
Person 97.8
Person 97.3
Person 96.5
Person 95.9
Adult 95.9
Male 95.9
Man 95.9
Outdoors 93.5
Person 91.8
Person 91.5
Car 89.3
Transportation 89.3
Vehicle 89.3
Car 86.8
Nature 86.7
Car 84.6
Person 82.1
Backyard 81.6
Yard 81.6
Wood 77.6
Person 76.3
Car 74.8
Machine 71.5
Wheel 71.5
Fence 70.2
Footwear 66
Shoe 66
Head 62.1
People 62
Car 61.7
Plant 57.5
Tree 57.5
Weapon 56.4
Firearm 56.2
Gun 56.2
Rifle 56.2
Garden 56
Utility Pole 56
Hat 55.4
Sun Hat 55.2

Clarifai
created on 2018-05-11

people 100
group together 99.6
group 99.6
many 99.3
adult 99.2
man 97.3
war 96.6
military 96.5
administration 95.3
several 94.4
woman 92.5
vehicle 92.4
transportation system 90.6
soldier 90.1
wear 90
leader 87.8
watercraft 85.7
five 85.1
three 83
four 81.3

Imagga
created on 2023-10-07

locomotive 31
steam locomotive 21.9
city 21.6
architecture 21.1
old 20.2
factory 20.1
building 18.9
steam 18.4
transportation 17.9
transport 17.4
industry 17.1
travel 16.9
train 16.4
history 16.1
railroad 15.7
railway 15.7
industrial 15.4
steel 15.2
vintage 14.1
smoke 13.9
power 13.4
metal 12.9
sky 12.8
engine 12.6
pollution 12.5
track 12.4
structure 12.3
uniform 11.4
percussion instrument 11.4
wheeled vehicle 11.3
historic 11
black 10.8
urban 10.5
energy 10.1
military uniform 9.9
coal 9.8
vacation 9.8
sculpture 9.7
plant 9.6
vehicle 9.5
construction 9.4
iron 9.3
street 9.2
tower 8.9
musical instrument 8.9
tie 8.9
chimney 8.8
pipe 8.7
military 8.7
antique 8.7
container 8.6
cityscape 8.5
engineer 8.5
historical 8.5
statue 8.4
landscape 8.2
rail 7.9
turbine 7.8
station 7.7
fuel 7.7
device 7.7
house 7.6
gun 7.5
boiler 7.5
tourism 7.4
brace 7.4
technology 7.4
town 7.4
landmark 7.2
center 7.2
tank 7.1
river 7.1
cannon 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

sky 99.5
bench 99.2
outdoor 98.9
person 92.8
wooden 63
park 61.7
crowd 0.6

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Male, 98.6%
Calm 89.8%
Surprised 8.2%
Fear 6.2%
Sad 2.7%
Happy 1.3%
Disgusted 1.2%
Angry 1%
Confused 0.8%

AWS Rekognition

Age 40-48
Gender Male, 98.8%
Calm 95.2%
Surprised 6.5%
Fear 6%
Sad 2.7%
Happy 0.8%
Angry 0.6%
Disgusted 0.4%
Confused 0.4%

AWS Rekognition

Age 18-24
Gender Female, 63.2%
Calm 42.3%
Sad 26.1%
Confused 14.3%
Disgusted 12.8%
Fear 8%
Surprised 7.2%
Happy 2.6%
Angry 1.9%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Calm 37.4%
Surprised 31.8%
Fear 13.3%
Sad 7.8%
Angry 7.6%
Happy 3.9%
Disgusted 2.9%
Confused 2.7%

AWS Rekognition

Age 23-31
Gender Male, 95.7%
Calm 91.9%
Surprised 8.2%
Fear 5.9%
Happy 3.3%
Sad 2.2%
Disgusted 0.6%
Angry 0.5%
Confused 0.1%

AWS Rekognition

Age 30-40
Gender Female, 56.5%
Calm 96.4%
Surprised 6.7%
Fear 5.9%
Sad 2.4%
Angry 0.8%
Happy 0.5%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 22-30
Gender Male, 94.5%
Calm 69.9%
Surprised 11.9%
Sad 11.4%
Fear 6.3%
Disgusted 4.8%
Angry 2.1%
Happy 1.3%
Confused 0.4%

Feature analysis

Amazon

Person 99.1%
Adult 98.9%
Male 98.9%
Man 98.9%
Car 89.3%
Wheel 71.5%
Shoe 66%
Hat 55.4%