Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.909

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 99
Male 99
Man 99
Person 99
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Person 98
Person 98
Person 98
Person 97.9
Person 97.5
Clothing 97.2
Hat 97.2
Person 97.1
Outdoors 97.1
Person 96.6
Backyard 95.5
Nature 95.5
Yard 95.5
Fence 93
Adult 91.9
Male 91.9
Man 91.9
Person 91.9
Garden 89
Person 86.8
Person 85.8
People 85.1
Car 81.3
Transportation 81.3
Vehicle 81.3
Wood 76.1
Car 75
Car 73.4
Car 71.2
Architecture 70.9
Building 70.9
Factory 70.9
Person 68.6
Car 57.6
Construction 56.7
Gardening 56.2
Carpenter 55.6
Manufacturing 55.6
Worker 55.6
Utility Pole 55.3
Gardener 55.1
Firearm 55
Gun 55
Rifle 55
Weapon 55
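
The numbers beside each tag are confidence scores on a 0-100 scale. A minimal sketch of how labels like these can be generated with AWS Rekognition's detect_labels call via boto3; the file name, region, and confidence threshold are illustrative assumptions, not details taken from this record:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("shahn_p1970_909.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# MinConfidence=55 mirrors the lowest score in the list above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated names above (e.g. several Person scores) correspond to
    # separate detected instances, reported per label under "Instances".
    for instance in label.get("Instances", []):
        print(f"  instance {instance['Confidence']:.1f}")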

Clarifai
created on 2018-05-11

people 100
adult 99.5
group 99.4
group together 99.1
man 97.8
many 97.6
military 95.9
war 93.7
vehicle 92.8
administration 91.7
transportation system 91.5
soldier 90.4
woman 90.2
watercraft 90.1
wear 87.3
several 87.1
three 85.4
two 85
five 83.9
four 83.6

Imagga
created on 2023-10-07

container 29.5
factory 20.4
city 20
architecture 19.5
building 18.1
ashcan 17.5
sky 17.2
industrial 16.3
industry 15.4
bin 14.9
landscape 14.9
steel 14.5
house 13.4
milk can 13
plant 12.9
water 12.7
power 12.6
old 12.5
pollution 12.5
urban 12.2
structure 12
energy 11.8
fuel 11.6
construction 11.1
transportation 10.8
vessel 10.7
turbine 10.2
can 10.1
travel 9.9
history 9.8
steam 9.7
gas 9.6
turnstile 9.5
gate 9.4
iron 9.3
church 9.2
street 9.2
machine 9.2
transport 9.1
environment 9
metal 8.8
scene 8.7
skyline 8.5
tank 8.2
boiler 8.2
rural 7.9
refinery 7.9
disaster 7.8
equipment 7.8
pipe 7.8
death 7.7
device 7.7
concrete 7.7
cityscape 7.6
chimney 7.5
smoke 7.4
technology 7.4
station 7.2
sea 7

Microsoft
created on 2018-05-11

sky 99.5
outdoor 98.6

Face Analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 99.6%
Calm 83.4%
Disgusted 10.2%
Surprised 6.5%
Fear 6.1%
Sad 3%
Confused 1.9%
Happy 0.7%
Angry 0.4%

AWS Rekognition

Age 38-46
Gender Male, 95.1%
Happy 46.6%
Calm 45.8%
Surprised 8.3%
Fear 5.9%
Sad 2.5%
Confused 1%
Disgusted 0.9%
Angry 0.9%

AWS Rekognition

Age 34-42
Gender Female, 67.1%
Calm 95.4%
Surprised 8.1%
Fear 5.9%
Sad 2.3%
Angry 0.5%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%

AWS Rekognition

Age 33-41
Gender Male, 92.6%
Calm 80.2%
Fear 9.1%
Surprised 7.4%
Sad 3.2%
Happy 2.9%
Disgusted 2.2%
Confused 1.6%
Angry 1%

AWS Rekognition

Age 6-14
Gender Female, 87%
Calm 79.5%
Sad 9%
Surprised 6.8%
Fear 6.1%
Confused 3.5%
Happy 2.2%
Disgusted 1.4%
Angry 1.2%

AWS Rekognition

Age 26-36
Gender Male, 95.1%
Calm 99.4%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Angry 0%

AWS Rekognition

Age 18-26
Gender Male, 92.9%
Calm 48.1%
Confused 24.2%
Disgusted 7.9%
Surprised 6.8%
Happy 6.7%
Fear 6.7%
Sad 6.7%
Angry 1.6%

AWS Rekognition

Age 28-38
Gender Male, 97.2%
Calm 92.4%
Surprised 10.6%
Fear 5.9%
Sad 2.2%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%
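
Each block above describes one detected face: an estimated age range, a gender guess with its confidence, and per-emotion confidences. Because each emotion is scored independently, the values within a block need not sum to 100%. A minimal sketch of retrieving such results with Rekognition's detect_faces call via boto3, under the same hypothetical file-name and region assumptions as the label example:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("shahn_p1970_909.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")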

Feature Analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Car 81.3%
