Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1940

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.2
Hat 99.2
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Person 99
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Person 98.2
Person 98.2
Person 97.8
Person 97.4
Person 97.3
Person 97.2
Person 96.9
Adult 95.4
Male 95.4
Man 95.4
Person 95.4
People 94.9
Wood 91.3
Outdoors 90.4
Person 87.7
Person 86.2
Car 82.9
Transportation 82.9
Vehicle 82.9
Car 81.7
Car 77.9
Car 77.4
Person 74.3
Car 72.8
Nature 71.3
Person 70.9
Architecture 65.1
Building 65.1
Factory 65.1
Person 64.3
Head 63.5
Footwear 58.8
Shoe 58.8
Fence 58
Worker 57.7
Backyard 57.6
Yard 57.6
Pants 57
Weapon 57
Construction 56.9
Manufacturing 55.8
Cap 55.3
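
The Amazon tag list above matches the shape of output from AWS Rekognition's DetectLabels API: label names with confidence scores, filtered at a minimum confidence around 55. A minimal sketch of reproducing such a list with boto3 (the S3 bucket and key names here are hypothetical, and running it requires AWS credentials):

```python
def format_labels(response, min_confidence=55.0):
    """Flatten a DetectLabels response into 'Name score' lines,
    approximately matching the tag list above (one decimal place)."""
    lines = []
    for label in response.get("Labels", []):
        if label["Confidence"] >= min_confidence:
            lines.append(f"{label['Name']} {round(label['Confidence'], 1)}")
    return lines

if __name__ == "__main__":
    import boto3  # AWS SDK for Python; needs configured credentials
    client = boto3.client("rekognition")
    resp = client.detect_labels(
        # Hypothetical S3 location for the photograph
        Image={"S3Object": {"Bucket": "my-bucket", "Name": "shahn_auction.jpg"}},
        MinConfidence=55.0,
    )
    print("\n".join(format_labels(resp)))
```

DetectLabels returns each label once per detection category, which is why names like "Person" and "Car" repeat at different confidences in the list above.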

Clarifai
created on 2018-05-11

people 100
adult 99
group 98.7
group together 98.7
man 98.1
many 94.8
military 94.5
war 93.4
woman 90.3
soldier 89.9
wear 88
transportation system 87.7
administration 86.7
watercraft 85.1
uniform 84.4
vehicle 84.1
five 80
several 78.6
three 74
four 72.3

Imagga
created on 2023-10-06

container 44.6
ashcan 26.7
factory 24.1
bin 23.2
milk can 23.1
city 19.1
can 18.8
architecture 18.7
building 17.3
steel 17.1
vessel 16.7
industrial 16.3
industry 16.2
old 16
boiler 15.7
metal 14.5
structure 14.3
house 14.2
sky 13.4
urban 13.1
iron 13.1
machine 12.8
energy 12.6
power 12.6
travel 12
landscape 11.9
transportation 11.7
steam 11.6
fuel 11.6
plant 11.4
street 11
history 10.7
pipe 10.7
water 10.7
turbine 10.4
construction 10.3
transport 10
refinery 9.9
cemetery 9.8
gas 9.6
pollution 9.6
device 9.6
town 9.3
environment 9
antique 8.8
engine 8.7
train 8.7
scene 8.7
skyline 8.5
tank 8.5
black 8.4
oil 8.4
vintage 8.3
technology 8.2
production 7.8
station 7.7
rust 7.7
grunge 7.7
smoke 7.4
work 7.4
church 7.4
exterior 7.4
new 7.3
chimney 7.2
landmark 7.2
to 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

sky 99.8
outdoor 97.4
person 85.4
wooden 61.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 53.9%
Calm 96.1%
Surprised 7.9%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Happy 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Male, 93.6%
Calm 82.2%
Happy 13%
Surprised 6.7%
Fear 5.9%
Sad 2.5%
Angry 1.4%
Confused 0.7%
Disgusted 0.4%

AWS Rekognition

Age 13-21
Gender Male, 58.8%
Calm 74.6%
Confused 16.6%
Fear 6.8%
Surprised 6.6%
Sad 2.9%
Happy 1.3%
Disgusted 1.3%
Angry 1%

AWS Rekognition

Age 13-21
Gender Female, 56.9%
Surprised 75.8%
Calm 28.7%
Fear 16.9%
Happy 2.6%
Sad 2.6%
Disgusted 2.3%
Confused 1.7%
Angry 1.1%

AWS Rekognition

Age 24-34
Gender Male, 98.9%
Calm 95%
Surprised 6.8%
Fear 5.9%
Sad 2.3%
Happy 2.1%
Disgusted 0.5%
Angry 0.4%
Confused 0.3%

AWS Rekognition

Age 16-24
Gender Male, 85.3%
Calm 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 0.2%
Happy 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 23-31
Gender Male, 96.4%
Calm 53.4%
Surprised 47.6%
Happy 7.8%
Fear 6.2%
Sad 3%
Disgusted 2.5%
Angry 1.8%
Confused 0.9%
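
Each face block above corresponds to one FaceDetail from Rekognition's DetectFaces API called with `Attributes=["ALL"]`, which adds age range, gender, and emotion estimates to the response. A sketch of rendering one detail in the format shown (bucket and key names are hypothetical):

```python
def summarize_face(face_detail):
    """Render one Rekognition FaceDetail as the age/gender/emotion
    lines shown above (emotions sorted by confidence, descending)."""
    age = face_detail["AgeRange"]
    gender = face_detail["Gender"]
    lines = [
        f"Age {age['Low']}-{age['High']}",
        f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%",
    ]
    for emo in sorted(face_detail["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines

if __name__ == "__main__":
    import boto3  # needs configured AWS credentials
    client = boto3.client("rekognition")
    resp = client.detect_faces(
        # Hypothetical S3 location for the photograph
        Image={"S3Object": {"Bucket": "my-bucket", "Name": "shahn_auction.jpg"}},
        Attributes=["ALL"],  # required for AgeRange, Gender, Emotions
    )
    for face in resp["FaceDetails"]:
        print("\n".join(summarize_face(face)))
```

Note that emotion confidences are independent scores rather than a distribution, which is why they can sum to more than 100% in the blocks above.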

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Car 82.9%
Shoe 58.8%

Categories