Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.709

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 99.9
Face 99.9
Head 99.9
Photography 99.9
Portrait 99.9
Person 99.4
Adult 99.4
Female 99.4
Woman 99.4
Person 98.6
Adult 98.6
Male 98.6
Man 98.6
Machine 98.4
Wheel 98.4
Accessories 96.4
Sunglasses 96.4
Person 93.1
Hat 87.9
Cap 85
Amusement Park 84.6
Hat 79.1
Art 71.5
Painting 67.9
Car 59.8
Transportation 59.8
Vehicle 59.8
Carousel 57.1
Play 57.1
Baseball Cap 57
Fun 56.8
Theme Park 56.8
T-Shirt 56.3
Sun Hat 56.2
Architecture 55.5
Building 55.5
Factory 55.5
Manufacturing 55.5
Drawing 55.2

Clarifai
created on 2018-05-11

people 100
group 99.2
adult 99.2
man 98.5
group together 98.2
two 97.8
vehicle 97.6
lid 97.5
veil 96.8
many 96.6
watercraft 96.5
three 96.5
several 95.5
transportation system 95.3
one 95.1
wear 95
leader 93.9
four 91.3
woman 89.6
military 86.5

Imagga
created on 2023-10-07

industry 25.6
industrial 23.6
machine 22.5
metal 20.9
vehicle 20.6
work 20.5
man 19.5
steel 18.5
worker 16.9
old 16
power 15.1
factory 15
building 14.4
job 14.1
working 14.1
construction 13.7
equipment 13
male 12.8
iron 12.4
machinist 12.3
transportation 11.6
engineer 11.6
car 11.4
wheel 11.4
device 10.7
technology 10.4
vessel 10.1
person 10.1
people 10
boiler 10
dirty 9.9
engine 9.6
architecture 9.4
container 9
manufacturing 8.8
military 8.7
concrete 8.6
men 8.6
engineering 8.6
machinery 8.4
safety 8.3
transport 8.2
light 8
tank 7.9
business 7.9
helmet 7.8
labor 7.8
mechanical 7.8
travel 7.7
war 7.7
repair 7.7
grunge 7.7
heavy 7.6
gear 7.6
energy 7.6
fire 7.5
tractor 7.3
protection 7.3
danger 7.3
track 7.2
wheeled vehicle 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.7
outdoor 88.8
old 67.6

Face analysis

AWS Rekognition

Age 59-69
Gender Male, 94.3%
Happy 94.9%
Surprised 7.6%
Fear 6.1%
Sad 2.2%
Calm 0.9%
Disgusted 0.4%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 59-67
Gender Male, 99.9%
Happy 73.6%
Calm 11.3%
Confused 8.3%
Surprised 7%
Fear 6.3%
Sad 2.7%
Disgusted 1.4%
Angry 1.3%

AWS Rekognition

Age 30-40
Gender Male, 98.7%
Confused 61.6%
Fear 17.3%
Angry 9.3%
Surprised 7.1%
Sad 3.5%
Calm 2.3%
Disgusted 2%
Happy 1.9%

Microsoft Cognitive Services

Age 88
Gender Female

Microsoft Cognitive Services

Age 78
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Female 99.4%
Woman 99.4%
Male 98.6%
Man 98.6%
Wheel 98.4%
Sunglasses 96.4%
Hat 87.9%
Car 59.8%