Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.556

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.8
Hardhat 99.8
Helmet 99.8
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Worker 90.5
Face 89
Head 89
Machine 77.7
Wheel 77.7
Wood 69.5
Transportation 68
Vehicle 68
Hat 65
Railway 61
Train 61
Outdoors 57.5
Baseball Cap 56.8
Cap 56.8
Construction 56.1
Photography 56
Portrait 56
Architecture 55.6
Building 55.6
Factory 55.6
Manufacturing 55.6
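
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how tags of this shape could be reproduced with boto3 follows; the local file name and the 55% confidence floor are assumptions, not values from this record.

```python
# Hypothetical reproduction of the Amazon label tags above; the local
# file name and the MinConfidence floor are assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_county_fair.jpg", "rb") as f:  # assumed local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score shown above is 55.6
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```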

Clarifai
created on 2018-05-11

people 99.9
adult 99.4
vehicle 99.2
one 99
group together 98
transportation system 97.5
man 96.7
two 96.3
three 93.9
driver 91
watercraft 90.8
administration 90.4
group 90.1
aircraft 89.4
four 88.9
war 88.2
leader 87.4
wear 85.8
military 84.2
woman 82.2
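
The concept/score pairs above resemble predictions from Clarifai's general image-recognition model. A hedged sketch against the public v2 predict endpoint is below; the access token, model id, and image URL are placeholders, and the exact endpoint shape may differ by API version.

```python
# Hypothetical call to Clarifai's v2 predict endpoint; PAT and
# IMAGE_URL are placeholders, not values from this record.
import requests

PAT = "YOUR_CLARIFAI_PAT"
IMAGE_URL = "https://example.org/shahn_county_fair.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```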

Imagga
created on 2023-10-06

factory 39.9
machine 39.3
industry 32.4
industrial 28.1
construction 27.4
steel 24.1
work 23.5
equipment 21.7
device 20.5
plant 20.4
man 20.2
power 19.3
sky 17.2
structure 17.2
vehicle 17.1
building 16.9
worker 15.1
machinist 14.8
machinery 14.8
engineer 14.4
building complex 14
transport 13.7
metal 13.7
heavy 13.4
truck 13.3
site 13.1
business 12.8
transportation 12.5
people 12.3
water 12
energy 11.8
environment 11.5
job 11.5
working 11.5
engineering 11.4
male 11.3
outdoors 11.2
men 11.2
landscape 11.2
old 11.1
person 11
tractor 10.9
thresher 10.6
fuel 10.6
builder 10.5
dirt 10.5
rock 10.4
crane 10.4
tool 10.2
safety 10.1
dirty 9.9
labor 9.7
pipe 9.7
trailer 9.7
farm machine 9.7
helmet 9.6
architecture 9.4
iron 9.3
power saw 9.2
outdoor 9.2
summer 9
ship 8.9
car 8.8
mechanic 8.8
port 8.7
gas 8.7
wheel 8.5
hot 8.4
oil 8.4
heat 8.3
engine 8.3
occupation 8.2
backhoe 7.9
grass 7.9
excavator 7.9
fisherman 7.9
loading 7.9
sea 7.8
mechanical 7.8
outside 7.7
fishing 7.7
pollution 7.7
vessel 7.6
boat 7.6
earth 7.3
yellow 7.3
road 7.2
rural 7
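
The tags above match the output of Imagga's /v2/tags endpoint, which returns English tag names with confidence scores. A minimal sketch follows; the API key, secret, and image URL are placeholders.

```python
# Hypothetical request to Imagga's tagging endpoint; credentials and
# IMAGE_URL are placeholders, not values from this record.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/shahn_county_fair.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key and secret
)

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```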

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.8
tree 99.7
sky 99.2
man 98
person 97.5
standing 78.3
old 60.7
boat 18.4
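
The tags above look like the tag output of Azure Computer Vision's Analyze Image operation. A hedged REST sketch is below; the endpoint, key, API version, and image URL are placeholders, and the 2018 tags were produced by an earlier version of the service.

```python
# Hypothetical call to Azure Computer Vision's analyze endpoint;
# ENDPOINT, KEY, and IMAGE_URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/shahn_county_fair.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)

for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```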

Color Analysis

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.6%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%
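
The age range, gender, and emotion scores above match the FaceDetails structure returned by Amazon Rekognition's DetectFaces operation. A minimal sketch follows; the local file name is an assumption.

```python
# Hypothetical reproduction of the AWS Rekognition face analysis above;
# the local file name is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_county_fair.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```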

Microsoft Cognitive Services

Age 56
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
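
The likelihood labels above (Very unlikely, Very likely, and so on) are the likelihood buckets returned by Google Cloud Vision face detection. A minimal sketch using the google-cloud-vision client follows; the local file name is an assumption.

```python
# Hypothetical reproduction of the Google Vision face analysis above;
# the local file name is an assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_county_fair.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)
```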

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Wheel 77.7%
Train 61%

Text analysis

Amazon

QUHAR
PATRAT
APPLIED
PATRAT APPLIED PORI
PORI
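
The fragments above are the kind of line and word detections returned by Amazon Rekognition's DetectText operation; because both LINE and WORD results are listed, the same fragment can appear more than once. A minimal sketch follows; the local file name is an assumption.

```python
# Hypothetical reproduction of the Amazon text detections above;
# the local file name is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_county_fair.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Each detection is tagged as a LINE or a WORD.
    print(f"{detection['Type']}: {detection['DetectedText']}")
```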

Google

OUHAI
OUHAI