Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.571

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Shirt 100
Accessories 99.8
Formal Wear 99.8
Tie 99.8
Face 99.2
Head 99.2
Photography 99.2
Portrait 99.2
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99
Adult 99
Male 99
Man 99
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 97.5
Adult 97.5
Male 97.5
Man 97.5
Person 97.4
Adult 97.4
Bride 97.4
Female 97.4
Wedding 97.4
Woman 97.4
Person 96.2
Sun Hat 94.7
Machine 87.3
Wheel 87.3
Person 82.1
Adult 82.1
Male 82.1
Man 82.1
Hat 72.3
Outdoors 66.7
Cap 63.4
Transportation 57.7
Vehicle 57.7
Shorts 57.2
Baseball Cap 57.2
Hardhat 56.2
Helmet 56.2
Suit 56.1
Railway 56.1
Art 56
Vest 55.6
Blouse 55.4
Painting 55.3
T-Shirt 55.1
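
The labels above pair a name with a 0-100 confidence score, which is the shape of the output AWS Rekognition's DetectLabels API returns. The sketch below shows how tags of this kind could be requested with boto3; the bucket name, object key, and the MaxLabels/MinConfidence values are placeholders, not the settings actually used to generate this record.

```python
# Sketch: retrieving object/scene labels with AWS Rekognition (boto3).
# Bucket, key, and thresholds are illustrative placeholders only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_county_fair.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,
)

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score,
    # matching the "label  score" pairs listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```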

Clarifai
created on 2018-05-11

people 100
group together 98.9
adult 98.1
group 98.1
vehicle 97.2
man 96.7
transportation system 96.1
lid 95.8
several 93.9
veil 93.5
three 93.2
wear 93
four 90.6
five 89.6
two 88.4
uniform 86.3
watercraft 86
military 85.5
many 84.9
administration 84.2

Imagga
created on 2023-10-05

vehicle 93.2
military vehicle 57.1
tracked vehicle 56.1
half track 55.9
conveyance 35.2
wheeled vehicle 35.1
car 26.1
machine 21.9
tank 20.8
transportation 20.6
industry 19.6
tractor 18.4
industrial 18.2
power 17.6
work 17.5
transport 17.3
wheel 17.1
man 16.1
metal 15.3
bulldozer 14.7
old 14.6
steel 14.1
factory 13.9
machinery 13.7
engine 13.5
auto 12.4
drive 12.3
steamroller 12
construction 12
equipment 11.8
military 11.6
armored vehicle 11.4
road 10.8
outdoor 10.7
automobile 10.5
heavy 10.5
uniform 10.3
iron 10.3
sky 10.2
male 9.9
truck 9.8
working 9.7
dirt 9.5
rusty 9.5
protection 9.1
environment 9
dirty 9
wreck 9
weapon 9
outdoors 9
job 8.8
sand 8.8
building 8.7
war 8.7
broken 8.7
extreme 8.6
yellow 8.6
energy 8.4
land 8.3
safety 8.3
tire 8.3
danger 8.2
farm 8
grass 7.9
device 7.8
destruction 7.8
people 7.8
travel 7.7
summer 7.7
rust 7.7
track 7.7
engineering 7.6
gun 7.5
person 7.3
adult 7.1
engineer 7
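
The Imagga rows likewise pair an English tag with a 0-100 confidence. A minimal sketch against Imagga's public v2 tagging endpoint follows; the endpoint path, parameters, and response shape are assumptions based on Imagga's published API, and the credentials and image URL are placeholders.

```python
# Sketch: requesting tags from Imagga's v2 tagging endpoint.
# Endpoint, parameters, and response shape are assumptions based on
# Imagga's public API docs; credentials and URL are placeholders.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/shahn_county_fair.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence,
    # matching the list above.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```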

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 97.4
person 89.3
people 74.6
group 60.3

Color Analysis

Face analysis

AWS Rekognition

Age 57-65
Gender Male, 100%
Happy 83.5%
Surprised 7.4%
Fear 6.6%
Calm 3.3%
Confused 3.2%
Sad 3.1%
Disgusted 2%
Angry 1.5%

AWS Rekognition

Age 43-51
Gender Male, 99.4%
Calm 82%
Sad 14.7%
Surprised 6.4%
Fear 6%
Angry 1.7%
Confused 0.5%
Happy 0.5%
Disgusted 0.3%

AWS Rekognition

Age 21-29
Gender Female, 99.8%
Surprised 99.6%
Fear 6.3%
Sad 2.2%
Calm 1.1%
Confused 0.2%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%
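
The age ranges, gender values, and emotion percentages above correspond to the FaceDetails structure that AWS Rekognition's DetectFaces API returns when the full attribute set is requested. A sketch under that assumption, with a placeholder image location:

```python
# Sketch: per-face age range, gender, and emotions via AWS Rekognition.
# The S3 location is a placeholder; Attributes=["ALL"] requests the full
# attribute set, including AgeRange, Gender, and Emotions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_county_fair.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]       # e.g. {"Low": 57, "High": 65}
    gender = face["Gender"]      # e.g. {"Value": "Male", "Confidence": 100.0}
    # Emotions come back unordered; sort by confidence to list the dominant one first.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```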

Microsoft Cognitive Services

Age 66
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
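
Unlike the Rekognition results, the Google Vision rows report likelihood buckets (very unlikely through very likely) rather than percentages, matching the face_annotations returned by the Cloud Vision client library's face detection feature. A sketch under that assumption, with a placeholder file path:

```python
# Sketch: face likelihoods with the Google Cloud Vision client library.
# The file path is a placeholder; each annotation exposes likelihood
# enums (joy, sorrow, anger, surprise, headwear, blur) rather than scores.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_county_fair.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```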

Feature analysis

Amazon

Person 99.1%
Adult 99.1%
Male 99.1%
Man 99.1%
Bride 97.4%
Female 97.4%
Woman 97.4%
Wheel 87.3%
Hat 72.3%

Categories