Human Generated Data

Title

Untitled (levee workers, Plaquemines, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1531

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Male 98.8
Person 98.8
Boy 98.8
Child 98.8
Person 98.8
Animal 95
Horse 95
Mammal 95
Horse 94.3
Person 92.4
Horse 87.4
Person 85.4
Face 78
Head 78
Outdoors 77.3
Nature 73.8
Transportation 55.9
Vehicle 55.9
Wagon 55.9
Bull 55.9
Donkey 55.8
Countryside 55.7

Clarifai
created on 2018-05-11

people 100
adult 99.3
cavalry 99.2
group together 99.1
group 99
vehicle 98.7
military 98.6
man 98.2
transportation system 97.9
war 97.1
many 96.5
soldier 94.7
mammal 94.5
two 93.7
four 93.6
several 93.1
seated 91.8
three 91
weapon 90.3
skirmish 89.5

Imagga
created on 2023-10-05

plow 92.1
tool 68.3
horse 39.8
horse cart 34.8
cart 33.9
cowboy 31.8
animal 27.3
wagon 25.3
farm 24.1
rural 22
grass 20.6
horses 20.5
mountain 19.6
outdoor 19.1
outdoors 17.9
landscape 17.9
travel 17.6
summer 16.7
animals 16.7
sport 16.5
field 15.9
brown 15.5
active 15.3
ranch 15
mammal 15
agriculture 14.9
man 14.8
backpack 14.7
cattle 14.6
hiking 14.4
wheeled vehicle 14.4
cow 14.4
adventure 14.2
mountains 13.9
countryside 12.8
male 12.8
equine 12.8
carriage 12.7
tourist 12.7
riding 12.7
farming 12.3
sky 12.1
people 11.7
ride 11.6
tourism 11.6
country 11.4
journey 11.3
trekking 10.8
herd 10.8
activity 10.8
wild 10.5
trek 9.8
group 9.7
laborer 9.6
harness 9.6
pasture 9.6
dirt 9.6
hill 9.4
speed 9.2
leisure 9.1
old 9.1
bull 9
meadow 9
cows 8.9
grazing 8.8
dairy 8.8
vehicle 8.7
two 8.5
clouds 8.5
park 8.2
scenic 7.9
horseback 7.9
hiker 7.9
agricultural 7.8
farmland 7.7
livestock 7.7
head 7.6
action 7.4
saddle 7.4
recreation 7.2
farmer 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
grass 97.1
transport 86.2
old 70.3
pulling 33.4
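
The tag lists above are label/confidence pairs, with confidence given as a percentage. As a minimal sketch (not part of the original record), the pairs can be filtered by a confidence threshold; the sample values below are taken from the Amazon list above:

```python
# Hypothetical sketch: filter machine-generated tags by confidence.
# Sample (label, confidence) pairs copied from the Amazon list above.
tags = [
    ("Adult", 98.8), ("Horse", 95.0), ("Outdoors", 77.3),
    ("Wagon", 55.9), ("Countryside", 55.7),
]

def confident_tags(pairs, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))  # high-confidence labels only
```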

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 99.7%
Calm 63%
Confused 25.9%
Surprised 7.9%
Fear 6.2%
Sad 3.4%
Angry 2%
Disgusted 1.7%
Happy 0.5%

AWS Rekognition

Age 24-34
Gender Male, 99.2%
Calm 76.8%
Surprised 6.8%
Fear 6.4%
Happy 5.9%
Angry 5.8%
Sad 3.7%
Confused 2.9%
Disgusted 2.4%

Microsoft Cognitive Services

Age 31
Gender Male
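
Each face-analysis block lists per-emotion confidence scores; these are independent confidences rather than a probability distribution (the first block's scores sum to roughly 110%, not 100%). A minimal sketch of picking the dominant emotion, using the scores from the first AWS Rekognition result above:

```python
# Hypothetical sketch: pick the dominant emotion from a face-analysis block.
# Scores (percent) copied from the first AWS Rekognition result above.
emotions = {
    "Calm": 63.0, "Confused": 25.9, "Surprised": 7.9, "Fear": 6.2,
    "Sad": 3.4, "Angry": 2.0, "Disgusted": 1.7, "Happy": 0.5,
}

# The emotion with the highest confidence score.
dominant = max(emotions, key=emotions.get)
print(dominant)
```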

Feature analysis

Amazon

Adult 98.8%
Male 98.8%
Man 98.8%
Person 98.8%
Boy 98.8%
Child 98.8%
Horse 95%