Human Generated Data

Title

Untitled (Natchez, Mississippi)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1510

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99
Male 99
Man 99
Person 99
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Machine 96.5
Wheel 96.5
Wheel 95.5
Wheel 94.5
War 89.5
Face 78.5
Head 78.5
Clothing 77.4
Footwear 77.4
Shoe 77.4
Shoe 76.9
Transportation 74.8
Vehicle 74.8
Shoe 62.5
Outdoors 60.8
Truck 60.7
Shoe 59.4
Tire 57.1
Photography 56.7
Wagon 56.4
Worker 55.6
Spoke 55.5
Shorts 55.3
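
The Amazon tags above follow the output shape of the AWS Rekognition DetectLabels API, which returns a label name and a 0-100 confidence score per detection; repeated rows such as the two Adult or three Wheel entries correspond to multiple detected instances of the same label. The snippet below is a minimal sketch of how tags like these can be generated with boto3; the filename natchez_1935.jpg and the MaxLabels/MinConfidence parameters are illustrative assumptions, not values published with this record.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; the record does not
# include the request parameters actually used.
with open("natchez_1935.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # assumed cap
    MinConfidence=55.0,  # assumed floor; the lowest score above is 55.3
)

for label in response["Labels"]:
    # Repeated entries in the listing above come from per-instance
    # detections, available under label["Instances"].
    print(f"{label['Name']} {label['Confidence']:.1f}")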

Clarifai
created on 2018-05-11

people 99.9
vehicle 99.9
group together 99.5
transportation system 99.5
war 99.2
group 99
military 98.9
adult 98.6
man 96.8
soldier 96.4
many 95.6
driver 95.3
cart 95.1
two 94.5
gun 93.6
several 93.2
four 92.9
skirmish 92.8
weapon 92.4
military vehicle 91.8

Imagga
created on 2023-10-06

horse cart 100
cart 100
wagon 100
wheeled vehicle 75.2
vehicle 61.8
transportation 31.4
old 24.4
horse 23.7
transport 22.8
wheel 21.7
carriage 21.5
grass 19.8
farm 19.6
truck 18.4
rural 17.6
machine 17.2
road 16.3
hay 13.7
machinery 13.6
oxcart 13.3
animal 12.8
sky 12.8
wheels 12.7
tractor 11.8
outdoor 11.5
dirt 11.5
agriculture 11.4
car 11.4
antique 11.3
work 11
travel 10.6
landscape 9.7
architecture 9.4
historic 9.2
vintage 9.1
industrial 9.1
summer 9
equipment 9
working 8.8
tire 8.8
driver 8.7
building 8.7
military 8.7
war 8.7
field 8.4
tourism 8.3
danger 8.2
army 7.8
cargo 7.8
automobile 7.7
auto 7.7
rusty 7.6
farming 7.6
historical 7.5
city 7.5
outdoors 7.5
land 7.4
street 7.4
tourist 7.2
dirty 7.2
history 7.2
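
Imagga's tags are served by its REST API (v2 /tags endpoint) and scored on the same 0-100 scale. A minimal sketch with the requests library; the API key, secret, and image URL are placeholders, not credentials or locations tied to this record.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.com/natchez_1935.jpg"  # hypothetical URL

# Imagga uses HTTP Basic auth with the key/secret pair.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))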

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
road 98.8
truck 97.2
cart 96
man 93.1
carriage 91.1
drawn 91
pulling 86.9
horse-drawn vehicle 61.1
military vehicle 55.9
trailer 18.7

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Happy 87.6%
Calm 9.2%
Surprised 6.7%
Fear 6%
Sad 2.3%
Confused 0.5%
Angry 0.5%
Disgusted 0.5%

AWS Rekognition

Age 34-42
Gender Male, 99.4%
Calm 91.2%
Surprised 7%
Fear 5.9%
Sad 3%
Angry 2.3%
Confused 1.2%
Disgusted 0.8%
Happy 0.5%
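
The two AWS Rekognition blocks above, one per detected face, each report an estimated age range, a gender call with confidence, and a ranked emotion distribution. A minimal boto3 sketch that prints results in the same shape, again assuming a hypothetical local filename:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("natchez_1935.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions,
# not just the default bounding-box summary.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; rank them as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
    print()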

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very likely
Blurred Very unlikely
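
Unlike Rekognition's percentage scores, Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY), which is why the block above reads "Very unlikely" and "Likely" rather than numbers. A sketch using the google-cloud-vision client library, assuming the same hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("natchez_1935.jpg", "rb") as f:  # hypothetical filename
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a numeric confidence.
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)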

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Wheel 96.5%
Shoe 77.4%
Truck 60.7%

Categories

Imagga

cars vehicles 69.2%
paintings art 29.6%