Human Generated Data

Title

Untitled (Camden, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1419

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Machine 99.3
Spoke 99.3
Adult 99
Male 99
Man 99
Person 99
Wheel 96.6
Wheel 96.4
Adult 94
Male 94
Man 94
Person 94
Alloy Wheel 93.5
Car Wheel 93.5
Tire 93.5
Transportation 93.5
Vehicle 93.5
Car 93
Clothing 83.5
Footwear 83.5
Shoe 83.5
Animal 80.5
Horse 80.5
Mammal 80.5
Worker 76
Antique Car 75.3
Model T 75.3
Person 73
Face 70
Head 70
Hat 57.9
City 57.4
Road 57.4
Street 57.4
Urban 57.4
Shorts 57.3
Coat 56.9
Outdoors 56.6
Photography 55.8
Portrait 55.8
Architecture 55.6
Building 55.6
Shelter 55.6
Axle 55.2
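
The number after each tag is the service's confidence score, in percent. As an illustrative sketch only (this record does not document the museum's actual pipeline), labels like these can be produced with the AWS Rekognition detect_labels API via boto3; the file name and confidence threshold below are assumptions, not details from this record.

    import boto3

    rekognition = boto3.client("rekognition")

    # "photo.jpg" is a placeholder; the record does not name the image asset.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff, roughly the lowest score listed above
        )

    # Print "Name Confidence" pairs in the same shape as the tag list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')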

Clarifai
created on 2018-05-11

people 99.9
vehicle 99.8
adult 99.4
transportation system 99
one 98.1
man 98
military 96.5
driver 96.3
group together 95
car 94.2
two 92.6
outfit 92.6
soldier 90.3
wear 90
administration 89.1
war 88.9
military vehicle 88.7
leader 87.6
uniform 86.2
police 86

Imagga
created on 2023-10-06

model t 100
car 100
motor vehicle 100
vehicle 51.8
wheeled vehicle 36.7
transportation 34.1
auto 33.5
road 30.7
automobile 30.6
wheel 29.3
drive 26.5
transport 23.7
driving 18.4
old 18.1
tire 16.9
speed 16.5
motor 16.5
truck 15.7
machine 15.5
travel 13.4
snow 13
wheels 11.7
adult 11.6
fast 11.2
outdoors 11.2
winter 11.1
sport 10.7
outdoor 10.7
driver 10.7
engine 10.6
luxury 10.3
man 10.1
roadster 9.9
sky 9.6
antique 9.5
people 9.5
person 9.5
power 9.2
city 9.1
vintage 9.1
black 9
equipment 9
style 8.9
tires 8.9
working 8.8
cars 8.8
accident 8.8
machinery 8.8
broken 8.7
work 8.6
cold 8.6
race 8.6
dirt 8.6
design 8.4
street 8.3
danger 8.2
rural 7.9
grass 7.9
tractor 7.9
automotive 7.8
model 7.8
summer 7.7
expensive 7.7
traffic 7.6
fashion 7.5
landscape 7.4
classic 7.4
sports 7.4
land 7.4
yellow 7.3
industrial 7.3
dirty 7.2
activity 7.2
male 7.1

Microsoft
created on 2018-05-11

outdoor 99.3
man 91.6
person 91.6
black 82.5
white 77.8
old 74.1

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 100%
Calm 96.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 2%
Angry 0.7%
Disgusted 0.2%
Happy 0%
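
A minimal sketch of how a block like this could be reproduced with Rekognition's detect_faces call; requesting Attributes=["ALL"] returns the age range, gender, and per-emotion confidences shown above. The file name is again a placeholder assumption.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
        # Emotions arrive unsorted; sort highest-first to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')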

Microsoft Cognitive Services

Age 41
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
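
The likelihood labels above correspond to the Likelihood enum returned by Google Cloud Vision face detection. A sketch using the google-cloud-vision client, under the same placeholder-file assumption:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        # Enum names such as VERY_UNLIKELY render as "Very unlikely" above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)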

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Wheel 96.6%
Car 93%
Shoe 83.5%
Horse 80.5%

Categories

Imagga

cars vehicles 91%
paintings art 7.2%

Text analysis

Amazon

KALET
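
A detected string like this could come from Rekognition's detect_text call; a sketch under the same placeholder-file assumption:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Report whole LINE detections rather than the individual WORD detections.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])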