Human Generated Data

Title

Untitled (Lower East Side, New York City)

Date

April 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2916

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Overcoat 99.1
Person 98.7
Person 98.4
Car 97.8
Transportation 97.8
Vehicle 97.8
Machine 92.2
Wheel 92.2
Wheel 86.9
Hat 85.5
License Plate 83.9
Person 83.6
Face 83.1
Head 83.1
Alloy Wheel 72.2
Car Wheel 72.2
Spoke 72.2
Tire 72.2
Person 62.4
Formal Wear 62.1
Suit 62.1
Hat 60.9
Animal 57.7
Horse 57.7
Mammal 57.7
Footwear 57.5
Shoe 57.5
Cap 57.3
City 55.9
Accessories 55.4
Bag 55.4
Handbag 55.4

Clarifai
created on 2018-05-10

people 100
group together 99.5
adult 99.2
vehicle 98.9
group 98.8
many 98.3
administration 97.1
man 96.9
one 96.2
outfit 95.2
wear 95.1
several 95
transportation system 94.1
street 93.9
two 93.4
military 93
leader 92.3
war 88.9
police 88.9
woman 88.6

Imagga
created on 2023-10-06

man 26.9
elephant 23.7
clothing 23.5
car 19.4
uniform 17.4
people 15.6
covering 13.8
male 13.6
person 13
vehicle 12.8
pedestrian 12.6
garment 12.4
men 12
danger 11.8
transportation 11.6
military uniform 11.6
wheel 11.5
urban 11.4
black 11.3
sport 11.1
protection 10.9
city 10.8
walking 10.4
mask 10.1
safety 10.1
street 10.1
road 9.9
outdoors 9.8
auto 9.6
power 9.2
military 8.7
travel 8.4
wet suit 8.3
weapon 8.2
mammal 8.1
activity 8.1
automobile 7.7
outdoor 7.6
dangerous 7.6
drive 7.6
consumer goods 7.5
transport 7.3
metal 7.2
active 7.2
women 7.1
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 97.2
person 97.1
old 78.9
vintage 28.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 53-61
Gender Male, 98.5%
Happy 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Calm 0%
Disgusted 0%
Confused 0%

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Car 97.8%
Wheel 92.2%
Hat 85.5%
Horse 57.7%
Shoe 57.5%

Categories

Text analysis

Amazon

Swe
GROW!
EXCHANCE
TER & GGGS
and