Human Generated Data

Title

Untitled (Lower East Side, New York City)

Date

April 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2927

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Coat 100
Overcoat 99.7
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Person 98.6
Person 98.1
Car 97.7
Transportation 97.7
Vehicle 97.7
Machine 94.9
Spoke 94.9
Wheel 90.4
Alloy Wheel 87.7
Car Wheel 87.7
Tire 87.7
Face 82.3
Head 82.3
Wheel 77.7
Hat 70
Person 70
Cap 66.1
Footwear 60.8
Shoe 60.8
Adult 59.7
Male 59.7
Man 59.7
Person 59.7
License Plate 57.8
Funeral 55.7
Formal Wear 55.7
Suit 55.7
Antique Car 55.3
Accessories 55.3
Bag 55.3
Handbag 55.3
Sun Hat 55.1

Clarifai
created on 2018-05-10

people 100
group together 99
vehicle 99
group 98.7
adult 98.7
many 97.3
administration 97.1
street 97.1
man 96.4
transportation system 95.1
one 95.1
two 93.8
police 93.6
several 93.3
military 93
wear 92.3
outfit 91.6
war 90.7
car 89.9
monochrome 89.8

Imagga
created on 2023-10-05

man 24.2
people 20.6
city 17.4
black 17.1
clothing 16.3
male 15.9
car 15.2
adult 14.9
mask 14.9
urban 14.8
person 14.8
street 13.8
protection 13.6
danger 13.6
pedestrian 13
men 12
women 11.9
vehicle 11.5
safety 11
military 10.6
world 10.2
transportation 9.9
gas 9.6
automobile 9.6
wheel 9.6
day 9.4
power 9.2
outdoor 9.2
human 9
life 8.9
garment 8.9
soldier 8.8
business 8.5
travel 8.4
smoke 8.4
fashion 8.3
silhouette 8.3
weapon 8.3
industrial 8.2
sidewalk 8.1
light 8
gun 7.9
destruction 7.8
disaster 7.8
horror 7.8
portrait 7.8
sitting 7.7
crowd 7.7
uniform 7.6
casual 7.6
dangerous 7.6
hand 7.6
walking 7.6
dark 7.5
water 7.3
brass 7.3
metal 7.2
road 7.2
looking 7.2
device 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 94.6
person 93.4
old 75.4
vintage 33.3

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Happy 95.4%
Surprised 6.8%
Fear 6%
Sad 2.3%
Calm 1.3%
Angry 0.9%
Disgusted 0.3%
Confused 0.1%

Microsoft Cognitive Services

Age 52
Gender Male

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Car 97.7%
Wheel 90.4%
Hat 70%
Shoe 60.8%

Text analysis

Amazon

Swe
GROW!
TER LEGGS
: