Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3158

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98
Male 98
Man 98
Person 98
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Person 97.1
Adult 96.5
Male 96.5
Man 96.5
Person 96.5
Person 96.5
Adult 95.4
Male 95.4
Man 95.4
Person 95.4
Architecture 94
Building 94
Coat 88.9
Outdoors 86.1
Footwear 84.6
Shoe 84.6
Person 83.5
People 82
Person 70.9
Person 70.7
Countryside 70.6
Hut 70.6
Nature 70.6
Rural 70.6
Shoe 70
Car 65.1
Transportation 65.1
Vehicle 65.1
Person 63
Shoe 60.1
Hat 58.3
Face 57.9
Head 57.9
Sun Hat 57.9
Shoe 57.6
Cap 57.3
Bus Stop 57.1
Shelter 56.5
Accessories 56
Formal Wear 56
Tie 56
Guitar 55.9
Musical Instrument 55.9
Hat 55.9
Kiosk 55.6
Hat 55.3
Guitarist 55
Leisure Activities 55
Music 55
Musician 55
Performer 55
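
The Amazon tags above are label-detection output of the kind AWS Rekognition returns; the repeated Adult/Male/Man/Person entries likely correspond to separate detected instances of the same label. The sketch below shows how such labels could be requested with boto3. The local file name and AWS region are assumptions for illustration, and the MinConfidence threshold of 55 simply matches the lowest score in the listing.

```python
import boto3

# Assumed region and hypothetical local copy of the photograph; adjust to your setup.
client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_south_street_pier.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels with at least 55% confidence, mirroring the cutoff seen in the listing.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```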

Clarifai
created on 2018-05-10

people 99.9
group together 99.5
group 99.1
many 98.9
adult 97.8
military 94.3
administration 94
man 92.9
several 91.7
war 91.5
wear 91.2
outfit 90.3
soldier 89.9
woman 89.6
leader 89.1
street 85.3
home 85.1
child 82.7
furniture 82
five 80.3

Imagga
created on 2023-10-07

building 37.4
architecture 32.3
city 31.6
street 26.7
urban 18.3
tourism 17.3
travel 16.9
house 16.8
window 16
people 15.6
town 14.8
motor scooter 14.8
musical instrument 13.9
old 13.9
stone 13.5
structure 12.9
exterior 12.9
statue 12.5
wheeled vehicle 12.5
vehicle 12.5
buildings 12.3
tourist 11.7
person 11.7
sculpture 11.5
male 11.4
monument 11.2
shop 11.2
clothing 10.8
man 10.8
palace 10.7
station 10.7
world 10.5
accordion 10.1
historic 10.1
uniform 10.1
windows 9.6
roof 9.5
brick 9.4
wall 9.4
landmark 9
wind instrument 9
history 8.9
military 8.7
day 8.6
adult 8.6
door 8.6
outdoor 8.4
famous 8.4
police station 8.2
military uniform 8.1
college 8.1
road 8.1
keyboard instrument 8.1
barbershop 7.7
england 7.6
walking 7.6
conveyance 7.5
mercantile establishment 7.5
classic 7.4
university 7.4
business 7.3
home 7.2
facade 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

building 100
outdoor 99.6
person 98.3
people 69.9
group 60.4

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 37-45
Gender Female, 82.5%
Angry 63.7%
Sad 29.7%
Calm 11.2%
Surprised 6.8%
Fear 6.6%
Disgusted 0.6%
Confused 0.5%
Happy 0.3%

AWS Rekognition

Age 40-48
Gender Male, 99.3%
Calm 89.9%
Surprised 6.8%
Fear 6%
Sad 3.8%
Angry 2.7%
Confused 1.3%
Disgusted 0.5%
Happy 0.1%

AWS Rekognition

Age 28-38
Gender Male, 98.1%
Calm 95.1%
Surprised 6.6%
Fear 6.3%
Sad 2.6%
Angry 1%
Confused 0.5%
Disgusted 0.3%
Happy 0.1%
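
The four age/gender/emotion blocks above resemble AWS Rekognition face-detection output with all facial attributes requested. A minimal sketch of such a call, assuming the same hypothetical local image file and region as the earlier example:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("shahn_south_street_pier.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition for age range, gender, and emotion scores per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```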

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Building 94%
Shoe 84.6%
Car 65.1%
Hat 58.3%
Tie 56%

Text analysis

Amazon

FERRIES
BROUKLYN FERRIES
BROUKLYN

Google

QUUKLYN FERRIES
QUUKLYN
FERRIES
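
The readings above are raw OCR output, and the two services disagree on the first word of the sign ("BROUKLYN" vs. "QUUKLYN"). For the Amazon column, a minimal sketch of the corresponding Rekognition text-detection call, under the same assumptions as the earlier examples:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("shahn_south_street_pier.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Rekognition reports both whole lines and individual words, which would explain why
# the listing above shows both "BROUKLYN FERRIES" and its individual words.
for detection in response["TextDetections"]:
    print(f"{detection['Type']}: {detection['DetectedText']} ({detection['Confidence']:.1f}%)")
```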