Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4231

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 99.4
Person 99.4
Person 99.2
Person 99
Person 98.9
Person 98.8
Person 98.6
Person 97.5
Person 96.5
Person 96.2
Person 96.1
Person 96
Person 94.2
Person 94.2
Sailor Suit 91.9
Pedestrian 90.2
Person 88.4
Transportation 88.3
Vehicle 87.1
Person 86.4
Person 85.9
Person 84.6
Clothing 81.9
Apparel 81.9
Military 78.3
Person 75.6
Building 75.5
Person 75.2
Boat 70.6
Harbor 67.6
Water 67.6
Pier 67.6
Dock 67.6
Port 67.6
Waterfront 67.6
Officer 65.3
Military Uniform 65.3
Aircraft 63.7
Person 63.1
Airplane 57.9
Airport 56.9
Boat 56.8
People 56
Airfield 55.5

Clarifai
created on 2023-10-25

people 99.9
group 99.2
adult 97.5
many 97.2
woman 97.1
man 96.9
wear 95.4
group together 95.2
boy 93.3
child 92.9
art 90.5
street 89.5
winter 87
bridge 85.4
recreation 84.9
girl 82.9
several 81.4
snow 81.4
transportation system 81.1
music 81

Imagga
created on 2022-01-08

travel 19
people 19
dairy 17.7
animal 16
crowd 15.4
group 13.7
man 13.4
business 12.7
farm 12.5
city 12.5
horse 12.5
urban 12.2
world 12.2
life 12.1
tourism 11.5
sky 11.5
hall 11.3
architecture 11.1
stall 10.7
adult 10.3
silhouette 9.9
tourist 9.8
person 9.7
sun 9.7
scene 9.5
landscape 8.9
sand 8.9
work 8.6
desert 8.5
male 8.5
journey 8.5
animals 8.3
speed 8.2
transportation 8.1
cattle 8
job 8
working 8
ruler 7.9
women 7.9
station 7.8
art 7.8
sunny 7.7
men 7.7
walking 7.6
outdoors 7.5
ranch 7.4
vacation 7.4
occupation 7.3
landmark 7.2
activity 7.2
rural 7
clothing 7
country 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 95.5
man 93.8
text 93.8
clothing 89.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 99.7%
Calm 97.8%
Fear 0.5%
Happy 0.4%
Sad 0.4%
Angry 0.3%
Surprised 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 19-27
Gender Male, 99.8%
Calm 79.3%
Sad 8%
Disgusted 4.2%
Angry 2.5%
Happy 1.8%
Fear 1.4%
Surprised 1.4%
Confused 1.3%

AWS Rekognition

Age 23-33
Gender Female, 84.8%
Surprised 37.4%
Calm 17.5%
Disgusted 16.5%
Sad 12.4%
Angry 8.2%
Fear 5%
Confused 1.5%
Happy 1.5%

AWS Rekognition

Age 25-35
Gender Male, 97.5%
Calm 52.6%
Happy 22.4%
Sad 18.7%
Angry 2.3%
Surprised 1.4%
Fear 0.9%
Disgusted 0.9%
Confused 0.7%

Feature analysis

Amazon

Person 99.5%
Boat 70.6%