Human Generated Data

Title

Bronx Subway Station, NYC

Date

1950

People

Artist: Larry Silver, American, born 1934

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bruce Silverstein, 2018.383

Machine Generated Data

Tags

Amazon
created on 2020-03-11

Human 96.4
Person 96.4
Water 95.6
Terminal 92.6
Waterfront 91.5
Port 89
Pier 89
Dock 89
Train Station 87.3
Vehicle 87.3
Train 87.3
Transportation 87.3
Building 84.4
Boardwalk 84.4
Path 75.9
Bridge 72.4
Porch 72.3
Person 70.4
Person 67.7
Railway 60.2
Train Track 60.2
Rail 60.2
Patio 58.5
Corridor 56.3
Walkway 55.7

Clarifai
created on 2020-03-11

Imagga
created on 2020-03-11

pier 100
support 100
device 93.5
travel 25.4
city 24.1
architecture 23.7
sky 23.6
water 22.7
ocean 22.4
bridge 21.9
sea 20.4
urban 19.2
coast 18
building 17.5
transportation 16.2
tourism 15.7
beach 15.2
river 15.1
vacation 13.9
road 13.6
clouds 12.7
wooden 12.3
outdoor 12.2
light 12
street 12
landscape 11.9
transport 11.9
station 11.7
wood 11.7
silhouette 11.6
steel 11.5
walk 11.4
bay 11.3
sun 11.3
outdoors 11.2
old 11.2
structure 11
glass 10.9
rail 10.8
sunset 10.8
dock 10.7
people 10.6
sidewalk 10.6
metal 10.5
traffic 10.5
scene 10.4
outside 10.3
shore 10.2
holiday 10
train 9.8
summer 9.7
way 9.5
journey 9.4
track 9.4
morning 9.1
reflection 8.9
business 8.5
destination 8.4
deck 8.3
island 8.3
tourist 8.2
tower 8.1
gate 8
interior 8
scenic 7.9
railway 7.9
empty 7.7
modern 7.7
construction 7.7
winter 7.7
stone 7.6
cityscape 7.6
trip 7.6
perspective 7.5
boat 7.5
house 7.5
dark 7.5
sunrise 7.5
barrier 7.5
evening 7.5
window 7.3
to 7.1

Google
created on 2020-03-11

Microsoft
created on 2020-03-11

outdoor 98.5
sky 90
black and white 89.5
shadow 73.8
monochrome 54.5
silhouette 54.1
several 11.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-40
Gender Male, 53.5%
Sad 45.1%
Happy 45%
Fear 45%
Surprised 45%
Calm 54.8%
Confused 45%
Angry 45%
Disgusted 45%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.4%
Bridge 72.4%

Text analysis

Google

AV
AV