Human Generated Data

Title

Untitled (Azure Tides Motel pool)

Date

1956

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8924

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Road 93.3
Arecaceae 90.2
Tree 90.2
Palm Tree 90.2
Plant 90.2
Nature 86.8
Outdoors 86.8
Landscape 86.8
Transportation 81.4
Automobile 81.4
Car 81.4
Vehicle 81.4
Intersection 78.1
Building 77.2
Water 74
Human 66.1
Person 66.1
Waterfront 64.7
Scenery 62.9
Freeway 62.2
Aerial View 59.1
Port 56.2
Pier 56.2
Dock 56.2
Bridge 55.3
Boardwalk 55.3
Person 55
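
The labels above have the shape of an AWS Rekognition DetectLabels response: a label name paired with a confidence score. Purely as an illustrative sketch (the image file name and the MaxLabels/MinConfidence thresholds are assumptions, not part of this record), comparable tags could be generated for a digitized photograph with boto3:

# Illustrative sketch: producing Rekognition-style (label, confidence) pairs.
# The file name and thresholds are hypothetical, not values from this record.
import boto3

def detect_labels(image_path: str) -> list[tuple[str, float]]:
    """Return (label, confidence) pairs from AWS Rekognition DetectLabels."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=30,        # cap the number of returned labels
            MinConfidence=55.0,  # drop low-confidence guesses
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("steinmetz_pool.jpg"):  # hypothetical file
        print(f"{name} {confidence:.1f}")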

Imagga
created on 2022-01-09

intersection 100
road 44.3
city 44.1
traffic 39
transportation 36.8
highway 35.7
urban 31.5
travel 31
street 30.4
car 26.5
transport 26.5
speed 23.8
night 23.1
sky 21.7
lane 20.6
motorway 19.7
asphalt 19.6
motion 18
lights 17.6
cars 17.6
aerial 17.5
building 17.5
architecture 17.4
way 16.3
drive 16.1
light 16
bridge 15.6
landscape 15.6
scene 15.6
downtown 15.4
freeway 14.8
speedway 14.7
business 14.6
driving 14.5
direction 14.3
cityscape 14.2
trip 14.2
town 13.9
vehicle 13.8
line 13.7
rush 12.8
automobile 12.5
buildings 12.3
outdoor 12.2
blur 12.1
modern 11.9
move 11.5
perspective 11.3
movement 11.3
fast 11.2
evening 11.2
tourism 10.7
center 10.4
journey 10.4
racetrack 10.2
long 10.1
dark 10
skyscraper 9.6
auto 9.6
high 9.5
day 9.4
clouds 9.3
horizon 9
headlight 8.9
airport 8.8
above 8.7
concrete 8.6
moving 8.6
dusk 8.6
station 8.5
course 8.3
vacation 8.2
tower 8.1
trees 8
life 7.8
facility 7.8
empty 7.7
blurred 7.7
wing 7.6
skyline 7.6
expressway 7.6
outdoors 7.5
land 7.4
river 7.1
structure 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.7
black and white 86.1

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Female, 74.5%
Calm 40.7%
Sad 33.7%
Happy 8.4%
Angry 6.6%
Confused 3.8%
Surprised 2.8%
Disgusted 2.6%
Fear 1.4%
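
The age range, gender, and per-emotion percentages above follow the structure of an AWS Rekognition DetectFaces result. As a hedged illustration only (the file name below is hypothetical), attributes in this shape could be read with boto3:

# Illustrative sketch: reading Rekognition face attributes in the form listed above.
import boto3

def face_attributes(image_path: str) -> None:
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]       # e.g. {'Low': 35, 'High': 43}
        gender = face["Gender"]      # e.g. {'Value': 'Female', 'Confidence': 74.5}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # one confidence score per emotion label
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

face_attributes("steinmetz_pool.jpg")  # hypothetical file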

Feature analysis

Amazon

Car 81.4%
Person 66.1%

Captions

Microsoft

a large building 70.1%
a large building in the background 70%
an aerial view of a city 69.9%

Text analysis

Amazon

68024
MJ17--YT37A- -