Human Generated Data

Title

Untitled (man and woman parked in a car near the ocean)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8966

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Transportation 99.6%
Vehicle 99.6%
Automobile 99.6%
Car 99.6%
Plant 97.6%
Tree 97.6%
Person 96.2%
Human 96.2%
Nature 88.8%
Outdoors 81.7%
Wheel 76.5%
Machine 76.5%
Arecaceae 72.3%
Palm Tree 72.3%
Vegetation 70.3%
Road 66.5%
City 66.5%
Town 66.5%
Street 66.5%
Urban 66.5%
Building 66.5%
Housing 64.7%
Face 62.5%
Photography 61.6%
Photo 61.6%
House 60.8%
Villa 60.8%
Conifer 58.4%
Countryside 56.1%
Fir 55.5%
Abies 55.5%
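Tag lists like the Amazon block above have the shape of an AWS Rekognition DetectLabels response, where each label carries a name and a confidence score. A minimal sketch of rendering such a response into the name/score lines shown here — the `response` dict is a hypothetical excerpt modeled on a few of the values above, not a live API call:

```python
# Hypothetical excerpt of a DetectLabels-style response, modeled on
# a few of the tag values above (not an actual API call).
response = {
    "Labels": [
        {"Name": "Transportation", "Confidence": 99.6},
        {"Name": "Vehicle", "Confidence": 99.6},
        {"Name": "Palm Tree", "Confidence": 72.3},
        {"Name": "Abies", "Confidence": 55.5},
    ]
}

def format_tags(resp, min_confidence=50.0):
    """Render labels as 'Name score' lines, highest confidence first."""
    labels = [l for l in resp["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f'{l["Name"]} {l["Confidence"]:.1f}' for l in labels]

for line in format_tags(response):
    print(line)
```

The `min_confidence` cutoff mirrors how tagging pipelines usually drop low-scoring labels before display.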

Imagga
created on 2022-01-09

fountain 100
structure 76.6
snow 58.3
weather 35.4
tree 33.7
landscape 29.7
sky 29.4
sun 22.8
night 21.3
light 20
winter 18.7
trees 18.7
outdoor 18.3
season 17.9
water 17.3
scene 17.3
park 15.9
forest 15.7
cold 15.5
morning 15.4
holiday 15
scenery 13.5
frost 13.4
travel 13.4
palm 13.1
evening 13.1
tropical 12.8
sunset 12.6
day 12.5
field 12.5
dark 12.5
wood 12.5
sunrise 12.2
bright 12.1
outdoors 11.9
fireworks 11.8
snowy 11.7
summer 11.6
paradise 11.3
beach 11.2
clouds 11
vacation 10.6
fog 10.6
scenic 10.5
woods 10.5
cloud 10.3
black 10.2
island 10.1
ocean 10
building 9.9
river 9.8
celebration 9.6
natural 9.4
lighting 9.3
space 9.3
yellow 9.3
ice 9.2
horizon 9
sunlight 8.9
rural 8.8
frosty 8.8
freeze 8.7
grass 8.7
festival 8.6
frozen 8.6
sand 8.5
old 8.4
color 8.3
land 8.3
tourism 8.2
reflection 8.2
countryside 8.2
new 8.1
recreation 8.1
sea 7.8
sunny 7.7
wilderness 7.5
resort 7.5
city 7.5
design 7.3
tranquil 7.2
fall 7.2
meadow 7.2
shadow 7.2
architecture 7.2

Microsoft
created on 2022-01-09

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 95.6%
Sad 79.1%
Calm 11.2%
Fear 4.9%
Confused 1.8%
Happy 0.9%
Surprised 0.8%
Angry 0.7%
Disgusted 0.6%
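The emotion breakdown above matches the shape of a Rekognition DetectFaces result, where each detected face carries an age range, a gender estimate, and a list of scored emotions; the record's "Sad 79.1%" line is simply the top entry of that list. A sketch using the scores from this record, hardcoded rather than fetched from the API:

```python
# Face attributes copied from the face-analysis block above,
# arranged in the shape Rekognition's DetectFaces uses.
face = {
    "AgeRange": {"Low": 23, "High": 31},
    "Gender": {"Value": "Male", "Confidence": 95.6},
    "Emotions": [
        {"Type": "SAD", "Confidence": 79.1},
        {"Type": "CALM", "Confidence": 11.2},
        {"Type": "FEAR", "Confidence": 4.9},
        {"Type": "CONFUSED", "Confidence": 1.8},
        {"Type": "HAPPY", "Confidence": 0.9},
        {"Type": "SURPRISED", "Confidence": 0.8},
        {"Type": "ANGRY", "Confidence": 0.7},
        {"Type": "DISGUSTED", "Confidence": 0.6},
    ],
}

def dominant_emotion(face):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('SAD', 79.1)
```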

Feature analysis

Amazon

Car 99.6%
Person 96.2%
Wheel 76.5%

Captions

Microsoft

a person sitting in front of a window 33%
a person standing in front of a window 32.9%
a person in front of a window 32.8%
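The three Microsoft captions differ by fractions of a percent, and a consumer of this record would usually keep only the highest-scoring one. A small sketch over the caption values above:

```python
# Caption candidates and confidences from the Microsoft block above.
captions = [
    ("a person sitting in front of a window", 33.0),
    ("a person standing in front of a window", 32.9),
    ("a person in front of a window", 32.8),
]

# Keep the single highest-confidence caption.
best_text, best_conf = max(captions, key=lambda c: c[1])
print(f"{best_text} ({best_conf}%)")
```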

Text analysis

Amazon

MJ17--Y3742408

Google

MJI
2
MJIヨ--YT3 2 o8
--YT3
o8