Human Generated Data

Title

Untitled (beach and grassy area at Tutor Hotel)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10764

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Furniture 100
Human 98.9
Person 98.9
Person 98.5
Person 97.7
Person 97.5
Person 95.9
Cradle 93
Person 78.7
Person 75.4
Person 74
Tree 65.3
Plant 65.3
Hammock 59.1
Person 48.5

Imagga
created on 2022-01-15

fountain 45.7
structure 38.7
beach 38.6
tree 34.3
tropical 31.5
sky 31.2
palm 30.9
water 28.7
island 27.5
ocean 26.9
landscape 24.5
sea 24.2
sun 24.1
summer 22.5
sand 22.2
travel 21.8
vacation 21.3
cloud 20.7
holiday 20.1
paradise 18.8
scenery 16.2
tourism 15.7
resort 15.5
coast 15.3
coconut 14.6
relax 14.3
trees 14.2
scenic 13.2
outdoor 13
scene 13
tranquil 12.7
horizon 12.6
architecture 12.3
idyllic 12.2
shore 12.2
destination 12.2
negative 11.9
seascape 11.5
building 11.2
relaxation 10.9
hot 10.9
dark 10.9
sunset 10.8
light 10.7
night 10.7
tropic 10.6
clear 10.5
wave 10.4
film 10.3
bay 10.2
parachute 10.2
clouds 10.1
city 10
park 9.9
bridge 9.8
coastline 9.4
newspaper 9.3
exotic 9.1
color 8.9
river 8.9
turquoise 8.6
sunny 8.6
old 8.4
recreation 8.1
natural 8
sunlight 8
forest 7.8
equipment 7.8
parasol 7.8
silhouette 7.4
product 7.4
rescue equipment 7.4
peace 7.3
photographic paper 7.2
black 7.2
colorful 7.2
bright 7.1
creation 7
leaf 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.2
water 75.5
black and white 75.2
sky 69.3
beach 57.1
plant 39.1

Face analysis

Amazon

AWS Rekognition

Age 14-22
Gender Female, 51.7%
Calm 70.8%
Sad 7.9%
Surprised 4.9%
Confused 4.6%
Disgusted 4.5%
Angry 4.4%
Fear 1.6%
Happy 1.4%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a tree in front of a palm tree 66.6%
a person standing in front of a palm tree 44%
a person sitting in front of a palm tree 33.5%

Text analysis

Google

48320-C.
YT37A°
--XAGOX
48320-C. - YT37A° --XAGOX
-