Human Generated Data

Title

Untitled (suburban scene with palm)

Date

c. 1969

People

Artist: Edward Grazda, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Robert M. Sedgwick II Fund, 2.2002.790

Copyright

© Edward Grazda

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.3
Person 99.3
Plant 99
Tree 99
Arecaceae 97.6
Palm Tree 97.6
Silhouette 83.7
Outdoors 60.5
Apparel 58.7
Overcoat 58.7
Clothing 58.7
Coat 58.7
Fir 56.2
Abies 56.2
Weather 55.3
Nature 55.3

Imagga
created on 2021-12-14

palm 62.8
tree 58.9
beach 58
coconut 53.9
tropical 47.7
sky 46.2
ocean 45.2
sand 44.1
sea 43
island 37.6
travel 35.2
paradise 34.9
vacation 34.4
water 33.4
sun 31.3
summer 30.9
trees 30.3
coast 29.7
relax 29.5
holiday 28
sunset 26.1
landscape 26.1
tourism 25.6
cloud 23.3
resort 22.7
shore 21.1
seascape 19.1
horizon 18.9
idyllic 18.9
destination 18.7
tropic 18.4
outdoor 18.4
plant 17.6
sunny 17.2
clouds 16.9
relaxation 16.8
coastline 16
silhouette 15.7
woody plant 15.3
warm 14.7
exotic 14.6
hot 14.2
vascular plant 14
peaceful 13.8
bay 13.2
peace 12.8
tropics 12.7
turquoise 12.5
dusk 12.4
scenic 12.3
scene 12.1
wave 12.1
natural 12.1
calm 11.9
tranquil 11.8
scenery 11.7
leaf 11.7
recreation 11.7
cabbage tree 11.4
outdoors 10.7
clear 10.5
sunrise 10.3
seaside 10.3
relaxing 10
thatch 9.8
shoreline 9.8
pacific 9.7
park 9.5
people 8.9
roof 8.8
nobody 8.6
climate 8.5
trip 8.5
serene 8.5
heat 8.3
romantic 8
palm tree 7.9
tourist 7.9
palms 7.9
day 7.9
lake 7.3
sunlight 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

sky 98.7
text 95.5
outdoor 92.5
palm tree 89.2
black and white 83.1
tree 70.6
street 50.5

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Female, 93.4%
Calm 46.6%
Fear 16.7%
Sad 14.5%
Happy 10.1%
Angry 6.7%
Surprised 3.2%
Confused 1.2%
Disgusted 1%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a man standing in front of a palm tree 94.6%
a man standing next to a palm tree 93.8%
a man standing in front of a tree 90.6%