Human Generated Data

Title

Untitled ("Marine World Africa USA")

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5177

Copyright

© Bill Dane


Machine Generated Data

Tags (label, confidence score)

Amazon
created on 2019-11-15

Furniture 94.4
Human 94.4
Person 94.4
Person 94
Sitting 93.9
Meal 86.5
Food 86.5
Bird 86
Animal 86
Bench 83.5
Person 81.5
Outdoors 78.4
Plant 75.6
Tree 75.6
Table 74.5
Apparel 74.1
Undershirt 74.1
Clothing 74.1
Wood 66.5
Nature 64.4
Leisure Activities 60.1
Electronics 56.9
Screen 56.9
Ground 55.6
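
Each line in these machine-generated lists pairs a label with a confidence score. A minimal sketch of how such lines might be parsed for filtering, using a few values copied from the Amazon list above (the threshold of 90 is an arbitrary choice for illustration):

```python
# Sketch: parse "<label> <confidence>" lines (e.g. "Furniture 94.4") into
# (label, confidence) pairs. Sample lines are copied from the Amazon tag
# list above; the 90-point threshold is an assumption for the example.
raw = """\
Furniture 94.4
Human 94.4
Person 94.4
Bird 86
Bench 83.5
Ground 55.6"""

def parse_tags(text):
    """Split each line at its last space: the label may itself contain
    spaces (e.g. "Leisure Activities 60.1"); the trailing token is the
    numeric confidence score."""
    tags = []
    for line in text.splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

tags = parse_tags(raw)
# Keep only labels at or above the chosen confidence threshold.
confident = [label for label, score in tags if score >= 90]
```

Splitting from the right keeps multi-word labels intact, which a naive `line.split()` would break.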

Clarifai
created on 2019-11-15

people 99.8
monochrome 99.3
adult 97.8
man 96.4
street 95.6
black and white 95.3
one 95.2
two 94.3
woman 93
group 90.7
vehicle 90.3
group together 90.3
furniture 87.8
portrait 87.7
recreation 85.2
tree 85.2
child 85.1
shadow 84.8
family 83.7
war 82.7

Imagga
created on 2019-11-15

barrow 100
handcart 97.7
wheeled vehicle 74.3
vehicle 47.6
tree 26.1
conveyance 25.7
tool 25.2
landscape 24.5
plow 23.4
park 23
trees 20.4
bench 18.5
sky 16.6
outdoor 16
autumn 15.8
chair 15.8
summer 15.4
fall 15.4
rural 15
outdoors 14.9
snow 14.4
grass 14.2
wood 14.2
season 14
garden 13.9
forest 13.9
country 13.2
cold 12.9
countryside 12.8
winter 12.8
old 12.5
light 12
travel 12
seat 11.4
water 11.3
path 11.3
morning 10.8
scenery 10.8
lonely 10.6
fence 10.5
scene 10.4
relax 10.1
lake 10.1
house 10
sun 9.7
scenic 9.7
fog 9.6
day 9.4
holiday 9.3
yellow 9.3
building 9.2
field 9.2
peaceful 9.1
sunny 8.6
park bench 8.6
people 8.4
leisure 8.3
environment 8.2
road 8.1
sunset 8.1
meadow 8.1
natural 8
river 8
home 8
colors 7.9
seasonal 7.9
furniture 7.9
architecture 7.8
color 7.8
sitting 7.7
woods 7.6
walk 7.6
clouds 7.6
sunrise 7.5
ice 7.4
vacation 7.4
leaf 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 98.3
text 96.9
black and white 90.3
person 76.2
water 75
bench 70.3
furniture 65.7
lake 52.9

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Male, 50.4%
Fear 49.5%
Angry 49.5%
Calm 50.4%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 49.6%
Disgusted 49.5%
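
The emotion scores above are near-uniform, with "Calm" only slightly ahead, so a single dominant-emotion readout hides how uncertain the model is. A small sketch, using the values copied from this section, of picking the top emotion and checking its margin over the runner-up:

```python
# Sketch: emotion confidences copied from the AWS Rekognition face
# analysis above. Pick the highest-scoring emotion and measure its
# margin over the second-best to gauge how decisive the result is.
emotions = {
    "Fear": 49.5, "Angry": 49.5, "Calm": 50.4, "Surprised": 49.5,
    "Happy": 49.5, "Confused": 49.5, "Sad": 49.6, "Disgusted": 49.5,
}

ranked = sorted(emotions.items(), key=lambda kv: kv[1], reverse=True)
dominant, top_score = ranked[0]
margin = top_score - ranked[1][1]  # gap to the runner-up
```

Here the margin is under one percentage point, so the "Calm" label should be read as weakly supported rather than confident.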

Feature analysis

Amazon

Person 94.4%
Bird 86%
Bench 83.5%

Captions

Microsoft

a person sitting on a bench 33.5%