Human Generated Data

Title

Untitled (Hawaii)

Date

1977

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5118

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.1
Human 99.1
Person 98.3
Person 95.8
Apparel 95.6
Clothing 95.6
Person 92.9
Bag 86.3
Plant 86.1
Tree 86.1
Shorts 81.6
People 74.6
Outdoors 64.9
Person 64.5
Person 62.6
Ground 60.7
Female 60.6
Handbag 57.5
Accessories 57.5
Accessory 57.5
Purse 56.3

Clarifai
created on 2019-11-15

people 99.7
monochrome 98.4
group 97
group together 96.4
adult 95.9
man 94.4
street 91.6
many 91.4
woman 90.5
child 90.4
two 87.9
black and white 86
war 85.6
wear 84.5
tree 83.7
recreation 81.7
one 80.2
nature 80
home 79.5
portrait 77.8

Imagga
created on 2019-11-15

swing 85.1
mechanical device 66.1
plaything 64.8
mechanism 49.2
sunset 23.4
silhouette 23.2
man 20.2
people 17.3
outdoors 16.1
sport 15.1
black 15
park 14.8
sky 14.7
tree 14.4
trees 14.2
outdoor 13.8
male 13.5
light 13.4
lake 12.8
person 12.4
dusk 12.4
water 12
beach 11.8
relax 11.8
recreation 11.7
sun 11.4
landscape 11.2
peaceful 11
ocean 10.8
river 10.7
dark 10
summer 9.6
forest 9.6
evening 9.3
bench 9.1
portrait 9.1
adult 9.1
active 9
wheeled vehicle 8.9
player 8.9
serenity 8.7
sea 8.6
men 8.6
old 8.4
action 8.3
wood 8.3
leisure 8.3
alone 8.2
exercise 8.2
dirty 8.1
athlete 8
autumn 7.9
season 7.8
scene 7.8
solitude 7.7
winter 7.7
serene 7.5
tricycle 7.5
sunrise 7.5
freedom 7.3
lady 7.3
danger 7.3
grass 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 100
outdoor 99.8
ground 97.1
black and white 95.2
grave 86.3
plant 82.9
monochrome 79.7
cemetery 73.9
text 73.6
street 72.2
clothing 70.7
person 61.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Female, 51.3%
Fear 45.3%
Angry 45.2%
Sad 46.3%
Happy 45%
Calm 53%
Confused 45%
Surprised 45.1%
Disgusted 45.1%

AWS Rekognition

Age 33-49
Gender Male, 50.3%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Fear 49.9%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Sad 50%

Feature analysis

Amazon

Person 99.1%

Captions