Human Generated Data

Title

Untitled (Marine World)

Date

1978-1981

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5086

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.7
Person 99.7
Person 99.5
Ball 99.5
Balloon 99.5
Person 99.3
Shoe 97.6
Clothing 97.6
Apparel 97.6
Footwear 97.6
Person 97.5
Person 93.3
Person 92.7
Shoe 87.4
Shorts 69.0
Sphere 68.8
Performer 55.9

Clarifai
created on 2019-11-15

people 99.8
man 96.8
street 96.7
adult 96.4
group 96.2
monochrome 96.0
child 95.1
woman 94.1
group together 94.1
two 89.3
boy 87.2
girl 85.2
music 83.0
one 82.6
administration 81.3
portrait 80.6
wedding 79.9
wear 79.8
dog 79.7
several 78.7

Imagga
created on 2019-11-15

ball 57.1
game equipment 37.4
soccer ball 34.8
equipment 31.1
sport 25
man 24.2
people 21.7
adult 21.4
person 20.7
play 15.5
volleyball 14.5
fashion 14.3
game 14.3
player 14.1
male 13.6
black 13.5
action 13
exercise 12.7
punching bag 12.5
athlete 12.3
human 12
fun 12
competition 11.9
leisure 11.6
lifestyle 11.6
baseball 11.3
sexy 11.2
style 11.1
model 10.9
city 10.8
hand 10.6
football 10.6
athletic 10.5
sports equipment 10.5
attractive 10.5
hair 10.3
playing 10
active 10
lady 9.7
portrait 9.7
summer 9.6
body 9.6
world 9.4
happy 9.4
training 9.2
outdoor 9.2
fitness 9
team 9
posing 8.9
urban 8.7
women 8.7
boy 8.7
soccer 8.7
pretty 8.4
dark 8.3
musical instrument 8.1
dress 8.1
activity 8.1
face 7.8
hat 7.7
men 7.7
jump 7.7
youth 7.7
light 7.3
helmet 7.3
music 7.2
recreation 7.2
worker 7.1
performer 7
modern 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 99.2
clothing 98.5
text 92.6
footwear 90.9
man 88.1
woman 82.9
people 71.5
group 69.1
black and white 60.3
several 12.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Female, 51.7%
Disgusted 45%
Happy 45%
Angry 45.1%
Calm 54.1%
Surprised 45%
Confused 45%
Sad 45.7%
Fear 45%

Feature analysis

Amazon

Person 99.7%
Shoe 97.6%
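The machine-generated sections above are flat "label score" lines, with scores given as confidence percentages. As a minimal sketch (an assumption about how one might consume this dump, not the museums' own pipeline), the AWS Rekognition emotion scores from the face analysis can be parsed into a dictionary and reduced to the top-confidence emotion:

```python
# Hypothetical parsing sketch: read the flat "Emotion score%" lines
# from the AWS Rekognition face analysis above and report the
# highest-confidence emotion.
raw = """Disgusted 45%
Happy 45%
Angry 45.1%
Calm 54.1%
Surprised 45%
Confused 45%
Sad 45.7%
Fear 45%"""

scores = {}
for line in raw.splitlines():
    # Split on the last space so multi-word labels would also work.
    label, value = line.rsplit(" ", 1)
    scores[label] = float(value.rstrip("%"))

# The emotion with the highest confidence score.
top = max(scores, key=scores.get)
print(top, scores[top])  # Calm 54.1
```

The same label/score split applies to the Amazon, Clarifai, Imagga, and Microsoft tag lists, with the caveat that some scores are printed without a "%" sign.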