Human Generated Data

Title

Untitled (Disneyland)

Date

1985

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5285

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Figurine 94.9
Human 94.8
Person 94.8
Clothing 94.7
Helmet 94.7
Apparel 94.7
Person 88.8
Person 88.5
Person 80.5
Poster 76.9
Advertisement 76.9
Person 75.5
Person 70.9
Art 63.1
Helmet 61.2
Furniture 60.4
Amusement Park 56.3
Theme Park 56.3
Person 47.4
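In practice, label lists like the Amazon tags above are filtered by a confidence threshold before use. A minimal sketch in Python (the scores are copied from the list above; the function name and threshold are illustrative, and the repeated Person detections are collapsed to their highest score):

```python
# Amazon Rekognition labels for this image (label -> confidence, 0-100),
# copied from the tag list above; duplicate labels keep the highest score.
amazon_tags = {
    "Figurine": 94.9, "Human": 94.8, "Person": 94.8, "Clothing": 94.7,
    "Helmet": 94.7, "Apparel": 94.7, "Poster": 76.9, "Advertisement": 76.9,
    "Art": 63.1, "Furniture": 60.4, "Amusement Park": 56.3, "Theme Park": 56.3,
}

def confident_tags(tags, threshold=90.0):
    """Return labels at or above the threshold, highest score first."""
    return [label for label, score in
            sorted(tags.items(), key=lambda kv: kv[1], reverse=True)
            if score >= threshold]

print(confident_tags(amazon_tags))
# ['Figurine', 'Human', 'Person', 'Clothing', 'Helmet', 'Apparel']
```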

Clarifai
created on 2019-11-15

people 99.7
group 98.9
man 96.1
adult 96.1
many 93.8
portrait 93.8
woman 92.5
monochrome 91
group together 88.3
art 85.2
child 85.2
wear 84.5
one 82.9
music 82.9
illustration 82.4
outfit 81.3
leader 81.1
skull 79
fear 77.3
musician 77.2

Imagga
created on 2019-11-15

toyshop 50.7
case 44.3
shop 43.6
mercantile establishment 31.6
place of business 21.1
religion 17.9
religious 15.9
culture 14.5
art 14
plaything 13.7
sculpture 13.6
travel 12.7
worship 12.6
statue 12.5
toy 12.2
ancient 12.1
black 11.4
temple 11.4
decoration 11
holiday 10.7
gold 10.7
establishment 10.5
man 9.5
head 9.2
tourism 9.1
window 9.1
old 9.1
retro 9
celebration 8.8
china 8.4
east 8.4
mask 8
colorful 7.9
face 7.8
gift 7.7
spiritual 7.7
meditation 7.7
bookend 7.6
tradition 7.4
present 7.3
people 7.2
support 7.2
clothing 7.1
life 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

posing 95.5
text 94.6
clothing 91.9
person 88.3
indoor 86.9
cartoon 80
human face 75.5
footwear 70
group 66.5
family 15.1

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 52.7%
Sad 45.4%
Surprised 45%
Confused 45%
Angry 45.8%
Calm 45.4%
Fear 53.3%
Happy 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 50.3%
Sad 45.8%
Happy 45.2%
Fear 49%
Calm 45.6%
Angry 49%
Disgusted 45%
Confused 45.1%
Surprised 45.3%

AWS Rekognition

Age 31-47
Gender Female, 82.3%
Disgusted 0.1%
Surprised 0.4%
Happy 79.3%
Sad 1.6%
Angry 14.8%
Fear 2.7%
Calm 0.9%
Confused 0.1%

AWS Rekognition

Age 24-38
Gender Male, 54.6%
Happy 45%
Confused 45%
Calm 45.1%
Angry 54.7%
Fear 45%
Disgusted 45%
Sad 45.1%
Surprised 45%

AWS Rekognition

Age 16-28
Gender Male, 54.2%
Confused 45%
Happy 45%
Disgusted 45%
Calm 45%
Angry 54.9%
Sad 45.1%
Surprised 45%
Fear 45%

AWS Rekognition

Age 36-54
Gender Male, 52.5%
Disgusted 45%
Sad 48.4%
Fear 51.4%
Angry 45.1%
Confused 45%
Happy 45%
Calm 45%
Surprised 45.1%

AWS Rekognition

Age 28-44
Gender Male, 61.9%
Angry 92.9%
Happy 0.2%
Disgusted 0.2%
Calm 1.6%
Fear 3%
Surprised 0.2%
Sad 1.6%
Confused 0.4%

AWS Rekognition

Age 22-34
Gender Male, 54.4%
Angry 52%
Disgusted 45%
Happy 45%
Calm 45.7%
Surprised 45.1%
Fear 47%
Confused 45%
Sad 45.2%
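Each AWS Rekognition face record above reports one confidence score per emotion, and the emotion shown for a face is simply the highest-scoring one. A minimal sketch using the first face record above (the dictionary layout is illustrative, not the Rekognition response schema):

```python
# Emotion scores (percent) for the first AWS Rekognition face above.
face_1 = {
    "Sad": 45.4, "Surprised": 45.0, "Confused": 45.0, "Angry": 45.8,
    "Calm": 45.4, "Fear": 53.3, "Happy": 45.0, "Disgusted": 45.0,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_1))  # ('Fear', 53.3)
```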

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.8%
Helmet 94.7%
Poster 76.9%