Human Generated Data

Title

Untitled (Arcade d'Accessories)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5180

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.6
Person 99.6
Person 99.6
Person 99.5
Person 98.4
Person 98.1
Apparel 97.4
Clothing 97.4
Person 97.2
Person 95.4
Person 81.7
Musical Instrument 67.9
Shorts 61.5
Horse 58.7
Mammal 58.7
Animal 58.7
Leisure Activities 56
Musician 55.1
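
These label scores have the shape of output from Amazon Rekognition's DetectLabels API, which the face analysis below also names. As a minimal, hypothetical sketch of how such tags could be regenerated (the image path and confidence cutoff are assumptions, not part of the record):

```python
import boto3

# Hypothetical client setup; region and credentials come from the environment.
rekognition = boto3.client("rekognition")

# Load the photograph as raw bytes (the filename is assumed for illustration).
with open("bill_dane_untitled_1979.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# matching entries like "Person 99.6" and "Horse 58.7" above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed threshold; the record lists scores down to ~55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')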

Clarifai
created on 2019-11-15

people 99.8
group together 99.4
many 98.4
group 97.8
adult 96.8
man 95.6
woman 95.2
street 94.8
several 93.5
child 92.9
school 91.4
monochrome 89.7
wear 89.4
recreation 87.2
education 86.6
military 83.8
leader 83.8
four 83.2
outfit 82.8
five 81.7

Imagga
created on 2019-11-15

man 24.2
weapon 23.7
sword 23.3
people 20.6
person 20.2
male 18.5
uniform 17.2
city 15.8
clothing 14.3
sport 13.9
world 13.7
performer 13
portrait 12.9
athlete 12.9
men 12.9
adult 12.3
travel 12
street 12
dancer 11.8
military 11.6
urban 11.4
military uniform 11.2
black 10.8
soldier 10.8
walk 10.5
business 10.3
tradition 10.2
player 9.9
group 9.7
war 9.6
women 9.5
walking 9.5
ballplayer 9.3
girls 9.1
suit 9
style 8.9
boy 8.7
statue 8.7
mask 8.6
architecture 8.6
fashion 8.3
historic 8.3
building 8.2
private 8.2
industrial 8.2
scene 7.8
old 7.7
entertainer 7.6
human 7.5
child 7.5
traditional 7.5
outdoors 7.5
rifle 7.4
gun 7.3
danger 7.3
dirty 7.2
dress 7.2
to 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 97.6
outdoor 95.6
text 95
clothing 94.8
man 76.6
black and white 67
footwear 57.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Female, 50.4%
Fear 49.6%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.5%
Sad 50.4%
Surprised 49.5%

AWS Rekognition

Age 26-40
Gender Female, 50.3%
Confused 49.5%
Angry 49.6%
Surprised 49.5%
Calm 49.6%
Disgusted 49.5%
Fear 49.7%
Sad 50.1%
Happy 49.5%

AWS Rekognition

Age 14-26
Gender Female, 50%
Fear 49.6%
Happy 49.5%
Confused 49.5%
Calm 50.3%
Disgusted 49.5%
Sad 49.6%
Angry 49.5%
Surprised 49.5%

AWS Rekognition

Age 24-38
Gender Male, 50%
Disgusted 49.5%
Sad 49.7%
Happy 49.5%
Surprised 49.5%
Calm 50.2%
Fear 49.5%
Confused 49.5%
Angry 49.5%

AWS Rekognition

Age 28-44
Gender Male, 50.3%
Happy 49.5%
Disgusted 49.5%
Calm 50.4%
Sad 49.5%
Surprised 49.5%
Confused 49.5%
Fear 49.5%
Angry 49.5%
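
The five blocks above match the structure of Rekognition's DetectFaces response when all attributes are requested: one entry per detected face, each with an age range, a gender estimate, and per-emotion confidences. A hedged sketch of retrieving them, reusing the assumed client and image bytes from the earlier example:

```python
# Assumes the boto3 rekognition client and image_bytes from the sketch above.
faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions is a list of {Type, Confidence}; near-uniform scores like
    # the ~49.5% values above mean the model found no dominant emotion.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')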

Feature analysis

Amazon

Person 99.6%
Horse 58.7%

Text analysis

Amazon

ARCATE
DNESSOMREIS

Google

ARCADE DACCESSOIRES
ARCADE
DACCESSOIRES
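
The Amazon lines ("ARCATE", "DNESSOMREIS") read like raw Rekognition DetectText output on the arcade signage, OCR noise included, while Google's cleaner reading came from its own OCR service. A minimal Rekognition-only sketch, again reusing the assumed client and image bytes:

```python
# Assumes the rekognition client and image_bytes from the earlier sketches.
text = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is a LINE or a WORD; printing the lines reproduces
# entries like those listed under "Amazon" above.
for detection in text["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])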