Human Generated Data

Title

Untitled (Oakland)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5204

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Clothing 99.8
Apparel 99.8
Bonnet 98.2
Human 96.3
Person 96.3
Person 88.8
Person 77.4
Plant 76
Tree 76
Text 73.2
Art 66.7
Hat 66.2
Person 58.3
Person 49.8

Clarifai
created on 2019-11-15

people 99.2
adult 97.1
portrait 95.7
woman 95.5
man 93.8
girl 93.6
street 93.4
two 93.1
family 91.6
one 91.5
group 90.9
city 88.3
child 87.5
couple 85.4
house 84.8
wear 84.2
building 83.5
museum 81.9
love 81
wedding 79.6

Imagga
created on 2019-11-15

world 25.2
statue 21.7
sculpture 20.7
portrait 18.8
people 16.7
art 16.5
old 16
culture 15.4
teddy 13.5
face 13.5
black 13.2
white goods 13.1
ancient 13
washer 12.7
marble 12.6
stone 12.3
outdoor 12.2
head 11.8
dress 11.7
plaything 11.7
person 11.7
history 11.6
billboard 11.6
historic 11
traditional 10.8
religion 10.8
structure 10.7
cemetery 10.7
happy 10.7
antique 10.4
signboard 10.4
architecture 10.2
smiling 10.1
man 10.1
home appliance 9.9
building 9.8
detail 9.7
god 9.6
travel 9.2
decoration 9.1
park 9.1
mask 8.6
male 8.6
smile 8.5
monument 8.4
garden 8.4
famous 8.4
vintage 8.3
outdoors 8.2
holiday 7.9
day 7.8
happiness 7.8
theater 7.8
bride 7.7
tourism 7.4
closeup 7.4
girls 7.3
color 7.2
celebration 7.2
adult 7.1
child 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 93.8
clothing 92.5
person 88.2
black and white 84.7
woman 82.4
wedding dress 66.1
dress 62.9
tree 59.6

Face analysis

AWS Rekognition

Age 13-25
Gender Female, 87.3%
Calm 73.3%
Surprised 1.5%
Sad 6.3%
Happy 15.4%
Fear 0.8%
Disgusted 0.2%
Angry 1.1%
Confused 1.5%

Microsoft Cognitive Services

Age 42
Gender Female

Feature analysis

Amazon

Person 96.3%
Hat 66.2%
