Human Generated Data

Title

Untitled (Beverly Hills)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5174

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.8
Human 99.8
Apparel 99.7
Clothing 99.7
Person 98.7
Person 98.4
Person 96.4
Coat 91.7
Accessory 81
Bag 81
Accessories 81
Pants 78.2
Jacket 74.5
Pedestrian 70.6
Purse 70.6
Shoe 59.8
Footwear 59.8
Handbag 58.3
Female 58.2
Sleeve 58
Overcoat 57.6
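
The Amazon tags above are the kind of name/confidence pairs returned by AWS Rekognition label detection. A minimal sketch, assuming boto3 with configured AWS credentials; the file name and confidence threshold are placeholders, not values taken from this record:

```python
# Hedged sketch: reproducing label tags like the Amazon list above with boto3's
# Rekognition client. The file path and MinConfidence value are assumptions.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("untitled_beverly_hills_1979.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the lowest tag shown above is roughly 57.6
)

# Each label carries a Name and a Confidence score, matching the "Tag  Score" pairs above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```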

Clarifai
created on 2019-11-15

people 99.2
monochrome 96.9
street 96.4
adult 96.4
man 95.8
woman 95.8
group 94.4
portrait 93.5
music 91.1
wear 89.7
two 87.1
actor 85.5
group together 85.1
one 84.8
fashion 83.7
girl 83.2
child 82.2
city 79.8
business 79.1
musician 78.7
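
The Clarifai tags above resemble the concept/value pairs returned by Clarifai's general prediction model. A heavily hedged sketch over plain HTTP; the endpoint path, model id, response layout, API key, and image URL are all assumptions or placeholders:

```python
# Hedged sketch of a Clarifai general-model prediction over its v2 REST API.
# Endpoint, model id, and response layout are assumptions; key and URL are placeholders.
import requests

API_KEY = "your_clarifai_key"  # placeholder
MODEL_ID = "general-image-recognition"  # assumed id of the general tagging model
IMAGE_URL = "https://example.org/untitled_beverly_hills.jpg"  # hypothetical hosted copy

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Assumed layout: outputs[0].data.concepts is a list of {name, value} pairs (value in 0-1).
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```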

Imagga
created on 2019-11-15

people 22.9
urban 22.7
person 22.7
city 20.8
adult 20.4
man 20.2
black 19.7
fashion 18.8
portrait 18.8
business 17
world 16
model 15.6
women 15
sword 14.6
style 14.1
weapon 13.7
male 13.5
attractive 13.3
clothing 13.3
corporate 12.9
street 12.9
pretty 11.9
pose 11.8
lifestyle 11.6
building 11.4
group 11.3
human 11.2
one 11.2
hair 11.1
dress 10.8
posing 10.7
office 10.6
sexy 10.4
men 10.3
motion 10.3
stick 10.1
window 10
lady 9.7
looking 9.6
walk 9.5
legs 9.4
happy 9.4
cleaner 9
sport 8.9
move 8.6
architecture 8.6
garment 8.6
life 8.6
travel 8.4
silhouette 8.3
sensuality 8.2
stylish 8.1
job 8
businessman 7.9
crutch 7.9
smile 7.8
youth 7.7
casual 7.6
walking 7.6
professional 7.5
suit 7.4
executive 7.4
body 7.2
cute 7.2
team 7.2
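
The Imagga tags above correspond to the tag/confidence pairs returned by Imagga's tagging endpoint. A hedged sketch over plain HTTP; the endpoint, response shape, credentials, and image URL are assumptions or placeholders:

```python
# Hedged sketch of the Imagga v2 tagging endpoint. Credentials and image URL are placeholders.
import requests

API_KEY = "your_imagga_key"        # placeholder
API_SECRET = "your_imagga_secret"  # placeholder
IMAGE_URL = "https://example.org/untitled_beverly_hills.jpg"  # hypothetical hosted copy

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with the key pair
)
resp.raise_for_status()

# Assumed response layout: result.tags is a list of {confidence, tag: {en: ...}} entries.
for item in resp.json().get("result", {}).get("tags", []):
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```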

Google
created on 2019-11-15

Photograph 97
White 96.9
Black 96.2
Black-and-white 95.9
Monochrome 91.1
Snapshot 90.2
Monochrome photography 86.3
Photography 82
Fun 76.8
Street 67.5
Stock photography 65.4
Smile 64.1
Style 59.5
Window 59.1
Art 58.1
Gesture 51.7
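
The Google tags above are the label/score pairs produced by Google Cloud Vision label detection. A minimal sketch, assuming the google-cloud-vision client library and configured application credentials; the file path is a placeholder:

```python
# Sketch of Google Cloud Vision label detection. File path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("untitled_beverly_hills_1979.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# label_annotations carry a description and a 0-1 score; the listing above shows percentages.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```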

Microsoft
created on 2019-11-15

text 99.5
clothing 98.1
person 97.5
street 96.1
outdoor 88.7
woman 83.4
black and white 83
monochrome 69.5
man 55.8
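
The Microsoft tags above resemble the output of the Azure Computer Vision tagging operation. A hedged sketch using the Azure SDK for Python; the endpoint, key, and image URL are placeholders:

```python
# Hedged sketch using the Azure Computer Vision SDK to obtain tag/confidence pairs.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_azure_key"                                             # placeholder
IMAGE_URL = "https://example.org/untitled_beverly_hills.jpg"       # hypothetical hosted copy

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

result = client.tag_image(IMAGE_URL)  # returns tag name + confidence pairs
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```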

Face analysis

AWS Rekognition

Age 24-38
Gender Male, 99.6%
Fear 1.3%
Calm 57.7%
Happy 0.3%
Surprised 0.7%
Angry 18.4%
Sad 16.1%
Disgusted 3.1%
Confused 2.3%

AWS Rekognition

Age 39-57
Gender Female, 98.1%
Disgusted 3%
Happy 2%
Angry 73.8%
Fear 2%
Calm 8.4%
Confused 7.5%
Surprised 1.4%
Sad 1.9%
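
The age-range, gender, and emotion estimates above are the attribute set returned by AWS Rekognition face detection. A minimal sketch, assuming boto3; the file path is a placeholder, and Attributes=["ALL"] requests the full attribute set rather than the default summary:

```python
# Sketch of the AWS Rekognition face-analysis call behind the estimates above.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("untitled_beverly_hills_1979.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

# Each detected face carries an age range, a gender estimate, and per-emotion confidences.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```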

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 33
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
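
The Google Vision rows above are likelihood buckets ("Very unlikely", "Unlikely", and so on) from face detection. A minimal sketch, assuming google-cloud-vision; the file path is a placeholder:

```python
# Sketch of Google Cloud Vision face detection and its per-face likelihood fields.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("untitled_beverly_hills_1979.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values are enum buckets such as VERY_UNLIKELY, corresponding to the rows above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```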

Feature analysis

Amazon

Person 99.8%
Jacket 74.5%
Shoe 59.8%
Handbag 58.3%

Text analysis

Amazon

or
IWALK
tntoe
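
The fragments above, including garbled reads such as "tntoe", are typical of AWS Rekognition text detection output. A minimal sketch, assuming boto3; the file path is a placeholder:

```python
# Sketch of the AWS Rekognition text-detection (OCR) call.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("untitled_beverly_hills_1979.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections are whole strings; WORD detections are the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```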

Google

WALK
WALK
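
The "WALK" readings above correspond to Google Cloud Vision OCR output. A minimal sketch, assuming google-cloud-vision; the file path is a placeholder:

```python
# Sketch of Google Cloud Vision text detection (OCR).
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("untitled_beverly_hills_1979.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; later entries are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```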