Human Generated Data

Title

Untitled (Oakland)

Date

1980s

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5277

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Performer 99.7
Human 99.7
Person 95.3
Shoe 92.6
Apparel 92.6
Clothing 92.6
Footwear 92.6
Officer 87.9
Military 87.9
Military Uniform 87.9
Suit 83.3
Overcoat 83.3
Coat 83.3
Clown 73.3
Costume 68.3
Accessories 59.8
Tie 59.8
Accessory 59.8
Captain 58.8
Door 55.5
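
The record does not state how these Amazon tags were produced. As an illustration only, the sketch below shows how label data in this name/confidence form could be requested from AWS Rekognition (the service the record cites under Face analysis) using boto3; the file name and thresholds are assumptions, not part of the record.

```python
# Illustrative sketch: DetectLabels via boto3. The file name and thresholds
# are assumptions; they are not documented in the museum record.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_oakland.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the record lists 20 Amazon tags
    MinConfidence=50.0,  # assumed cutoff
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```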

Clarifai
created on 2019-11-15

people 99.6
one 98.5
adult 97.9
man 97.9
wear 96
portrait 91.5
leader 89.7
administration 89.1
outfit 87.2
actor 86.1
business 81.5
fashion 81.3
door 80.2
doorway 76.6
street 76.4
home 75.9
military 74.7
woman 74.3
two 73.7
monochrome 73

Imagga
created on 2019-11-15

male 32.6
man 32.4
suit 29.4
adult 29.2
person 28.5
brass 28.4
wind instrument 28.4
musical instrument 25.8
businessman 25.6
business 25.5
people 24.5
portrait 23.3
building 22.4
attractive 21.7
corporate 20.6
professional 19.7
office 17.9
pretty 17.5
happy 16.9
fashion 16.6
dress 16.3
job 15.9
phone 15.7
black 15.2
human 15
executive 14.9
men 14.6
outdoors 14.2
work 14.1
device 14.1
outside 13.7
city 13.3
clothing 12.8
call 12.6
horn 12.5
lifestyle 12.3
standing 12.2
success 12.1
violin 12
looking 12
happiness 11.7
model 11.7
handsome 11.6
bowed stringed instrument 11.5
outdoor 11.5
tie 11.4
lady 11.4
couple 11.3
sexy 11.2
manager 11.2
elegant 11.1
style 11.1
expression 11.1
street 11
stringed instrument 11
businesswoman 10.9
smile 10.7
urban 10.5
career 10.4
worker 10.3
cornet 10.3
cute 10
telephone 10
student 10
face 9.9
travel 9.9
one 9.7
body 9.6
architecture 9.5
hair 9.5
talking 9.5
confident 9.1
stylish 9
sax 8.8
cellphone 8.7
day 8.6
cell 8.6
serious 8.6
wall 8.5
fashionable 8.5
mobile 8.5
two 8.5
communication 8.4
elegance 8.4
window 8.4
old 8.4
occupation 8.2
posing 8
women 7.9
love 7.9
entrance 7.7
modern 7.7
youth 7.7
boss 7.6
formal 7.6
door 7.6
company 7.4
holding 7.4
successful 7.3
cheerful 7.3
jacket 7.3
alone 7.3
sensuality 7.3
university 7.2
hand 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

outdoor 98.8
suit 98.6
clothing 96.9
person 93
man 92.5
tie 85.9
human face 83.8
building 81.7
smile 76
black and white 75
text 72.9
coat 70
gentleman 57.5
posing 46.2
dressed 25.8

Color Analysis

Face analysis

AWS Rekognition

Age 31-47
Gender Female, 93.1%
Fear 65.8%
Surprised 0.2%
Angry 0.6%
Calm 0.1%
Sad 31.4%
Happy 1.2%
Disgusted 0.3%
Confused 0.3%
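
As with the tags above, the following is only a hedged sketch of how age, gender, and emotion scores in this form could be obtained from AWS Rekognition's DetectFaces API via boto3; the file name is an assumption.

```python
# Illustrative sketch: DetectFaces with all facial attributes requested.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_oakland.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotion scores
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```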

Microsoft Cognitive Services

Age 61
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.3%
Shoe 92.6%
Suit 83.3%

Categories

Text analysis

Amazon

POPO
COWTNNAS

Google

РОPО TEASTLS
РОPО
TEASTLS
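
The fragmentary strings above are OCR output reproduced as recorded. For the Amazon results, a minimal sketch of the kind of call that yields such text detections is shown below, again assuming boto3 and a local copy of the image.

```python
# Illustrative sketch: DetectText returns LINE and WORD detections with confidences.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_oakland.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the individual WORD entries
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```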