Human Generated Data

Title

Untitled (S.F.)

Date

1980

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5224

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99%
Person 99%
White Board 96.3%
Person 91.5%
Text 83.3%
Poster 56.6%
Advertisement 56.6%
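
The museum's pipeline for these scores is not published, but name-plus-confidence labels of this shape are what AWS Rekognition's DetectLabels operation returns. A minimal sketch in Python, assuming boto3 with configured AWS credentials and "photo.jpg" as a placeholder for the image file:

    import boto3

    # Rekognition client; credentials come from the usual AWS config chain.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph from a local file ("photo.jpg" is a placeholder).
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Request object/scene labels above a 50% confidence floor.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50.0,
    )

    # Print labels in the same "Name confidence" shape as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')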

Clarifai
created on 2019-11-15

people 99%
adult 96.1%
one 96%
woman 90.4%
man 88%
wear 85.4%
room 84.6%
music 80.8%
business 80.4%
indoors 80.3%
child 79.4%
offense 77.3%
education 75.1%
technology 73.1%
two 72.7%
portrait 71.6%
vehicle 70.7%
group 70.4%
musician 69.5%
furniture 67.3%
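
Concept scores like these (Clarifai reports values between 0 and 1, shown here as percentages) are what the public "general" model predicts. A sketch using the 2019-era clarifai Python client; the API key and file path are placeholders:

    from clarifai.rest import ClarifaiApp

    # Placeholder API key; the client talks to Clarifai's v2 REST API.
    app = ClarifaiApp(api_key="YOUR_API_KEY")

    # The public general model produces broad concept tags like those above.
    model = app.public_models.general_model
    response = model.predict_by_filename("photo.jpg")

    # Each concept carries a 0-1 value; scale to match the list above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}%')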

Imagga
created on 2019-11-15

man 20.8%
person 20.3%
people 20.1%
adult 19.2%
office 18.8%
male 17.1%
business 17%
interior 15.9%
fashion 15.1%
indoors 14.9%
professional 14.4%
portrait 14.2%
black 13.9%
attractive 13.3%
sexy 12.8%
window 12.8%
home 12.8%
device 12.7%
corporate 12%
indoor 11.9%
work 11.8%
businessman 11.5%
sitting 11.2%
inside 11%
computer 10.7%
working 10.6%
modern 10.5%
executive 10.2%
lifestyle 10.1%
model 10.1%
lady 9.7%
style 9.6%
urban 9.6%
standing 9.6%
communication 9.2%
house 9.2%
refrigerator 9.2%
room 9.1%
pretty 9.1%
success 8.8%
job 8.8%
looking 8.8%
design 8.8%
happy 8.8%
women 8.7%
light 8.7%
clothing 8.7%
men 8.6%
suit 8.4%
elegance 8.4%
door 8.3%
one 8.2%
confident 8.2%
dress 8.1%
group 8.1%
smiling 8%
musical instrument 7.8%
chair 7.8%
white goods 7.8%
appliance 7.6%
silhouette 7.4%
holding 7.4%
technology 7.4%
cheerful 7.3%
sensual 7.3%
businesswoman 7.3%
paper 7.2%
laptop 7.2%
home appliance 7.1%
worker 7.1%
posing 7.1%
building 7%
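
Imagga exposes its tagger as a single authenticated REST call. A sketch using the requests library; the credentials and image URL are placeholders:

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credentials
    IMAGGA_SECRET = "YOUR_API_SECRET"

    # Imagga's v2 tagging endpoint accepts a publicly reachable image URL.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence.
    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}%')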

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 98.4%
text 97.7%
drawing 96%
clothing 88%
handwriting 87.1%
black and white 82.9%
sketch 71.4%
poster 63.2%
cartoon 56.6%
man 54.8%
picture frame 7.4%
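
Tags in this shape are what Microsoft's Computer Vision "analyze" operation returns. A sketch against the v2.0 REST endpoint that was current in 2019; the endpoint URL and subscription key are placeholders:

    import requests

    ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Ask the v2.0 analyze operation for tags on a raw image upload.
    resp = requests.post(
        f"{ENDPOINT}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()

    # Confidences come back in 0-1; scale to match the list above.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')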

Face analysis

AWS Rekognition

Age 33-49
Gender Male, 92.4%
Calm 97.1%
Surprised 0.5%
Disgusted 0.3%
Happy 0.2%
Angry 0.7%
Confused 0.3%
Sad 0.8%
Fear 0%

AWS Rekognition

Age 28-44
Gender Male, 56.5%
Angry 4.2%
Sad 11.8%
Confused 1.2%
Disgusted 0.3%
Surprised 7.4%
Happy 2%
Fear 6.2%
Calm 66.9%
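
Age ranges, gender guesses, and per-emotion percentages like the two blocks above are what Rekognition's DetectFaces returns when all facial attributes are requested; one block is printed per detected face. A sketch under the same boto3 assumptions as before:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotion estimates.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotion types arrive uppercase (e.g. "CALM"); one score per emotion.
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')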

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
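
Google Vision reports face attributes as bucketed likelihoods rather than percentages, which is why this block reads "Very unlikely" instead of a score. A sketch assuming a recent google-cloud-vision client with application-default credentials:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each likelihood is an enum from VERY_UNLIKELY to VERY_LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)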

Feature analysis

Amazon

Person 99%
Poster 56.6%

Captions

Microsoft

a person standing in front of a window 72.8%
a man and a woman standing in front of a window 46.2%
a black and white photo of a person 46.1%
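
Several ranked captions with confidences are the output shape of Computer Vision's "describe" operation. A sketch against the same placeholder v2.0 endpoint and key as above:

    import requests

    ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Request up to three candidate captions for the image.
    resp = requests.post(
        f"{ENDPOINT}/vision/v2.0/describe",
        params={"maxCandidates": "3"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()

    # Each caption pairs a sentence with a 0-1 confidence.
    for caption in resp.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')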

Text analysis

Amazon

THANK
ANARCHY
GOD
FOR
APATHY ANARCHY
APATHY
he THANK GOD FOR ERUCR BeRRY
IME
AaMA
BeRRY
ERUCR
IME Foat!
he
Foat!
KLER.
AaMA Kis KLER.
Ney
Nol
NDY Nol
RK4
NDY
Kis
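
Rekognition's DetectText returns both LINE and WORD detections for the same image, which is likely why whole phrases and their individual words repeat in the list above; the odd spellings are the OCR's reading of the hand-lettered sign in the photograph. A sketch under the same boto3 assumptions:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Type is either "LINE" or "WORD"; both appear in the raw dump above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])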

Google

he THANK GOD FOR EHUCK BERRY AN ACEC IME APATHY ANARCHY Let's
FOR
AN
APATHY
he
BERRY
ACEC
THANK
GOD
EHUCK
IME
ANARCHY
Let's
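
Google Vision's text detection returns one annotation for the full text block followed by one per token, which matches the layout of this list. A sketch under the same google-cloud-vision assumptions as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the whole detected block; the rest are words.
    for annotation in response.text_annotations:
        print(annotation.description)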