Human Generated Data

Title

Pamela and Naomi, Boston

Date

1972, printed 1990-1991

People

Artist: Nan Goldin, American, born 1953

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.215

Copyright

© Nan Goldin

Machine Generated Data

Tags

Scores are each service's confidence in the tag, expressed in percent.

Amazon
created on 2019-04-07

Person 99.1
Human 99.1
Apparel 98.6
Clothing 98.6
Finger 77
Furniture 76
Person 74.8
Couch 66.8
Sweater 61.9
Skin 58.6
Sleeve 58.5
Electronics 56.6
Screen 56.6
Monitor 56.6
LCD Screen 56.6
Display 56.6
Person 54
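
One plausible way to reproduce tags like these is Amazon Rekognition's DetectLabels API via boto3. This is a hedged sketch, not necessarily the pipeline used for this record; the file name and confidence threshold are placeholders.

```python
import boto3

client = boto3.client("rekognition")

# Placeholder file name; the actual source image is not part of this record.
with open("goldin_pamela_and_naomi.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # the lowest score listed above is ~54
    )

# Each label carries a name and a confidence percentage, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```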

Clarifai
created on 2018-03-23

people 99.7
adult 98.2
woman 97
monochrome 96.7
portrait 96.2
one 95.6
two 95
music 91.9
wear 91.3
man 91.3
actress 88.4
musician 84
furniture 82.9
group 82.7
chair 82.1
facial expression 81.3
administration 80.6
singer 80.1
street 77.4
movie 76.8
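
The Clarifai tags date to 2018, when the Clarifai 2.x Python client was current. A minimal sketch under that assumption; the API key and file name are placeholders.

```python
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")  # placeholder credential
model = app.public_models.general_model    # Clarifai's general concept model

# Placeholder file name for the source image.
response = model.predict_by_filename("goldin_pamela_and_naomi.jpg")

# Concept values are 0-1; scaling by 100 matches the percentages above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```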

Imagga
created on 2018-03-23

people 31.8
person 30.7
drumstick 28.3
man 28.2
adult 28.1
stick 22.2
portrait 22
male 20
attractive 19.6
sitting 18
work 17.3
disk jockey 16.8
computer 16.4
office 16.3
working 15.9
lifestyle 15.9
laptop 15.7
face 14.9
smiling 14.5
happy 14.4
women 14.2
leisure 14.1
indoors 14.1
black 13.9
lady 13.8
smile 13.5
salon 13.5
broadcaster 13.5
pretty 13.3
desk 13.2
couple 13.1
sexy 12.8
device 12.7
professional 12.6
business 12.1
looking 12
indoor 11.9
model 11.7
interior 11.5
cute 11.5
together 11.4
fashion 11.3
home 11.2
casual 11
microphone 11
communicator 10.9
hand 10.6
job 10.6
one 10.4
technology 10.4
table 10.4
executive 10.4
love 10.3
youth 10.2
chair 10.2
car 10.1
suit 10
businesswoman 10
room 9.9
handsome 9.8
fun 9.7
two 9.3
worker 9
style 8.9
hair 8.7
brunette 8.7
corporate 8.6
blond 8.4
color 8.3
phone 8.3
holding 8.3
stylish 8.1
dress 8.1
music 8.1
vehicle 7.9
look 7.9
garage 7.9
happiness 7.8
notebook 7.8
men 7.7
hairdresser 7.7
modern 7.7
girlfriend 7.7
hand blower 7.7
repair 7.7
jeans 7.6
females 7.6
passion 7.5
20s 7.3
book 7.3
cheerful 7.3
playing 7.3
sensual 7.3
group 7.3
family 7.1
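
Imagga exposes its tagger as a REST endpoint, so a plain requests call suffices. A sketch with placeholder credentials and image URL; Imagga reports confidence directly in percent.

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/goldin.jpg"},  # placeholder URL
    auth=("API_KEY", "API_SECRET"),  # HTTP Basic auth with your Imagga keys
)

# Imagga returns confidences already in percent, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```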

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99
indoor 91.4
woman 91.3
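
Microsoft's tags match the shape of the Azure Computer Vision analyze endpoint. A hedged sketch; the region, key, and file name are placeholders.

```python
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # placeholder region
resp = requests.post(
    f"{endpoint}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",     # placeholder credential
        "Content-Type": "application/octet-stream",
    },
    data=open("goldin_pamela_and_naomi.jpg", "rb").read(),  # placeholder file
)

# Confidences are 0-1; scaling by 100 matches the percentages above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```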

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 99.6%
Sad 10.2%
Disgusted 4.6%
Confused 18.7%
Surprised 6.2%
Calm 53.5%
Angry 3.6%
Happy 3.3%
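
The age range, gender, and emotion scores above have the shape of Amazon Rekognition's DetectFaces response. A minimal boto3 sketch with a placeholder file name.

```python
import boto3

client = boto3.client("rekognition")
with open("goldin_pamela_and_naomi.jpg", "rb") as f:  # placeholder file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

face = response["FaceDetails"][0]
print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
for emotion in face["Emotions"]:
    print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```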

Microsoft Cognitive Services

Age 42
Gender Female
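
Microsoft's single-number age and gender estimate matches the Azure Face API detect call. A sketch under that assumption, with placeholder region, key, and file name.

```python
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # placeholder region
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",     # placeholder credential
        "Content-Type": "application/octet-stream",
    },
    data=open("goldin_pamela_and_naomi.jpg", "rb").read(),  # placeholder file
)

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].title()}")
```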

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
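
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch with the google-cloud-vision client (2.x-style imports); the file name is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("goldin_pamela_and_naomi.jpg", "rb") as f:  # placeholder file
    image = vision.Image(content=f.read())

face = client.face_detection(image=image).face_annotations[0]
for name, value in [
    ("Surprise", face.surprise_likelihood),
    ("Anger", face.anger_likelihood),
    ("Sorrow", face.sorrow_likelihood),
    ("Joy", face.joy_likelihood),
    ("Headwear", face.headwear_likelihood),
    ("Blurred", face.blurred_likelihood),
]:
    # Enum names like VERY_UNLIKELY map to the buckets listed above.
    print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())
```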

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft
created on 2018-03-23

a woman sitting on a table 82.8%
a woman sitting at a table 82.7%
a woman sitting on a bed 68.5%
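
The three ranked captions match the shape of the Azure Computer Vision describe endpoint, which returns multiple caption candidates with confidences. A sketch with the same placeholder region, key, and file name as above.

```python
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # placeholder region
resp = requests.post(
    f"{endpoint}/vision/v2.0/describe",
    params={"maxCandidates": 3},  # ask for three caption candidates
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",     # placeholder credential
        "Content-Type": "application/octet-stream",
    },
    data=open("goldin_pamela_and_naomi.jpg", "rb").read(),  # placeholder file
)

# Confidences are 0-1; scaling by 100 matches the percentages above.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")
```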