Human Generated Data

Title

Colette at home, Boston

Date

1974, printed 1990-1991

People

Artist: Nan Goldin, American, born 1953

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.224

Copyright

© Nan Goldin

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Furniture 99.9
Cabinet 95.6
Dresser 94.1
Person 83.6
Human 83.6
Drawer 80.3
Bed 73.4
Indoors 70.8
Interior Design 70.8
Art 63.3
Painting 63.3
Room 61.6
Bedroom 55.7

Clarifai
created on 2018-03-23

people 99.5
furniture 99
room 98.2
indoors 96
adult 95.5
one 95.5
home 93.7
man 93.4
family 93.3
no person 91.8
two 90.5
monochrome 88.6
woman 85.9
interior design 85.1
war 79.1
chair 78.6
desk 78.1
cabinet 76
seat 75.2
bedroom 75

Imagga
created on 2018-03-23

file 66.7
furniture 63.4
office furniture 54.7
interior 27.4
room 25.1
home 24.7
television 23.7
box 20.5
house 20
modern 18.2
computer 17.9
office 17.8
wood 17.5
business 15.8
indoor 15.5
design 15.2
indoors 14.9
cabinet 14.1
monitor 13.4
working 13.2
wall 12.8
kitchen 12.6
technology 12.6
work 12.5
container 12.1
man 12.1
luxury 12
inside 11.9
style 11.9
door 11.8
safe 11.8
floor 11.1
laptop 11.1
black 10.8
table 10.8
chest 10.7
storage 10.5
chair 10.4
aquarium 10.2
old 9.7
furnishing 9.7
empty 9.4
equipment 9.4
window 9.3
male 9.2
decoration 8.7
apartment 8.6
architecture 8.6
glass 8.5
nobody 8.5
locker 8.4
desk 8.4
vintage 8.3
security 8.3
strongbox 8.2
domestic 8.1
telecommunication system 8.1
open 8.1
wardrobe 8
light 8
person 7.8
people 7.8
blackboard 7.8
support 7.7
center 7.5
elegance 7.5
display 7.3
metal 7.2
smile 7.1
adult 7.1
information 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 55%
Happy 46.5%
Disgusted 46.5%
Calm 46.9%
Surprised 45.7%
Angry 45.5%
Sad 48.5%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Female, 52.1%
Confused 45.2%
Sad 53.2%
Surprised 45.3%
Calm 45.6%
Happy 45.1%
Disgusted 45.2%
Angry 45.4%

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 83.6%
Painting 63.3%

Captions

Microsoft

a black and white photo of a cat 43.4%
a cat sitting on a desk 43.3%
a cat sitting in a room 43.2%
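The machine-generated sections above pair each label with a confidence score on its own line (e.g. "Furniture 99.9"). A minimal sketch of parsing such lines into (label, score) pairs and filtering by a threshold; `parse_tags` is a hypothetical helper for illustration, not part of any vendor SDK:

```python
def parse_tags(text, threshold=0.0):
    """Parse 'label confidence' lines into (label, score) pairs.

    Labels may contain spaces (e.g. 'Interior Design 70.8'), so the
    score is split off from the right. Pairs below `threshold` are
    dropped.
    """
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return [(label, score) for label, score in tags if score >= threshold]

# Sample lines taken from the Amazon tag list above.
sample = """Furniture 99.9
Cabinet 95.6
Interior Design 70.8
Art 63.3"""

print(parse_tags(sample, threshold=90))
# → [('Furniture', 99.9), ('Cabinet', 95.6)]
```

Splitting from the right with `rpartition` keeps multi-word labels intact, which a naive `line.split()` would break apart.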