Human Generated Data

Title

Untitled (young woman in long black dress standing between chair and window)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12914

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.2
Person 99.2
Clothing 97.9
Apparel 97.9
Flooring 91.9
Floor 86.9
Indoors 86.5
Room 86.5
Sitting 83.6
Female 82.4
Furniture 80.7
Woman 75.8
Door 71.7
Sleeve 69.2
Couch 66.4
Coat 65.8
Overcoat 65.8
Suit 65.8
Living Room 60.9
Long Sleeve 59
Home Decor 57.8
Dressing Room 55.6
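The tag lists above pair each label with a confidence score on a 0-100 scale, and services of this kind typically return many low-confidence labels alongside a few reliable ones. A minimal sketch of post-processing such a list with a simple threshold filter (the threshold value of 90 is an illustrative choice, not part of this record):

```python
# Label/confidence pairs taken from the Amazon list above (truncated).
labels = {
    "Human": 99.2, "Person": 99.2, "Clothing": 97.9, "Apparel": 97.9,
    "Flooring": 91.9, "Floor": 86.9, "Indoors": 86.5, "Room": 86.5,
    "Sitting": 83.6, "Female": 82.4, "Furniture": 80.7, "Woman": 75.8,
    "Door": 71.7, "Sleeve": 69.2, "Couch": 66.4, "Coat": 65.8,
}

def confident_labels(scores, threshold=90.0):
    """Return labels at or above the threshold, highest score first."""
    return sorted(
        (name for name, score in scores.items() if score >= threshold),
        key=lambda name: -scores[name],
    )

print(confident_labels(labels))  # Human, Person, Clothing, Apparel, Flooring
```

The same filter applies unchanged to the Clarifai, Imagga, and Microsoft lists below, since each is also a flat label-to-score mapping.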

Clarifai
created on 2019-11-16

people 99.8
one 99.3
adult 97.8
furniture 97.5
woman 97.2
indoors 95.7
portrait 95.5
sit 95.3
room 95
seat 94.6
wear 93
man 92.9
two 92.6
chair 91.1
easy chair 89.5
leader 85.5
monochrome 84.3
group 83.5
art 82.6
mirror 82.5

Imagga
created on 2019-11-16

chair 32
old 24.4
architecture 23.6
musical instrument 21.9
seat 21.4
building 20.1
throne 18.6
accordion 17.9
house 17.5
city 17.4
keyboard instrument 16.8
ancient 16.4
door 16.3
device 15.2
chair of state 14.9
stone 14.4
window 14.2
furniture 14.2
man 14.1
wall 13.7
history 13.4
wind instrument 12.6
urban 12.2
support 11.9
interior 11.5
street 11
historic 11
sill 10.5
electric chair 10.4
black 10.4
home 10.4
historical 10.3
classic 10.2
town 10.2
robe 10.1
instrument of execution 10.1
rocking chair 9.9
travel 9.9
room 9.7
light 9.4
vintage 9.1
fashion 9
religion 9
people 8.9
person 8.8
scene 8.6
brick 8.5
windowsill 8.4
antique 8
structural member 8
male 7.8
entrance 7.7
statue 7.7
culture 7.7
windows 7.7
apartment 7.7
dark 7.5
exterior 7.4
inside 7.4
tourist 7.2
instrument 7.2
garment 7.1
portrait 7.1
night 7.1
indoors 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 97.1
furniture 96.2
text 93
indoor 87.2
black and white 86.3
vase 84.8
chair 80.1
mirror 70.5
curtain 67.6
table 63.4
clothing 63.1
person 56

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 54.9%
Fear 45%
Confused 45.2%
Calm 53.9%
Sad 45.6%
Disgusted 45%
Happy 45.1%
Surprised 45.1%
Angry 45.1%
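The emotion scores above assign a separate confidence to each of eight emotions rather than a single prediction. A common way to reduce such a distribution to one label is to take the highest-scoring entry; a minimal sketch using the values from this record:

```python
# Emotion/confidence pairs from the AWS Rekognition face analysis above.
emotions = {
    "Fear": 45.0, "Confused": 45.2, "Calm": 53.9, "Sad": 45.6,
    "Disgusted": 45.0, "Happy": 45.1, "Surprised": 45.1, "Angry": 45.1,
}

# The dominant emotion is the highest-confidence entry.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```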

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 83.8%
a person standing in front of a mirror 83.7%
a person standing in front of a window 83.6%

Text analysis

Google

ययट
TRP
ययट TRP