Human Generated Data

Title

Untitled (woman in long black flowered dress standing in front of credenza and painting)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12911

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.7
Person 99.7
Apparel 95.7
Clothing 95.7
Indoors 92.2
Room 92.2
Furniture 87.4
Leisure Activities 86.1
Fashion 83.1
Robe 83.1
Evening Dress 83.1
Gown 83.1
Cabinet 79.7
Dressing Room 78.5
Wood 73.5
Interior Design 72
Musical Instrument 59.4
Piano 59.4
Sideboard 59
Female 58.8
Living Room 58.7
Dresser 57.8
Dress 56.1
Costume 56
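The Amazon tags above are (label, confidence) pairs from an automated tagging service, with confidence on a 0–100 scale. A minimal sketch of how such output might be filtered by a confidence threshold, using a few of the listed values (the `confident_tags` helper and the 80.0 cutoff are illustrative assumptions, not part of the record):

```python
# A few of the Amazon-generated (label, confidence) pairs listed above.
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Apparel", 95.7),
    ("Indoors", 92.2), ("Furniture", 87.4), ("Piano", 59.4),
    ("Female", 58.8), ("Costume", 56.0),
]

def confident_tags(tags, threshold=80.0):
    """Keep only labels whose confidence meets the threshold (hypothetical helper)."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))
# ['Human', 'Person', 'Apparel', 'Indoors', 'Furniture']
```

Raising or lowering the threshold trades precision for recall: at 80.0 the lower-confidence guesses ("Piano", "Costume") drop out.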

Clarifai
created on 2019-11-16

people 100
one 99
adult 98.8
furniture 96.8
wear 96.6
two 96.1
print 95.5
group 94
room 93.5
portrait 92
woman 91.1
man 90.8
administration 90.5
art 87.4
indoors 85.4
music 85
outfit 82
three 81.4
actress 79.1
vehicle 78.5

Imagga
created on 2019-11-16

device 26.9
interior 26.5
electric chair 24.6
instrument of execution 20.2
room 20
house 19.2
wall 17.9
style 17.8
home 17.5
person 17.5
chair 17.3
instrument 17
vintage 16.5
fashion 15.8
antique 15.6
furniture 15.1
light 14.7
old 14.6
protection 14.5
danger 14.5
sexy 14.4
wood 14.2
retro 13.1
adult 13
inside 12.9
stylish 12.6
black 12.6
elegance 12.6
man 12.1
luxury 12
elegant 12
design 11.8
people 11.7
decoration 11.6
indoors 11.4
lamp 11.2
gun 11.2
window 11.1
portrait 11
dirty 10.8
soldier 10.7
posing 10.6
apartment 10.5
attractive 10.5
smoke 10.2
male 9.9
military 9.6
body 9.6
washboard 9.5
model 9.3
floor 9.3
dark 9.2
weapon 9.2
industrial 9.1
dress 9
camouflage 8.8
musical instrument 8.8
clothing 8.7
urban 8.7
hair 8.7
architecture 8.6
fire 8.4
modern 8.4
building 8.1
decor 8
gas 7.7
war 7.7
hotel 7.6
mirror 7.6
uniform 7.5
frame 7.5
city 7.5
holding 7.4
safety 7.4
rifle 7.3
lady 7.3
art 7.3
suit 7.2
looking 7.2
television 7.1
table 7.1
face 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 97.9
indoor 97
person 86.6
clothing 78.1
black and white 77.2
text 75.7

Color Analysis

Face analysis


AWS Rekognition

Age 23-35
Gender Female, 99.2%
Sad 0.2%
Angry 0.2%
Calm 91.6%
Happy 0.3%
Surprised 0.2%
Confused 7.3%
Disgusted 0.1%
Fear 0%
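The emotion scores above sum to roughly 100%, and the service's reported emotion is simply the highest-scoring entry. A minimal sketch of picking that dominant emotion from the scores as listed (the `emotions` dict just transcribes the values above):

```python
# Emotion confidences from the AWS Rekognition face analysis above (percent).
emotions = {
    "Sad": 0.2, "Angry": 0.2, "Calm": 91.6, "Happy": 0.3,
    "Surprised": 0.2, "Confused": 7.3, "Disgusted": 0.1, "Fear": 0.0,
}

# The highest-scoring emotion is the dominant one.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```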

Microsoft Cognitive Services

Age 47
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Categories