Human Generated Data

Title

Standing Woman

Date

20th century

People

Artist: Simkha Simkhovitch, American, 1893 - 1949

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Meta and Paul J. Sachs, 1965.462

Machine Generated Data

Tags

Each entry below pairs a tag with the service's confidence score (on a 0-100 scale).

Amazon
created on 2020-05-02

Human 98.6
Person 98.6
Clothing 97.4
Apparel 97.4
Sleeve 81.5
Female 65
Face 64.6
Art 61.6
Drawing 59.6
Overcoat 59.3
Coat 59.3
Floor 59.2
Pants 55.4
Door 55.1
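
Label/confidence pairs like the list above are what Amazon Rekognition's label-detection call returns. A minimal sketch in Python using boto3, assuming default AWS credentials and a local copy of the image (the file name is a placeholder, not the museum's actual pipeline):

import boto3

rekognition = boto3.client("rekognition")

with open("standing_woman.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest confidence shown above
)

# Each label carries a name and a confidence on a 0-100 scale.
for label in response["Labels"]:
    print(f'{label["Name"]:<12} {label["Confidence"]:.1f}')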

Clarifai
created on 2020-05-02

people 99.7
painting 99
portrait 98.7
adult 97.7
one 97.5
art 96.6
man 96
museum 95.4
two 94.7
woman 94.5
family 94.2
furniture 94.1
door 92.3
picture frame 91.9
print 91.8
wear 91.4
wall 89.7
vintage 87.6
doorway 86
retro 85.7
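
Concept/confidence pairs like these come from Clarifai's general image-recognition model. A minimal sketch against Clarifai's v2 predict REST endpoint; the API key, model ID, and image URL below are assumptions for illustration, not the museum's actual configuration:

import requests

CLARIFAI_KEY = "YOUR_API_KEY"                          # placeholder
image_url = "https://example.org/standing_woman.jpg"   # placeholder URL

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}",
             "Content-Type": "application/json"},
    json={"inputs": [{"data": {"image": {"url": image_url}}}]},
)
resp.raise_for_status()

# Concept values are on a 0-1 scale; multiply by 100 to compare with the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]:<15} {concept["value"] * 100:.1f}')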

Imagga
created on 2020-05-02

door 39.4
old 35.5
jamb 31.8
wall 27.4
architecture 25.9
upright 25.8
structural member 25.4
ancient 23.3
building 22.3
device 20.8
religion 19.7
window 19
entrance 18.3
support 18.3
vintage 18.2
antique 17.5
art 17.4
furniture 17
frame 16.8
temple 16.1
wooden 15.8
house 15
sculpture 14.6
history 14.3
carving 14.2
historic 13.7
culture 13.7
home 13.5
wardrobe 13.5
wood 13.3
interior 13.3
texture 13.2
brown 12.5
historical 12.2
aged 11.8
travel 11.3
religious 11.2
church 11.1
stone 11.1
exterior 11.1
doorway 10.8
retro 10.6
statue 10.6
painted 10.5
decoration 10.5
detail 10.4
construction 10.3
furnishing 10
weathered 9.5
grunge 9.4
ornate 9.1
design 9
pattern 8.9
metal 8.8
gate 8.7
museum 8.7
worn 8.6
black 8.4
street 8.3
gold 8.2
elevator 8.1
urban 7.9
entry 7.8
carved 7.8
glass 7.8
city 7.5
tourism 7.4
close 7.4
symbol 7.4
sill 7.4
covering 7.3
structure 7.2
open 7.2
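
Imagga's tagging endpoint reports tag confidences on a 0-100 scale, matching the list above. A minimal sketch using its v2 REST API; the API credentials and image URL are placeholders:

import requests

IMAGGA_KEY = "YOUR_API_KEY"                            # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"                      # placeholder
image_url = "https://example.org/standing_woman.jpg"   # placeholder URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]:<20} {tag["confidence"]:.1f}')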

Google
created on 2020-05-02

Microsoft
created on 2020-05-02

wall 97.4
gallery 96.8
drawing 95.9
white 94.9
person 94.9
black 94.4
clothing 93.5
sketch 90.1
human face 89
painting 87.6
clock 87.2
black and white 85.8
window 85.4
woman 73.9
text 64
art 58.8
room 56.7
posing 38.4
picture frame 35.4
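
Tags like these can be produced with the Azure Computer Vision "Tag Image" operation. A minimal sketch against the REST endpoint; the resource endpoint, subscription key, and file name are placeholders:

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "YOUR_SUBSCRIPTION_KEY"                                      # placeholder

with open("standing_woman.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": key,
             "Content-Type": "application/octet-stream"},
    data=image_bytes,
)
resp.raise_for_status()

# Confidences come back on a 0-1 scale; multiply by 100 to compare with the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]:<20} {tag["confidence"] * 100:.1f}')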

Color Analysis

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 50.7%
Happy 45.3%
Angry 45.2%
Calm 49.7%
Disgusted 45.1%
Surprised 48.2%
Confused 45.2%
Fear 45.9%
Sad 45.4%
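
The age range, gender, and emotion percentages above are what Amazon Rekognition's face-detection call reports when all attributes are requested. A minimal sketch, again assuming default AWS credentials and a placeholder local image file:

import boto3

rekognition = boto3.client("rekognition")

with open("standing_woman.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')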

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
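
The likelihood ratings above map to the per-face fields Google Cloud Vision returns. A minimal sketch, assuming the google-cloud-vision 2.x Python client and a placeholder local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("standing_woman.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each likelihood is an enum such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, or LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)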

Feature analysis

Amazon

Person 98.6%
Door 55.1%
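
The feature percentages above correspond to Rekognition labels that also carry located instances (bounding boxes), such as Person and Door. A minimal sketch of reading those instances from the same label-detection response used earlier; the file name and credentials are placeholders:

import boto3

rekognition = boto3.client("rekognition")

with open("standing_woman.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    if label["Instances"]:  # only labels with located instances, e.g. Person, Door
        box = label["Instances"][0]["BoundingBox"]
        print(f'{label["Name"]}: {label["Confidence"]:.1f}% at '
              f'left={box["Left"]:.2f}, top={box["Top"]:.2f}')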

Categories