Human Generated Data

Title

Untitled (woman in fur trimmed coat)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891 - 1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1177

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.2
Human 98.2
Face 80.1
Drawing 73.4
Art 73.4
Window 71.8
Girl 66.7
Female 66.7
Porthole 55.2

Clarifai
created on 2023-10-26

art 98.5
retro 98.3
portrait 96.3
no person 95.8
vintage 95.2
sepia 95
one 94.5
wear 92.9
old 92.6
antique 92.3
people 90.7
woman 87.8
painting 86.7
girl 84.4
Easter 83.9
religion 81.7
classic 81.4
travel 81
man 79.8
design 79.7

Imagga
created on 2022-01-22

decoration 30.1
frame 26
gold 24.7
design 20.8
shiny 19
chandelier 17.8
silver 17.7
vintage 17.5
pattern 16.4
retro 16.4
golden 16.3
luxury 16.3
metal 16.1
texture 16
art 15.8
celebration 15.1
antique 15.1
shape 14.8
gift 14.6
lighting fixture 14.3
elegance 14.3
jewelry 14.2
decorative 14.2
card 13.9
old 13.2
wallpaper 13
fashion 12.8
floral 12.8
gem 12.6
graphic 12.4
paper 12
wedding 12
border 11.8
holiday 11.5
round 11.2
ornament 11.2
empty 11.2
style 11.1
fixture 11.1
grunge 11.1
object 11
ornate 11
element 10.7
ring 10.7
banner 10.1
symbol 10.1
pendant 10.1
backdrop 9.9
light bulb 9.8
decor 9.7
diamond 9.7
blank 9.6
necklace 9.5
sconce 9.5
circle 9.5
season 9.4
lamp 8.8
rings 8.8
jewel 8.7
love 8.7
space 8.5
stones 8.5
valentine 8.2
wealth 8.1
bracket 7.9
black 7.9
electric lamp 7.9
shield 7.7
coin 7.6
stone 7.6
classic 7.4
greeting 7.4
protective covering 7.4
shine 7.4
backgrounds 7.3
yellow 7.3
dirty 7.2
porcelain 7.2
currency 7.2
ball 7.1
leaf 7
seasonal 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

human face 98.4
wall 98.2
mirror 96.6
person 94.6
text 89.8
woman 89
indoor 87
clothing 86.5
reflection 51.3

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 99.9%
Calm 99%
Angry 0.4%
Confused 0.2%
Sad 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%

Categories

Imagga

paintings art 100%