Human Generated Data

Title

Untitled (portrait of a woman in shimmery coat with fur trim)

Date

c. 1925

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1164

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 97.5
Human 97.5
Head 96.4
Art 90
Face 83.8
Painting 82.2
Portrait 69.6
Photography 69.6
Photo 69.6
Female 57.9
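
The label/confidence pairs above match the shape of output returned by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how comparable tags could be requested with the boto3 client; the file name, MaxLabels, and MinConfidence values are illustrative placeholders, not settings known to have been used for this record.

    # Sketch: fetch image labels with Amazon Rekognition (boto3).
    # "photo.jpg" and the thresholds are illustrative placeholders.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    for label in response["Labels"]:
        # e.g. "Person 97.5"
        print(label["Name"], round(label["Confidence"], 1))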

Clarifai
created on 2023-10-26

portrait 99.9
people 99.6
art 99.4
one 98.9
painting 98.4
adult 97
museum 96.2
wear 96.2
man 95.7
sepia 93.3
model 93.2
window 93
vintage 92.8
woman 92.6
old 92.2
girl 91.5
outerwear 89.6
dress 88.8
fashion 87
retro 85.7
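
A rough sketch of how similar concept tags could be requested from Clarifai's general image recognition model with the clarifai-grpc Python client follows. The personal access token, model id, user/app ids, and image URL are assumptions and may differ by account and SDK version.

    # Sketch: request concept tags from Clarifai's general model (clarifai-grpc).
    # The PAT, model id, and image URL are illustrative placeholders.
    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_PAT"),)

    request = service_pb2.PostModelOutputsRequest(
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",
        inputs=[
            resources_pb2.Input(
                data=resources_pb2.Data(
                    image=resources_pb2.Image(url="https://example.com/photo.jpg")
                )
            )
        ],
    )

    response = stub.PostModelOutputs(request, metadata=metadata)
    for concept in response.outputs[0].data.concepts:
        # e.g. "portrait 99.9"
        print(concept.name, round(concept.value * 100, 1))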

Imagga
created on 2022-01-22

fur coat 55.6
coat 47.3
garment 30
people 27.3
world 27
person 26.7
portrait 26.5
clothing 26
attractive 23.1
man 21.5
pretty 21
adult 20.7
sexy 20.1
hair 19.8
human 19.5
lady 19.5
male 18.8
model 18.7
lifestyle 17.3
face 17
body 16.8
black 16.3
skin 15.5
fashion 14.3
happy 13.8
love 13.4
looking 12.8
one 12.7
health 12.5
dress 11.7
healthy 11.3
couple 11.3
cute 10.8
care 10.7
lovely 10.7
hand 10.6
muscle 10.6
brunette 10.5
home 10.4
eyes 10.3
women 10.3
head 10.1
sensuality 10
smile 10
family 9.8
look 9.6
blond 9.6
bride 9.6
smiling 9.4
child 9.4
vintage 9.1
make 9.1
gorgeous 9.1
old 9.1
posing 8.9
happiness 8.6
sitting 8.6
casual 8.5
studio 8.4
mother 8.3
wedding 8.3
bow tie 8.1
hands 7.8
nude 7.8
muscular 7.6
serious 7.6
erotic 7.6
relaxation 7.5
clean 7.5
leisure 7.5
girls 7.3
fitness 7.2
handsome 7.1
modern 7
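
Imagga exposes its tagger as a REST service; a minimal sketch against its documented /v2/tags endpoint with HTTP basic auth is shown below. The API key, secret, and image URL are placeholders.

    # Sketch: request tags from Imagga's /v2/tags REST endpoint.
    # The API key/secret and image URL are illustrative placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # e.g. "fur coat 55.6"
        print(item["tag"]["en"], round(item["confidence"], 1))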

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

wall 99.9
gallery 99.1
human face 99
indoor 98.9
room 98
person 96.7
clothing 96.6
scene 96.2
painting 95.3
mirror 86.8
portrait 86.5
woman 73.5
drawing 59.6
picture frame 20.8
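
The Microsoft tags resemble output from the Azure Computer Vision image-tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision Python SDK follows; the endpoint, key, and image URL are placeholders, and details may vary by API version.

    # Sketch: tag an image with Azure Computer Vision.
    # Endpoint, key, and image URL are illustrative placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.com/photo.jpg")
    for tag in result.tags:
        # Confidence is reported on a 0-1 scale, e.g. "wall 99.9"
        print(tag.name, round(tag.confidence * 100, 1))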

Color Analysis

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 100%
Calm 98.3%
Confused 0.4%
Surprised 0.3%
Sad 0.3%
Angry 0.2%
Fear 0.2%
Happy 0.2%
Disgusted 0.1%
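
The age range, gender, and emotion scores above correspond to Amazon Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch follows; the file name is an illustrative placeholder.

    # Sketch: age range, gender, and emotions via Rekognition DetectFaces.
    # "photo.jpg" is an illustrative placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
        for emotion in face["Emotions"]:
            # e.g. "CALM 98.3"
            print(emotion["Type"], round(emotion["Confidence"], 1))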

Microsoft Cognitive Services

Age 48
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
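
The likelihood labels above ("Very unlikely", etc.) match the enum values returned by Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision Python client (assuming version 2.x or later) follows; the file name is a placeholder.

    # Sketch: face-detection likelihoods with the Google Cloud Vision API.
    # "photo.jpg" is an illustrative placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihoods are enum values such as VERY_UNLIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)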

Feature analysis

Amazon

Person 97.5%

Categories