Human Generated Data

Title

Untitled (seated woman)

Date

1900s

People

Artist: Whitman Studio, American, active 1900s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2918

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Furniture 99.9
Person 98.3
Human 98.3
Painting 97.3
Art 97.3
Chair 57.7

Clarifai
created on 2023-10-25

portrait 99.8
people 99.7
one 99.4
art 98.5
girl 98.3
woman 98.3
adult 97.7
painting 95.2
child 94.5
wear 94.4
music 94
vintage 92.9
furniture 91.8
baby 91.1
seat 87.8
model 86.6
retro 86.5
analogue 84
son 83.8
street 82.6

Imagga
created on 2022-01-08

grand piano 100
piano 86.2
keyboard instrument 65.7
stringed instrument 64.8
percussion instrument 63.5
musical instrument 44.2
model 32.7
adult 27.2
person 27.2
sexy 26.5
fashion 26.4
attractive 25.9
portrait 25.2
people 23.4
body 20.8
sitting 20.6
hair 20.6
sensuality 20
black 18.9
pretty 18.2
lady 17.9
elegance 15.1
brunette 14.8
lifestyle 14.5
skin 14.4
erotic 14.4
human 14.2
posing 14.2
face 14.2
dark 14.2
happy 13.8
sofa 13.6
women 13.4
one 13.4
studio 12.9
smile 12.8
youth 12.8
sensual 12.7
blond 12.7
style 12.6
looking 12
love 11.8
make 11.8
dress 11.7
nude 11.6
couch 11.6
sexual 11.6
passion 11.3
lying 11.3
couple 10.5
home 10.4
elegant 10.3
man 10.1
naked 9.7
smiling 9.4
cute 9.3
casual 9.3
silhouette 9.1
gramophone 8.8
breast 8.8
sex 8.8
boy 8.7
male 8.6
happiness 8.6
outdoor 8.4
relaxation 8.4
child 8.4
room 8.3
cheerful 8.1
interior 8
barrow 7.9
machine 7.7
expression 7.7
bed 7.6
legs 7.5
fun 7.5
light 7.4
teen 7.3
water 7.3
relaxing 7.3
furniture 7.2
romance 7.1
record player 7.1
indoors 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.4
human face 97.8
painting 97.5
clothing 96.5
person 95.8
sitting 95.3
woman 91.4
window 87.7
portrait 59.4
smile 57.4
picture frame 41.5

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 100%
Calm 97.5%
Confused 1.9%
Sad 0.2%
Surprised 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0%
Happy 0%

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Painting 97.3%