Human Generated Data

Title

Untitled (seated woman, half-length)

Date

c. 1940

People

Artist: Michael Disfarmer, American 1884 - 1959

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2008.286

Machine Generated Data

Tags (each label is paired with a confidence score, 0-100)

Amazon
created on 2023-10-25

Face 100
Head 100
Photography 100
Portrait 100
Art 99.9
Painting 99.9
Person 99.7
Accessories 98.3
Jewelry 98.3
Necklace 98.3
Lady 93.3
Earring 85.1
Blouse 83.4
Clothing 83.4
Coat 72.7
Photo Booth 56.7
Blazer 56.4
Jacket 56.4

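The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. The following is a minimal sketch of how similar tags could be generated with the boto3 client; the local filename, region, and thresholds are illustrative assumptions, not part of the museum record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; not part of the catalog record.
with open("disfarmer_2008_286.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on returned labels (assumption)
    MinConfidence=50,    # drop low-confidence labels (assumption)
)

# Print label name and confidence, matching the format of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
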
Clarifai
created on 2019-02-18

people 98.8
portrait 98.8
adult 98.3
one 98.1
woman 96.3
wear 95.9
museum 94.5
art 92.3
man 90.8
painting 90.2
retro 89.5
fashion 89.5
facial expression 88.9
dress 86.2
girl 84.7
exhibition 83.9
music 83.5
outfit 82.4
model 82.1
indoors 79

Imagga
created on 2019-02-18

bow tie 47.6
necktie 36.4
portrait 33
people 23.4
person 22.7
fashion 20.4
face 19.9
garment 19.2
attractive 18.9
adult 18.8
clothing 18.4
mug shot 18.3
pretty 17.5
model 17.1
man 17
black 16.8
hair 15.8
photograph 15.1
happiness 14.9
happy 14.4
smiling 13.7
representation 13.3
vintage 13.2
lady 13
smile 12.8
dress 12.6
bride 12.3
male 12.1
sexy 12
skin 11.8
love 11.8
creation 11.3
human 11.2
old 11.1
expression 11.1
wedding 11
business 10.9
holding 10.7
couple 10.5
culture 10.3
elegance 10.1
cute 10
studio 9.9
art 9.5
eyes 9.5
color 9.5
child 9.4
head 9.2
mother 9.2
sensual 9.1
one 9
style 8.9
family 8.9
office 8.8
body 8.8
look 8.8
brunette 8.7
world 8.7
women 8.7
youth 8.5
hand 8.4
makeup 8.2
posing 8
lifestyle 7.9
blond 7.8
hands 7.8
flower 7.7
hairstyle 7.6
cosmetics 7.5
closeup 7.4
looking 7.2
romantic 7.1

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

gallery 97.2
wall 95.3
room 93.3
indoor 92.9
scene 91
white 67.3
posing 35.6
painting 18.5
picture frame 7.4
art 7.4
museum 1.7
person 1.3
black and white 1.1
monochrome 1

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 100%
Happy 89.5%
Calm 6.4%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Confused 2%
Disgusted 0.7%
Angry 0.3%

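The AWS Rekognition age, gender, and emotion estimates above correspond to the FaceDetails returned by the DetectFaces operation when all facial attributes are requested. A minimal sketch with boto3, again assuming a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("disfarmer_2008_286.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {'Low': 30, 'High': 40}
    gender = face["Gender"]   # e.g. {'Value': 'Female', 'Confidence': 100.0}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:  # HAPPY, CALM, SURPRISED, ...
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
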
Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

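Unlike the other services, Google Vision reports face attributes on a five-step likelihood scale rather than numeric confidences. A minimal sketch using the google-cloud-vision Python client, again with a hypothetical local filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("disfarmer_2008_286.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Map the likelihood enum values to the labels used in the record above.
likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
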
Feature analysis

Amazon

Person 99.7%
Coat 72.7%

Categories

Imagga

paintings art 100%

Captions