Human Generated Data

Title

Untitled (young woman in V-necked dress with hand on hip seated by vase of flowers)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12853

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Plant 94
Flower 86.4
Blossom 86.4
Clothing 85.4
Apparel 85.4
Interior Design 85
Indoors 85
Skin 80.6
Flower Arrangement 75.2
Electronics 75.2
Screen 75.2
Female 71.5
LCD Screen 67.9
Monitor 67.9
Display 67.9
Leisure Activities 66.2
Vase 64.9
Ornament 64.9
Jar 64.9
Art 64.9
Ikebana 64.9
Pottery 64.9
Couch 63.7
Furniture 63.7
Chair 63.3
Door 63.2
Woman 63
Sitting 58.7
Room 55.7
Living Room 55.7
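
Each service pairs a label with a confidence score out of 100. As a minimal sketch of how such output is typically post-processed (the data below is transcribed from a few of the Amazon labels above; the threshold and function name are illustrative, not from any API), high-confidence labels can be filtered and ranked:

```python
# A few of the Amazon-generated labels above, as (label, confidence %) pairs.
amazon_labels = [
    ("Human", 99.5), ("Person", 99.5), ("Plant", 94.0), ("Flower", 86.4),
    ("Blossom", 86.4), ("Clothing", 85.4), ("Vase", 64.9), ("Woman", 63.0),
]

def filter_labels(labels, threshold=90.0):
    """Keep only labels whose confidence meets the threshold, highest first."""
    kept = [(name, conf) for name, conf in labels if conf >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(filter_labels(amazon_labels))
# [('Human', 99.5), ('Person', 99.5), ('Plant', 94.0)]
```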

Clarifai
created on 2019-11-16

people 99.4
woman 98.6
adult 98.2
one 97.5
monochrome 97.1
indoors 96.4
portrait 96
actress 92.2
dress 90
wear 88.9
fashion 88.9
facial expression 88.5
man 87.2
girl 86.1
room 86
sit 84.2
model 83.7
music 82
furniture 81.9
sitting 80.7

Imagga
created on 2019-11-16

person 34
adult 32.6
attractive 30.1
people 26.8
pretty 23.8
business 23.7
lady 23.6
portrait 23.3
businesswoman 22.7
sitting 22.4
office 21.7
black 21.4
man 21.3
groom 21.2
bride 19.1
studio 19
professional 19
model 18.7
couple 18.3
happy 18.2
male 17.7
brunette 17.4
fashion 17.4
confident 17.3
smiling 16.7
suit 16.4
sexy 16.1
20s 15.6
cheerful 15.5
dress 15.4
executive 15.1
happiness 14.9
holding 14.9
elegant 13.7
laptop 13.7
looking 13.6
hair 13.5
women 13.5
style 13.4
job 13.3
businessman 13.3
smile 12.8
face 12.8
indoor 12.8
corporate 12
body 12
expression 12
love 11.8
elegance 11.8
businesspeople 11.4
clothing 11.3
human 11.3
casual 11
work 11
pose 10.9
lifestyle 10.8
cute 10.8
computer 10.6
formal 10.5
smart 10.4
newlywed 10.4
slim 10.1
hand 9.9
secretary 9.7
working 9.7
indoors 9.7
success 9.7
together 9.6
career 9.5
passion 9.4
dark 9.2
worker 9.1
handsome 8.9
posing 8.9
interior 8.9
chair 8.8
dancer 8.7
bouquet 8.6
two 8.5
communication 8.4
one 8.2
sensual 8.2
spouse 8
glass 8
good looking 7.8
partner 7.7
modern 7.7
notebook 7.7
shirt 7.7
tie 7.6
figure 7.6
employee 7.5
drink 7.5
relationship 7.5
fun 7.5
wine 7.4
positive 7.4
successful 7.3
sensuality 7.3
make 7.3
gorgeous 7.3
romance 7.1
dance 7.1
look 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

vase 96.7
text 93.8
smile 79.3
flower 79.3
black and white 77.3
clothing 77.1
person 75.6
woman 72.6
houseplant 62.3
dress 58.7
picture frame 11.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-35
Gender Female, 99%
Disgusted 3.9%
Surprised 3.5%
Happy 33.1%
Confused 54%
Sad 0.4%
Fear 1%
Angry 2.2%
Calm 1.9%
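
Rekognition reports a score for every emotion rather than a single verdict; the highest-scoring entry here ("Confused", 54%) is what a consumer would typically treat as dominant. A minimal sketch (the dictionary is transcribed from the values above; the helper name is illustrative):

```python
# Emotion scores (%) transcribed from the AWS Rekognition output above.
emotions = {
    "Disgusted": 3.9, "Surprised": 3.5, "Happy": 33.1, "Confused": 54.0,
    "Sad": 0.4, "Fear": 1.0, "Angry": 2.2, "Calm": 1.9,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))
# ('Confused', 54.0)
```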

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
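
Unlike the other services, Google Vision reports ordinal likelihood labels rather than percentages, so comparing attributes requires mapping the labels onto a scale. A sketch under that assumption (the numeric ranks are an illustrative convention, not values from the API; the data is transcribed from the output above):

```python
# Illustrative ordinal ranking of Google Vision likelihood labels.
LIKELIHOOD_RANK = {
    "Very unlikely": 0, "Unlikely": 1, "Possible": 2,
    "Likely": 3, "Very likely": 4,
}

# Face attributes transcribed from the Google Vision output above.
attributes = {
    "Surprise": "Very unlikely", "Anger": "Very unlikely",
    "Sorrow": "Very unlikely", "Joy": "Very likely",
    "Headwear": "Very unlikely", "Blurred": "Very unlikely",
}

def strongest_attribute(attrs):
    """Return the attribute whose likelihood ranks highest."""
    return max(attrs, key=lambda name: LIKELIHOOD_RANK[attrs[name]])

print(strongest_attribute(attributes))
# Joy
```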

Feature analysis

Amazon

Person 99.5%