Human Generated Data

Title

Untitled (two girls in matching floral dresses seated at edge of sofa looking out glass door)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12896

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 99.7
Human 94.9
Person 94.9
Person 94.5
Couch 77.5
Bookcase 75.8
Shelf 65
Home Decor 63.3
Window 62.4
Room 61.5
Living Room 61.5
Indoors 61.5
Electronics 57.3
Screen 57.3
Display 57.3
Monitor 57.3
LCD Screen 57.3

Clarifai
created on 2019-11-16

people 99.8
portrait 98.8
seat 98.7
furniture 98.6
one 98.3
chair 98.2
sit 98.2
child 98
baby 97.7
adult 97.4
family 96.3
two 95.6
woman 95
monochrome 94.8
art 92.1
indoors 91.7
actress 89.8
easy chair 89.7
old 89.1
reclining 89.1

Imagga
created on 2019-11-16

mother 49
parent 44.9
kin 28.1
couch 27.1
man 26.9
sitting 26.6
home 26.3
happy 25.7
male 24.2
people 22.3
family 22.2
adult 22
father 21.4
sofa 21.4
room 21.1
person 20.8
lifestyle 18.8
television 18.7
couple 18.3
dad 18.2
portrait 18.1
happiness 18
senior 17.8
smiling 17.4
casual 16.9
chair 16.5
indoor 16.4
grandfather 15.1
love 15
indoors 14.9
mature 14.9
child 14.3
smile 14.3
together 14
attractive 14
old 13.9
looking 13.6
aged 13.6
telecommunication system 13.4
grandma 13.2
elderly 12.4
living 12.3
boy 12.2
face 12.1
fashion 11.3
relaxed 11.3
one 11.2
women 11.1
relaxing 10.9
living room 10.8
handsome 10.7
interior 10.6
cheerful 10.6
loving 10.5
domestic 10
clothing 9.9
seated 9.7
lady 9.7
retired 9.7
antique 9.6
husband 9.5
model 9.3
cute 9.3
alone 9.1
black 9
son 8.8
hair 8.7
daughter 8.7
retirement 8.6
wife 8.5
expression 8.5
pretty 8.4
pensioner 8.3
human 8.2
children 8.2
dress 8.1
life 8
kid 8
brunette 7.8
parenthood 7.8
older 7.8
affection 7.7
youth 7.7
relax 7.6
togetherness 7.6
furniture 7.5
kids 7.5
relationship 7.5
leisure 7.5
camera 7.4
seat 7.3
playing 7.3
sexy 7.2
computer 7.2

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.8
person 86.9
black and white 86.4
window 85.5
clothing 80.4
human face 75.9
statue 63.8
child 62.2
baby 58.1
furniture 50.2

Face analysis

AWS Rekognition

Age 13-25
Gender Female, 96.8%
Angry 18.2%
Surprised 0.3%
Disgusted 0.2%
Confused 2%
Fear 0.3%
Calm 70.8%
Happy 1.1%
Sad 7.1%

AWS Rekognition

Age 3-9
Gender Female, 96.1%
Happy 0.2%
Fear 0%
Angry 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Calm 99.2%
Sad 0.3%

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 9
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.9%
Couch 77.5%

Text analysis

Amazon

0.