Human Generated Data

Title

Untitled (woman seated on arm of patterned loveseat with her two children)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12950

Machine Generated Data

Tags (confidence in %)

Amazon
created on 2019-11-16

Person 98.3
Human 98.3
Home Decor 97.2
Person 96
Person 94
Clothing 92.4
Apparel 92.4
Furniture 92.1
Couch 92.1
Person 86.5
Female 81.7
Indoors 78.8
People 78.7
Room 73.9
Living Room 68.6
Gown 64.9
Evening Dress 64.9
Fashion 64.9
Robe 64.9
Woman 63.9
Girl 61.1
Bed 58.5
Flooring 58.5
Handrail 58
Banister 58
Window 57.9
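
The label/score pairs above follow the output format of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a list is produced, assuming configured boto3 credentials; the S3 bucket and object key are hypothetical:

```python
# Minimal sketch: label/confidence pairs like the Amazon list above, via
# AWS Rekognition DetectLabels. Bucket and key names are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "gittings-print.jpg"}},
    MinConfidence=55,  # the list above bottoms out near 57.9
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```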

Clarifai
created on 2019-11-16

people 99.8
adult 98.3
two 97.8
one 97.3
woman 96.9
wear 96.7
furniture 92.9
child 91.7
sit 91
man 91
actress 89.9
music 89.6
portrait 89.2
seat 88.3
outfit 88
musician 87.3
group 85.7
chair 85.7
singer 84
facial expression 82.3
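
The concept/score pairs above match the response shape of Clarifai's v2 predict endpoint. A hedged sketch, assuming a REST API key; the model ID and image URL are assumptions:

```python
# Sketch: Clarifai v2 predict for concept tags like the list above.
# The API key and image URL are placeholders; the model ID is assumed to be
# Clarifai's public "general" model.
import requests

MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # assumed "general" model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/gittings-print.jpg"}}}]},
)

# Clarifai scores are 0-1 floats; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```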

Imagga
created on 2019-11-16

person 18.2
portrait 17.5
black 17.2
man 16.3
adult 14.6
dress 14.5
male 14.3
fashion 13.6
people 13.4
vintage 13.2
statue 13
old 12.5
window 12.2
building 11.4
sexy 11.2
posing 10.7
face 10.7
hair 10.3
culture 10.3
mask 10.2
musical instrument 10.1
decoration 10
art 9.9
human 9.7
world 9.7
device 9.5
makeup 9.2
attractive 9.1
room 8.9
groom 8.9
sculpture 8.9
interior 8.8
ancient 8.6
model 8.6
pretty 8.4
traditional 8.3
city 8.3
clothing 8.3
style 8.2
lady 8.1
religion 8.1
light 8
newspaper 8
architecture 7.9
costume 7.9
elegance 7.6
passion 7.5
historic 7.3
sensuality 7.3
detail 7.2
family 7.1
product 7.1
travel 7
look 7
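
Imagga exposes its tagger as a REST endpoint with HTTP basic auth. A sketch of the call that yields tag/confidence pairs like those above; the key, secret, and image URL are placeholders:

```python
# Sketch: Imagga /v2/tags returns tag/confidence pairs like the list above.
# The API key/secret pair and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/gittings-print.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```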

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 96
text 95.9
indoor 89.4
black and white 87.1
clothing 69.6
person 61.8
woman 51.4
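
The Microsoft tags above resemble the output of Azure's Computer Vision image-tagging API. A sketch using the Python SDK; the endpoint, key, and image URL are placeholders:

```python
# Sketch: image tags like the Microsoft list above, via the Azure Computer
# Vision SDK. Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/gittings-print.jpg")
# Confidences come back as 0-1 floats; scale to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```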

Color Analysis

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 54.9%
Confused 45.1%
Fear 45.1%
Angry 45.2%
Sad 46%
Disgusted 45.2%
Surprised 45.1%
Happy 46%
Calm 52.3%

AWS Rekognition

Age 2-8
Gender Female, 53.5%
Fear 45%
Angry 45%
Happy 45%
Disgusted 45%
Confused 45%
Calm 54.2%
Sad 45.7%
Surprised 45%

AWS Rekognition

Age 20-32
Gender Female, 52.7%
Disgusted 45%
Sad 45.4%
Fear 45.1%
Angry 45%
Confused 45%
Happy 54.2%
Calm 45.2%
Surprised 45.1%
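
The three blocks above (age range, gender, and per-emotion confidences for each detected face) follow the response shape of Rekognition's DetectFaces operation with all attributes requested. A minimal sketch, with hypothetical bucket and key names:

```python
# Sketch: per-face age range, gender, and emotion confidences as in the
# AWS Rekognition blocks above. Bucket and key names are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "gittings-print.jpg"}},
    Attributes=["ALL"],  # include age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```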

Microsoft Cognitive Services

Age 33
Gender Female
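
The single age/gender estimate above matches the face-attribute output the Azure Face API offered around 2019, when these tags were generated. A hedged sketch using the Python SDK of that era; the endpoint, key, and image URL are placeholders:

```python
# Sketch: age/gender face attributes as reported above, via the Azure Face
# SDK (circa 2019). Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

faces = face_client.face.detect_with_url(
    "https://example.org/gittings-print.jpg",
    return_face_attributes=["age", "gender"],
)

for face in faces:
    print(f"Age {face.face_attributes.age:.0f}")
    print(f"Gender {face.face_attributes.gender}")
```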

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
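
The likelihood labels above ("Very unlikely" through "Likely") are Google Cloud Vision's face-detection enums. A sketch assuming application-default credentials; the image URL is a placeholder:

```python
# Sketch: per-face likelihood labels as in the Google Vision blocks above.
# Assumes application-default credentials; the image URL is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/gittings-print.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    for label, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Enum names like VERY_UNLIKELY map to labels such as "Very unlikely".
        print(label, vision.Likelihood(value).name.replace("_", " ").capitalize())
```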

Feature analysis

Amazon

Person 98.3%

Categories

Imagga

pets animals 99.5%
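
The category/confidence pair above has the shape of Imagga's categorization endpoint output. A sketch; the personal_photos categorizer, credentials, and image URL are all assumptions:

```python
# Sketch: a single category/confidence pair as in the Imagga result above.
# The "personal_photos" categorizer, credentials, and URL are assumptions.
import requests

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/gittings-print.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```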