Human Generated Data

Title

Untitled (woman seated in chair with little girl in lap, framed pictures and window in background)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12843

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99
Human 99
Person 97.8
Furniture 93.3
Couch 93.3
Table Lamp 86.1
Lamp 86.1
Indoors 79.7
Room 79.7
Living Room 79.7
Baby 76.3
Monitor 75.2
Screen 75.2
Display 75.2
Electronics 75.2
Curtain 62.4
Photography 62.1
Face 62.1
Portrait 62.1
Photo 62.1
Clothing 61.9
Apparel 61.9
Interior Design 61.8
Finger 58.8
Home Decor 56.5
Chair 55.7
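
The tag lists above are flat "label score" lines, where the score is the service's confidence (as a percentage) and labels may contain spaces ("Table Lamp", "Living Room"). A minimal sketch of parsing them into structured pairs, assuming the hypothetical helper name `parse_tags`:

```python
# Hypothetical helper: parse flat "label confidence" lines (as listed
# above for the Amazon tags) into (label, confidence) pairs. Labels may
# contain spaces, so we split on the last whitespace-separated token.
def parse_tags(lines):
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        label, score = line.rsplit(None, 1)
        tags.append((label, float(score)))
    return tags

raw = """Person 99
Table Lamp 86.1
Living Room 79.7
Chair 55.7"""

tags = parse_tags(raw.splitlines())
top = max(tags, key=lambda t: t[1])  # highest-confidence tag
```

With the sample above, `top` is `("Person", 99.0)`; the same parser applies to the Clarifai, Imagga, Google, and Microsoft lists below.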

Clarifai
created on 2019-11-16

people 100
child 99.3
two 98.8
group 97.9
family 97.6
man 97
portrait 96.6
boy 96.6
movie 96
adult 95.9
woman 94.4
monochrome 94.2
group together 93.8
three 93.8
recreation 93.5
furniture 93.4
street 93.3
music 93.3
room 92.5
offspring 91.2

Imagga
created on 2019-11-16

man 37.6
person 27.5
adult 25.5
male 25.2
people 24.5
world 21.5
love 20.5
couple 18.3
black 17.4
room 16.4
looking 14.4
dark 14.2
happy 13.8
happiness 13.3
passenger 13.2
lifestyle 13
portrait 12.9
loving 12.4
men 12
business 11.5
husband 11.4
indoors 11.4
one 11.2
sitting 11.2
indoor 11
model 10.9
television 10.9
girlfriend 10.6
attractive 10.5
body 10.4
home 10.4
relationship 10.3
smile 10
chair 10
face 9.9
fashion 9.8
couch 9.7
boyfriend 9.6
hair 9.5
smiling 9.4
passion 9.4
office 9.4
back 9.2
human 9
posing 8.9
interior 8.8
businessman 8.8
together 8.8
expression 8.5
togetherness 8.5
relaxation 8.4
window 8.3
clothing 8.3
child 8.2
dress 8.1
romance 8
sexy 8
handsome 8
working 8
hand 7.6
wife 7.6
adults 7.6
waiter 7.5
light 7.5
fire 7.5
holding 7.4
computer 7.4
suit 7.3
20s 7.3
sensuality 7.3
family 7.1
modern 7

Google
created on 2019-11-16

Photograph 97.1
Black 96.8
White 96.3
Black-and-white 94.2
People 93.7
Snapshot 88.4
Monochrome 87.2
Monochrome photography 84.4
Photography 81.3
Room 76.5
Sitting 75.8
Child 71.2
Photographic paper 58.8
Father 56.1
Style 55.5
Window 53.8
Family 53.7
Television set 50.2

Microsoft
created on 2019-11-16

human face 97.2
baby 95.1
clothing 93.4
toddler 92.5
person 88.2
black and white 87.9
text 87
child 80.9
smile 68.8
boy 68.5
monochrome 52.1

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 99.1%
Surprised 0.6%
Angry 2.7%
Fear 0.9%
Calm 84.3%
Confused 2.3%
Sad 7.4%
Disgusted 0.9%
Happy 1%

AWS Rekognition

Age 2-8
Gender Female, 96.4%
Calm 97.5%
Sad 1.5%
Happy 0.1%
Angry 0.4%
Fear 0%
Confused 0.2%
Surprised 0.1%
Disgusted 0.2%
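
The emotion scores in each Rekognition block form a distribution over eight labels that sums to roughly 100%. A small sketch of picking the dominant emotion from the woman's face scores above (the dict literal simply transcribes those values):

```python
# Emotion scores transcribed from the first AWS Rekognition block above.
emotions = {
    "Surprised": 0.6, "Angry": 2.7, "Fear": 0.9, "Calm": 84.3,
    "Confused": 2.3, "Sad": 7.4, "Disgusted": 0.9, "Happy": 1.0,
}

# The dominant emotion is the label with the highest score.
dominant = max(emotions, key=emotions.get)  # "Calm"
```

The same selection applied to the child's block also yields "Calm" (97.5%).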

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Imagga

events parties 76.2%
people portraits 11.5%
food drinks 10.3%

Text analysis

Google

ww
ww