Human Generated Data

Title

Untitled (mother seated with baby on lap in chair, little boy standing on chair to her side)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12875

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.3
Human 99.3
Person 97.7
Person 96.2
Baby 94.4
Newborn 94.4
Furniture 91.8
Couch 87.2
Chair 81.1
Flooring 75.1
Face 70.4
Photography 64.9
Portrait 64.9
Photo 64.9
Door 64.2
Clothing 64.2
Apparel 64.2
Finger 62.5
Window 62.4
People 62
Shorts 57.9
LCD Screen 56.4
Electronics 56.4
Display 56.4
Screen 56.4
Monitor 56.4
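
These label-and-score pairs match the output shape of Amazon Rekognition's DetectLabels API (scores are confidence percentages). A minimal sketch of how such tags could be regenerated with boto3; the file name is hypothetical, and the museum's actual tagging pipeline is not documented here:

```python
# Minimal sketch: Rekognition-style tags for one image.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical file.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,      # the list above holds ~25 labels
        MinConfidence=55,  # the lowest score shown above is 56.4
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```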

Clarifai
created on 2019-11-16

people 99.9
child 99.7
woman 98
portrait 97.9
two 97.7
family 97.3
son 95.6
group 94.6
adult 94.6
monochrome 93.8
baby 93.6
boy 92.9
offspring 92.6
man 92.5
music 91.7
wear 90.1
girl 90
street 89.9
facial expression 86.6
wedding 86.5

Imagga
created on 2019-11-16

parent 31.6
man 30.9
mother 27.5
male 22.7
dad 21.3
people 20.1
kin 19.6
adult 18.9
father 18.7
child 18.1
person 17.9
world 17.5
portrait 14.9
black 14.1
youth 13.6
businessman 13.2
family 12.5
business 12.1
building 11.3
love 11
happiness 11
men 10.3
sport 10.2
room 10.1
human 9.7
interior 9.7
one 9.7
couple 9.6
happy 9.4
dark 9.2
fun 9
looking 8.8
face 8.5
future 8.4
old 8.4
action 8.3
city 8.3
leisure 8.3
silhouette 8.3
athlete 8.3
body 8
lifestyle 7.9
boy 7.8
play 7.8
summer 7.7
wall 7.7
expression 7.7
outdoor 7.6
casual 7.6
statue 7.6
outdoors 7.5
player 7.4
holding 7.4
exercise 7.3
sunset 7.2
home 7.2
hair 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

human face 98.6
clothing 98.2
toddler 96.8
baby 96.1
person 95.3
boy 93.4
smile 91.8
black and white 85.9
child 85
text 69.6

Color Analysis

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 82.1%
Happy 44.8%
Calm 33.9%
Sad 11.4%
Fear 2.9%
Surprised 2.6%
Angry 1.7%
Disgusted 1.6%
Confused 1%

AWS Rekognition

Age 20-32
Gender Female, 99.8%
Calm 61.1%
Disgusted 14.5%
Happy 11.7%
Fear 3.8%
Angry 3.2%
Sad 3.1%
Confused 1.3%
Surprised 1.3%

AWS Rekognition

Age 0-3
Gender Female, 90.3%
Sad 68%
Confused 24.4%
Fear 5.9%
Calm 1%
Angry 0.4%
Disgusted 0.1%
Surprised 0.1%
Happy 0%
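
Each of the three blocks above is one detected face, with an estimated age range, a gender guess, and a confidence score per emotion. A hedged sketch of the corresponding Amazon Rekognition DetectFaces call (file name again hypothetical):

```python
# Minimal sketch: per-face age range, gender, and emotion scores
# via Amazon Rekognition DetectFaces. "photo.jpg" is a hypothetical file.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```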

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 6
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
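
Unlike the services above, Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client (file name hypothetical):

```python
# Minimal sketch: Google Cloud Vision face detection, printing the
# same likelihood buckets shown above. "photo.jpg" is a hypothetical file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        bucket = vision.Likelihood(getattr(face, f"{attr}_likelihood"))
        print(attr.title(), bucket.name.replace("_", " ").capitalize())
```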

Feature analysis

Amazon

Person 99.3%
