Human Generated Data

Title

Untitled (woman seated in chair with baby on lap, little boy standing at side)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12873

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 99.6
Human 99.4
Person 99.4
Person 98.4
Person 97
Couch 90.4
Chair 84.1
Baby 75.6
Flooring 75.4
Apparel 63.4
Clothing 63.4
Newborn 59.5
Finger 58.7
Armchair 56.6

Clarifai
created on 2019-11-16

people 99.8
child 99.6
two 99
woman 98
family 97.6
seat 97.4
portrait 97.3
adult 97.1
baby 97.1
offspring 96.1
son 95.3
indoors 94.3
sit 93.5
furniture 92.3
three 92
man 91.2
chair 90.5
wear 89.8
group 88.6
facial expression 87.7

Imagga
created on 2019-11-16

man 28.9
male 24.7
mother 24.7
parent 23.5
people 23.4
adult 22.7
person 20.2
couple 18.3
family 16.9
home 16.7
portrait 16.2
love 15.8
happy 15.7
happiness 14.9
black 14.7
attractive 14
room 13.8
chair 13.8
father 13.8
fashion 12.8
two 12.7
face 12.1
sexy 12
call 12
casual 11.9
child 11.5
smile 11.4
human 11.2
dad 11.2
sitting 11.2
dress 10.8
interior 10.6
together 10.5
husband 10.5
relationship 10.3
house 10
indoors 9.7
style 9.6
girlfriend 9.6
looking 9.6
hair 9.5
pretty 9.1
sensual 9.1
suit 9
lady 8.9
women 8.7
married 8.6
model 8.6
expression 8.5
youth 8.5
daughter 8.5
adults 8.5
elegance 8.4
patient 8.3
holding 8.3
indoor 8.2
handsome 8
romantic 8
business 7.9
hugging 7.8
sepia 7.8
men 7.7
boyfriend 7.7
kin 7.7
professional 7.7
bride 7.7
loving 7.6
device 7.6
one 7.5
vintage 7.4
window 7.3
20s 7.3
cheerful 7.3
lifestyle 7.2
body 7.2
posing 7.1
kid 7.1
work 7.1
day 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

baby 99
toddler 98.7
human face 98
clothing 97.4
person 97.1
child 88.1
black and white 83.8
window 83.8
boy 82.6
text 79.2
smile 76.1

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 6-16
Gender Female, 78.9%
Confused 0.1%
Calm 0.2%
Sad 45.5%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 54.2%
Angry 0.1%

AWS Rekognition

Age 18-30
Gender Female, 98.5%
Disgusted 0.2%
Angry 0.4%
Confused 0.9%
Sad 21.5%
Calm 75.3%
Happy 1.1%
Fear 0.3%
Surprised 0.4%

AWS Rekognition

Age 0-3
Gender Female, 79.9%
Angry 15.1%
Disgusted 0.1%
Happy 0%
Calm 2.8%
Surprised 0%
Fear 0.2%
Confused 2.3%
Sad 79.5%

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 8
Gender Male

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

people portraits 92.6%
paintings art 5.7%

Text analysis

Amazon

2964

Google

2 9 G 4