Human Generated Data

Title

Untitled (woman leaning over baby in bassinet with large bow at top)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12876

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 100
Person 99.4
Human 99.4
Cradle 95.1
Bed 92.4

Clarifai
created on 2019-11-16

people 99.8
portrait 98.7
two 97.6
one 97.4
adult 97.2
child 96.8
theater 96.2
baby 95.9
man 95.4
monochrome 94.9
opera 94
woman 93.3
love 92.2
movie 92.1
actress 91
administration 90.4
music 89.8
family 88.6
girl 86.9
theatre 85.5

Imagga
created on 2019-11-16

baby bed 84.5
cradle 72.7
furniture 62.7
furnishing 42.5
adult 31.1
person 28.1
bassinet 27.5
people 27.3
indoors 24.6
home 21.5
male 20.6
computer 19.3
bed 18.9
sitting 18.9
office 17.8
man 17.5
child 16.2
lifestyle 15.2
happy 15
lying 15
women 15
work 14.9
face 14.2
one 14.2
laptop 13.8
hand 13.7
relaxing 13.6
attractive 13.3
portrait 12.9
working 12.4
desk 12.3
indoor 11.9
casual 11.9
suit 11.7
blond 11.7
job 11.5
pretty 11.2
love 11
business 10.9
room 10.3
keyboard 10.3
baby 10.3
scholar 10.2
professional 10.2
focus 10.2
smiling 10.1
intellectual 10.1
relax 10.1
bedroom 10.1
alone 10
smile 10
groom 9.8
human 9.7
sleeping 9.7
businessman 9.7
looking 9.6
brunette 9.6
hands 9.6
resting 9.5
hair 9.5
adults 9.5
females 9.5
relaxed 9.4
clothing 9.3
mature 9.3
inside 9.2
lady 8.9
half length 8.8
sleep 8.8
paper 8.6
corporate 8.6
break 8.6
relaxation 8.4
house 8.4
leisure 8.3
notebook 7.9
couple 7.8
happiness 7.8
black 7.8
table 7.8
color 7.8
model 7.8
tired 7.8
workplace 7.6
horizontal 7.5
floor 7.4
holding 7.4
rest 7.4
pen 7.4
20s 7.3
children 7.3
dress 7.2
worker 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 99.2
black and white 92.4
indoor 90.6
text 81.5
bed 80.8
baby 80.1
human face 72.1
clothing 60
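
The tag lists above are confidence-scored labels returned by each vendor's image-recognition API. As a minimal sketch of how such a list might be assembled, the snippet below filters and formats a Rekognition-style label response; the sample response dict is hypothetical (a real one would come from `boto3.client("rekognition").detect_labels(...)`), but its values mirror the Amazon tags shown above:

```python
# Filter a Rekognition-style label response down to the
# "Name confidence" lines shown above. The sample response is
# hand-built for illustration; a real call would use boto3's
# detect_labels and require AWS credentials.
sample_response = {
    "Labels": [
        {"Name": "Furniture", "Confidence": 100.0},
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Human", "Confidence": 99.4},
        {"Name": "Cradle", "Confidence": 95.1},
        {"Name": "Bed", "Confidence": 92.4},
        {"Name": "Chair", "Confidence": 54.2},  # below threshold, dropped
    ]
}

def format_labels(response, min_confidence=90.0):
    """Keep labels at or above min_confidence, strongest first."""
    labels = [l for l in response["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:g}" for l in labels]

for line in format_labels(sample_response):
    print(line)  # e.g. "Furniture 100", "Person 99.4", ...
```

The `min_confidence` cutoff is an assumption; the vendors above publish tags at varying thresholds, which is why the Imagga list runs far longer than Amazon's.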

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Female, 83.6%
Fear 3.3%
Sad 21%
Happy 44.7%
Calm 25.8%
Angry 2%
Surprised 1.1%
Disgusted 0.9%
Confused 1.2%

AWS Rekognition

Age 3-11
Gender Female, 74.4%
Angry 1.8%
Calm 35.5%
Sad 49.2%
Happy 1%
Fear 2.1%
Confused 1.6%
Surprised 1.1%
Disgusted 7.7%
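
Each face record above lists an age range, a gender estimate, and a confidence score for every emotion. A common way to summarize such a record is to pick the single strongest emotion. A sketch, assuming a Rekognition-style `FaceDetail` structure (the values are copied from the first face above; the dict layout is an approximation, not the exact API schema):

```python
# Pick the highest-confidence emotion from a Rekognition-style
# face record. Values mirror the first face analysis above;
# the structure is an approximation of Rekognition's FaceDetail.
face = {
    "AgeRange": {"Low": 21, "High": 33},
    "Gender": {"Value": "Female", "Confidence": 83.6},
    "Emotions": [
        {"Type": "FEAR", "Confidence": 3.3},
        {"Type": "SAD", "Confidence": 21.0},
        {"Type": "HAPPY", "Confidence": 44.7},
        {"Type": "CALM", "Confidence": 25.8},
        {"Type": "ANGRY", "Confidence": 2.0},
        {"Type": "SURPRISED", "Confidence": 1.1},
        {"Type": "DISGUSTED", "Confidence": 0.9},
        {"Type": "CONFUSED", "Confidence": 1.2},
    ],
}

def dominant_emotion(face):
    """Return (type, confidence) of the strongest emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('HAPPY', 44.7)
```

Note that the confidences are not a probability distribution over one true answer: the first face above splits mainly between Happy and Calm, and the second between Sad and Calm, so the dominant emotion should be read as a weak signal rather than a classification.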

Feature analysis

Amazon

Person 99.4%
Bed 92.4%

Categories

Imagga

paintings art 78.1%
food drinks 19.9%
people portraits 1.2%

Captions

Text analysis

Google

y6
y6