Human Generated Data

Title

Untitled (woman leaning over baby in bassinet from side)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12470

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 98.4
Apparel 98.4
Person 87.1
Human 87.1
Veil 86.8
Furniture 74.2
Cat 67.2
Animal 67.2
Mammal 67.2
Pet 67.2
Room 59.3
Indoors 59.3
Gown 59.3
Fashion 59.3
Lace 58.3
Robe 56.4

Clarifai
created on 2023-10-27

monochrome 98.3
people 98
art 94.3
group 92.7
design 88.8
retro 88.6
illustration 87.4
child 86.6
man 86.1
music 84.9
vintage 84.3
old 83.1
desktop 82.9
many 82.3
animal 80.2
adult 79.7
dog 79
wear 78.2
mammal 78
decoration 76.2

Imagga
created on 2022-01-29

device 20.6
people 18.4
shower cap 17.7
man 16.8
medical 16.8
person 16.3
cap 14.6
adult 13.6
human 13.5
hospital 12.9
male 12.8
headdress 12.2
equipment 12.2
hand 12.2
technology 11.9
love 11.8
professional 11.8
clothing 11.6
science 11.6
bride 11.5
medicine 11.4
biology 11.4
work 11.1
health 11.1
wedding 11
groom 10.9
black 10.8
ventilator 10.7
laboratory 10.6
working 10.6
glass 10.5
celebration 10.4
doctor 10.3
hair 10.3
men 10.3
negative 10.1
film 9.9
fashion 9.8
scientist 9.8
art 9.8
worker 9.8
lab 9.7
research 9.5
light 9.4
mask 9.2
dress 9
veil 8.8
scientific 8.7
chemistry 8.7
chemical 8.7
test 8.7
instrument 8.6
mechanical device 8.2
style 8.2
surgeon 8
lifestyle 7.9
education 7.8
play 7.8
lady 7.3
smile 7.1
portrait 7.1
women 7.1
face 7.1
modern 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 94.4
black and white 80.5
statue 77.4
skull 70.6
art 52.1

Color Analysis

Feature analysis

Amazon

Person 87.1%
Cat 67.2%

Categories

Imagga

paintings art 99.8%

Captions

Text analysis

Amazon

HX&
728CH

Google

2235H O-YT3RA2-MAMTEA INLAIN XH 5
2235H
O-YT3RA2-MAMTEA
INLAIN
XH
5