Human Generated Data

Title

Untitled (older woman seated on chair in front of dresser and mirror, holding book)

Date

c. 1940

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12442

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clarifai
created on 2023-10-27

people 96.6
woman 95.5
monochrome 92
desktop 90.7
window 90.5
retro 88.1
adult 87.9
art 87.5
man 86.8
fashion 86.5
girl 85.9
indoors 85.6
design 85.4
vintage 85.3
abstract 84.7
technology 84.2
elegant 82.8
wear 81.2
illustration 81
graphic 80.8

Imagga
created on 2022-01-29

person 44.1
patient 31.2
people 26.2
man 26.2
adult 23.9
room 23.9
home 23.1
male 22.7
happy 20.7
sick person 19.3
hospital 19.3
case 19
indoors 18.4
device 15.5
smiling 15.2
portrait 14.9
smile 14.2
work 14.1
medical 14.1
sitting 13.7
child 13.6
professional 13.5
iron lung 13.4
inside 12.9
umbrella 12.8
family 12.4
holding 12.4
working 12.4
seat 12.3
doctor 12.2
office 12
men 12
breathing device 11.9
computer 11.2
nurse 11.1
casual 11
worker 10.9
lifestyle 10.8
respirator 10.7
human 10.5
health 10.4
machine 10.1
laptop 10
cheerful 9.7
business 9.7
mother 9.7
technology 9.6
black 9.6
women 9.5
senior 9.4
specialist 9.2
attractive 9.1
one 9
job 8.8
love 8.7
clothing 8.6
chair 8.5
fun 8.2
care 8.2
indoor 8.2
life 8.1
looking 8
interior 8
happiness 7.8
face 7.8
uniform 7.8
modern 7.7
illness 7.6
communication 7.6
lady 7.3
medicine 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

drawing 90.7
sketch 88.3
text 83.3
black and white 73.4
cartoon 63.2

Color Analysis

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 97.2%

Captions

Microsoft
created on 2022-01-29

an old photo of a man 50.8%
a man wearing a hat 32.3%
a man sitting in a room 32.2%

Text analysis

Amazon

3
XH 3 786
XH
786
2858
-MAMTBA
will

Google

98LE
98LE