Human Generated Data

Title

Untitled (portrait of a child in a chair)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1967

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 97.6
Person 94.4
Human 94.4
Chair 80.7
Indoors 75.4
Interior Design 75.4
Drawing 75.4
Art 75.4
Sketch 64.3
Table 62.9
Astronomy 61.1
Outer Space 61
Space 61
Universe 61
Flooring 58.7
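
The label scores above have the shape of an AWS Rekognition DetectLabels response. A minimal sketch of how such tags could be regenerated with boto3; the file name, region, and 55% confidence floor are assumptions, not part of this record:

```python
import boto3

# Assumed local copy of the photograph; not part of the catalog record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.1967.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence percentages,
# comparable to the "Furniture 97.6 ... Flooring 58.7" list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```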

Clarifai
created on 2023-10-26

people 99.7
adult 97.2
art 96.5
one 95.7
portrait 95.7
street 94.8
monochrome 94.6
man 94
woman 93.5
wear 93.3
two 91.7
girl 91.1
window 91
child 90.4
analogue 89.9
room 88.1
nude 87.2
family 83.4
house 82.7
vintage 81.2

Imagga
created on 2022-01-22

room 24.7
interior 23
vessel 19.3
home 19.2
house 17.5
clean 17.5
toilet tissue 16.4
glass 16.1
washbasin 15.4
container 15.4
basin 15.3
bathroom 14.9
modern 14
furniture 13.5
people 13.4
hygiene 13.2
health 13.2
tissue 13.1
paper 12.5
medical 12.4
medicine 11.4
toilet 11.4
relax 10.9
lifestyle 10.8
design 10.8
vase 10.7
wall 10.4
relaxation 10
liquid 9.8
bath 9.5
face 9.2
indoor 9.1
care 9
equipment 9
towel 9
decoration 9
device 9
healthy 8.8
man 8.7
light 8.7
sketch 8.7
laboratory 8.7
water 8.7
table 8.6
life 8.6
luxury 8.6
pretty 8.4
negative 8.2
person 8.2
architecture 7.8
sofa 7.8
adult 7.8
cold 7.7
skin 7.6
research 7.6
snow 7.5
bottle 7.4
detail 7.2
color 7.2
spa 7.2
drawing 7.1
portrait 7.1
science 7.1
hospital 7.1
decor 7.1
shower 7
wooden 7
indoors 7
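
The Imagga scores above could be produced by Imagga's v2 tagging endpoint. A hedged sketch, assuming placeholder credentials and a publicly reachable image URL:

```python
import requests

# Placeholder credentials; Imagga's v2 tagging endpoint uses HTTP Basic auth.
API_KEY = "<imagga-api-key>"
API_SECRET = "<imagga-api-secret>"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.1967.jpg"},  # assumed URL
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence score and the tag text, comparable to
# the "room 24.7 ... indoors 7" list above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```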

Microsoft
created on 2022-01-22

sketch 98.8
drawing 98.6
wall 98.5
black and white 91.1
text 86.8
art 82.5
old 66.5
white 65.2
cartoon 63.6
child art 55.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 12-20
Gender Female, 92.2%
Calm 88.9%
Sad 4.1%
Surprised 3.4%
Confused 1.1%
Angry 0.9%
Disgusted 0.7%
Fear 0.6%
Happy 0.3%
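
The age range, gender, and emotion scores above match the FaceDetails structure returned by Rekognition's DetectFaces operation. A minimal sketch, continuing the boto3 example above with the same assumed image bytes:

```python
# Attributes=["ALL"] requests age, gender, and emotion estimates
# in the FaceDetails block.
faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```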

Feature analysis

Amazon

Person 94.4%

Categories

Imagga

food drinks 92.2%
interior objects 7.3%

Captions

Microsoft
created on 2022-01-22

an old photo of a person 85.1%
old photo of a person 83.3%
an old photo of a boy 57.5%
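
The captions and confidences above have the form of the Describe operation in Azure's Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders, not values from this record:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute real Azure resource values.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("4.2002.1967.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

# The service reports confidence on a 0-1 scale; scale to match the
# percentages shown above.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```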