Human Generated Data

Title

Untitled (portrait of child leaning on chair)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1976

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-22

Chair 100
Furniture 100
Clothing 98.5
Apparel 98.5
Person 98.2
Human 98.2
Face 72.6
Photography 69.5
Photo 69.5
Portrait 69.4
Kid 66.9
Child 66.9
Flower 59
Blossom 59
Plant 59
Helmet 56.2
Sitting 55.4
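
Each machine tag above pairs a label with a confidence score on a 0–100 scale. A minimal Python sketch, assuming the tags are held as label-to-score pairs (a hypothetical subset is hard-coded below), shows how such a list could be filtered down to high-confidence labels:

```python
# Hypothetical subset of the Amazon tags above,
# stored as label -> confidence (0-100) pairs.
amazon_tags = {
    "Chair": 100.0, "Furniture": 100.0, "Clothing": 98.5,
    "Person": 98.2, "Face": 72.6, "Portrait": 69.4,
    "Child": 66.9, "Flower": 59.0, "Helmet": 56.2,
}

def confident_labels(tags, threshold=90.0):
    """Return labels whose confidence meets the threshold, highest first."""
    return sorted(
        (label for label, score in tags.items() if score >= threshold),
        key=lambda label: -tags[label],
    )

print(confident_labels(amazon_tags))
```

Low-scoring labels such as "Helmet" (56.2) are the ones most often wrong, so a threshold like this is a common first cleaning step when machine tags are reused for search or display.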

Clarifai
created on 2023-10-26

child 99.6
people 98.6
monochrome 98.5
wear 97.4
portrait 97
girl 95.2
retro 93.3
boy 92.7
nostalgia 90.2
son 90.1
one 88.7
sepia 86.6
art 86.2
baby 84.1
little 82.3
fun 81.4
cute 81.3
street 79.4
black and white 79.2
vintage 78.1

Imagga
created on 2022-01-22

person 23.8
people 23.4
man 22.2
adult 20.3
snow 20.1
cap 18.5
shower cap 18.1
cleaner 15.7
portrait 15.5
clothing 15.3
human 14.2
child 14.2
happiness 14.1
happy 13.8
male 13.8
dress 13.6
fashion 12.8
winter 12.8
joy 12.5
world 12.4
headdress 12.3
weather 11.7
active 11.7
business 11.5
black 11.4
walking 11.4
outdoors 11.2
pretty 11.2
men 11.2
women 11.1
lifestyle 10.8
outdoor 10.7
fun 10.5
sexy 10.4
cold 10.3
wall 10.3
urban 9.6
smiling 9.4
art 9.2
hat 9.2
city 9.1
attractive 9.1
summer 9
cheerful 8.9
cool 8.9
businessman 8.8
hair 8.7
walk 8.6
model 8.6
shovel 8.3
one 8.2
exercise 8.2
mask 8.1
suit 8.1
success 8
water 8
looking 8
work 7.9
couple 7.8
standing 7.8
hands 7.8
power 7.6
manager 7.5
action 7.4
umbrella 7.4
alone 7.3
danger 7.3
body 7.2
celebration 7.2
face 7.1
love 7.1
athlete 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.8
outdoor 97.5
boy 90.7
person 86.7
clothing 84.1
drawing 83.6
young 82.8
sketch 75.1
old 74.6
black 70.9
white 67.4
fog 66.4
cartoon 61
human face 51.1
posing 48
vintage 38.9

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Female, 78.7%
Happy 83.9%
Sad 4.4%
Disgusted 4.2%
Confused 3.2%
Angry 1.9%
Surprised 1.4%
Fear 0.7%
Calm 0.3%
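
The emotion scores above are percentages over a fixed set of classes, so the face's dominant emotion is simply the highest-scoring entry. A small Python sketch, with the scores transcribed from the record above:

```python
# AWS Rekognition emotion scores for the detected face (percent),
# transcribed from the record above.
emotions = {
    "Happy": 83.9, "Sad": 4.4, "Disgusted": 4.2, "Confused": 3.2,
    "Angry": 1.9, "Surprised": 1.4, "Fear": 0.7, "Calm": 0.3,
}

# The dominant emotion is the key with the maximum score.
dominant = max(emotions, key=emotions.get)
print(dominant)
```

Note that the scores sum to roughly 100, so they behave like a probability distribution over the emotion classes.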

Feature analysis

Amazon

Person 98.2%
Helmet 56.2%

Categories

Imagga

paintings art 99.7%