Human Generated Data

Title

Untitled (Eugenie Stoll in wicker chair)

Date

c. 1935

People

Artist: C. Bennette Moore, American 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.791

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Furniture 99.9
Chair 99.8
Person 97.1
Human 97.1
Canvas 84.1
Sitting 61
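
These labels and confidence scores are the kind of output returned by Amazon Rekognition's label-detection endpoint. Below is a minimal sketch using boto3; the region, file name, and thresholds are illustrative assumptions, not the pipeline actually used for this record.

```python
# Hedged sketch: requesting image labels from Amazon Rekognition with boto3.
# The region and local file name are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("moore_eugenie_stoll.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```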

Clarifai
created on 2021-04-03

wear 98.9
people 98.7
furniture 98.1
art 98
painting 97.7
sepia 97.2
one 96.6
child 96.4
adult 96.2
man 95.9
sepia pigment 95.8
two 95.8
portrait 95.6
retro 95.4
seat 94.7
woman 94.7
nostalgia 94
girl 91.8
sit 91.5
son 91.2
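
Clarifai's tags come from its general image-recognition model. The sketch below shows one way comparable concepts could be requested through Clarifai's v2 REST API with the requests library; the model name, endpoint version, image URL, and credentials are placeholders and should be checked against Clarifai's current documentation.

```python
# Hedged sketch of a Clarifai v2 "general" model request via plain REST.
# MODEL, IMAGE_URL, and API_KEY are assumptions, not values from this record.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                      # placeholder
MODEL = "general-image-recognition"                    # assumed public model name
IMAGE_URL = "https://example.org/eugenie-stoll.jpg"    # hypothetical image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concept values are 0-1; scale to percentages to match the listing above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```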

Imagga
created on 2021-04-03

sketch 70.7
drawing 53.3
representation 43.3
backboard 39.4
equipment 27.1
people 17.3
person 15.8
money 15.3
currency 15.2
face 14.9
portrait 14.9
old 14.6
dress 14.5
cash 13.7
bank 13.4
banking 12.9
finance 11.8
business 11.5
fashion 11.3
art 11.2
bill 10.5
dollar 10.2
happy 10
smile 10
adult 9.8
antique 9.5
man 9.4
head 9.2
blond 9.1
holding 9.1
retro 9
child 8.8
home 8.8
wall 8.6
culture 8.5
black 8.4
attractive 8.4
house 8.4
paper 8.3
historic 8.2
investment 8.2
lady 8.1
design 8.1
symbol 8.1
financial 8
decoration 7.8
happiness 7.8
ancient 7.8
model 7.8
banknote 7.8
marble 7.7
luxury 7.7
bride 7.7
statue 7.6
one 7.5
savings 7.5
sexy 7.2
wealth 7.2
cute 7.2
hair 7.1
sculpture 7.1
interior 7.1
look 7
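
Imagga's tags can be reproduced with its public /v2/tags REST endpoint, which uses HTTP Basic authentication with an API key and secret. The sketch below is a hedged example; the image URL and credentials are placeholders.

```python
# Hedged sketch of an Imagga /v2/tags request; endpoint and response layout
# follow Imagga's public REST docs, credentials and image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/eugenie-stoll.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each tag has an English label and a 0-100 confidence, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```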

Google
created on 2021-04-03

Art 74.8
Vintage clothing 68.5
Baby 67.7
Wheel 66.2
Room 64.9
Paper product 62.3
Sitting 62.1
Motor vehicle 60.5
Toddler 60.3
Visual arts 58.3
Chair 52.2
Child 50.9
Paper 50.5
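
Google's labels correspond to the Cloud Vision label-detection feature. A minimal sketch with the google-cloud-vision Python client (v2+ interface) follows; the file name is hypothetical, and scores are scaled from 0-1 to percentages to match the listing above.

```python
# Minimal sketch of Google Cloud Vision label detection.
# Assumes application default credentials and a hypothetical local file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("moore_eugenie_stoll.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scale to percentages for comparison with the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```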

Microsoft
created on 2021-04-03

person 93.4
drawing 93.1
text 89.6
clothing 87.6
sketch 86.6
human face 76.6
old 72
envelope 59.9
picture frame 23.7
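
Microsoft's tags match the output of the Azure Computer Vision tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders rather than the configuration used for this record.

```python
# Hedged sketch of Azure Computer Vision image tagging.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

IMAGE_URL = "https://example.org/eugenie-stoll.jpg"  # hypothetical image URL

# Tag confidences are 0-1; scale to percentages to match the listing above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```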

Color Analysis

Face analysis

AWS Rekognition

Age 1-5
Gender Male, 57.4%
Happy 100%
Calm 0%
Surprised 0%
Confused 0%
Angry 0%
Sad 0%
Disgusted 0%
Fear 0%
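
The age range, gender, and emotion percentages above are face attributes from Amazon Rekognition. A minimal boto3 sketch follows; the file name and region are assumptions.

```python
# Hedged sketch of the Rekognition face-attribute call that yields an age
# range, gender, and per-emotion confidences; file name is an assumption.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("moore_eugenie_stoll.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.0f}%")
```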

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
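
Google Vision reports face expressions as likelihood ratings ("Very unlikely" through "Very likely") rather than percentages. The sketch below shows the corresponding face-detection call with the google-cloud-vision client; the file name is hypothetical.

```python
# Hedged sketch of Google Cloud Vision face detection and its likelihood fields.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("moore_eugenie_stoll.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face carries likelihood enums for the categories listed above.
for face in response.face_annotations:
    print("Joy", face.joy_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```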

Feature analysis

Amazon

Person 97.1%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2021-04-03

a vintage photo of a box 50.5%
a vintage photo of a person 50.4%
a vintage photo of a person 46.5%
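
The caption candidates and confidences above correspond to the Azure Computer Vision describe operation, which can return several ranked captions for one image. The sketch below reuses the same hedged client setup as the tagging example; endpoint, key, and image URL are placeholders.

```python
# Hedged sketch of requesting multiple caption candidates from Azure
# Computer Vision; endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

description = client.describe_image(
    "https://example.org/eugenie-stoll.jpg",  # hypothetical image URL
    max_candidates=3,
)

# Each candidate has a caption string and a 0-1 confidence.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```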