Human Generated Data

Title

Self-Portrait

Date

1907

People

Artist: Ignaz Marcel Gaugengigl, American, 1855-1932

Framemaker: Charles Prendergast

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the estate of Isabella Grandin, 1986.453

Machine Generated Data

Tags

Amazon
created on 2020-04-23

Human 97.7
Person 97.7
Art 95.8
Wood 92.5
Painting 80.9
Apparel 56.9
Clothing 56.9
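
The label set above is the kind of output returned by the Amazon Rekognition DetectLabels operation. A minimal sketch using boto3, assuming AWS credentials are already configured and that the image is available locally as self_portrait.jpg (a placeholder filename):

    import boto3

    # Rekognition client; region and credentials come from the standard AWS configuration.
    client = boto3.client("rekognition")

    # The painting image as raw bytes (the filename is a placeholder).
    with open("self_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores on a 0-100 scale,
    # comparable to the values listed above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")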

Clarifai
created on 2020-04-23

people 99.7
one 99.4
adult 99.1
painting 99
wear 98.5
portrait 98
man 97.6
furniture 94.2
doorway 92
museum 90.7
art 90.4
room 87.8
two 87.6
music 87.5
door 87.5
wood 85.9
woman 85.8
pants 85.6
facial hair 84.5
side view 82.7
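
Tag lists like the Clarifai output above come from a prediction call against one of Clarifai's image recognition models. A hedged sketch using the clarifai-grpc client; the API key is a placeholder and the model id is an assumption (substitute whichever model the workflow actually uses):

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    # Authenticated gRPC stub; the key is a placeholder.
    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key <your-clarifai-api-key>"),)

    with open("self_portrait.jpg", "rb") as f:
        file_bytes = f.read()

    # Run the image through a recognition model; the model id below is an assumption.
    request = service_pb2.PostModelOutputsRequest(
        model_id="general-image-recognition",
        inputs=[resources_pb2.Input(
            data=resources_pb2.Data(image=resources_pb2.Image(base64=file_bytes))
        )],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)

    # Each predicted concept carries a name and a confidence value in [0, 1].
    for concept in response.outputs[0].data.concepts:
        print(f"{concept.name} {concept.value * 100:.1f}")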

Imagga
created on 2020-04-23

old 23.7
building 21.4
wall 20.6
door 20.4
person 18.8
man 18.1
furniture 18.1
home 16.7
architecture 16.4
interior 15.9
house 15.9
wood 15.8
adult 14.9
room 14.5
wardrobe 14.3
style 14.1
wooden 14
vintage 14
people 13.9
fashion 13.6
attractive 13.3
brown 13.2
male 12.8
dress 12.6
modern 11.9
clothing 11.4
brunette 11.3
casual 11
portrait 11
furnishing 10.8
parquet 10.8
light 10.7
posing 10.7
basement 10.6
pretty 10.5
ancient 10.4
black 10.1
device 9.9
lady 9.7
urban 9.6
hair 9.5
construction 9.4
street 9.2
one 8.9
doorway 8.8
entrance 8.7
smiling 8.7
window 8.6
model 8.5
face 8.5
grunge 8.5
dark 8.3
traditional 8.3
alone 8.2
outdoors 8.2
religion 8.1
covering 8
lifestyle 7.9
antique 7.9
art 7.9
garment 7.8
expression 7.7
stone 7.6
jacket 7.5
retro 7.4
design 7.3
indoor 7.3
aged 7.2
dirty 7.2
sexy 7.2
cute 7.2
working 7.1
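
The Imagga tags above come from Imagga's automatic tagging service, a plain REST API. A hedged sketch with requests; the endpoint, response shape, credentials, and image URL are assumptions drawn from Imagga's v2 tags API rather than from this record:

    import requests

    # The key/secret pair is a placeholder; the image URL is hypothetical.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/self_portrait.jpg"},
        auth=("<api-key>", "<api-secret>"),
        timeout=30,
    )
    response.raise_for_status()

    # Each entry pairs a localized tag name with a confidence score.
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")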

Google
created on 2020-04-23

Painting 91.6
Portrait 84.3
Standing 83.6
Art 75.5
Picture frame 61.8
Self-portrait 55.4
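
The Google labels above correspond to Cloud Vision label detection. A minimal sketch using the google-cloud-vision client library, assuming application credentials are configured; the filename is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("self_portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # Label detection returns descriptions with scores in [0, 1];
    # multiplying by 100 gives percentages comparable to the list above.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")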

Microsoft
created on 2020-04-23

person 91.2
text 81.7
painting 64.8
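
The Microsoft tags above are the kind of result returned by Azure Computer Vision image tagging. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    # Tag the image from a local file stream; each tag has a confidence in [0, 1].
    with open("self_portrait.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")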

Face analysis

AWS Rekognition

Age 40-58
Gender Male, 96.3%
Calm 74.6%
Sad 2.3%
Angry 4.2%
Disgusted 1.2%
Fear 0.7%
Happy 1.2%
Confused 10.8%
Surprised 5%
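
The age range, gender, and emotion scores above are fields of the Rekognition DetectFaces response when all facial attributes are requested. A minimal sketch, using the same placeholder image and boto3 setup as the label example earlier:

    import boto3

    client = boto3.client("rekognition")
    with open("self_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each detected face.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")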

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
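
The Google Vision rows above are likelihood ratings from face detection; each attribute is reported on a scale from "Very unlikely" to "Very likely" rather than as a percentage. A minimal sketch using the same google-cloud-vision setup as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("self_portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood values: UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)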

Feature analysis

Amazon

Person 97.7%
Painting 80.9%

Captions

Azure OpenAI

Created on 2024-01-25

The image depicts a figure dressed in a traditional suit with a waistcoat and holding what appears to be a cane or walking stick in one hand. This person is surrounded by a room rich in detail, including a wooden brown floor, a shelved wall unit filled with books, boxes, and papers, wall fixtures, framed artworks or photographs, and possibly doorways leading to other rooms. The room suggests a bygone era, with a sense of formality and classic design. The lighting in the room appears soft, casting gentle shadows and giving the space a quiet, introspective atmosphere.
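
A caption like the one above is typically produced by sending the image to an Azure OpenAI chat deployment with vision support. A hedged sketch with the openai Python package; the endpoint, key, API version, deployment name, and prompt are assumptions, not the values used for this record:

    import base64
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
        api_key="<your-key>",                                        # placeholder
        api_version="2024-02-01",                                    # assumed API version
    )

    with open("self_portrait.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4-vision",  # the Azure deployment name is an assumption
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this painting in a short paragraph."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    print(response.choices[0].message.content)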

Anthropic Claude

Created on 2024-03-29

The image appears to be a painting depicting a man standing in a cluttered, dimly lit room. The man is dressed in a dark suit and has a serious, contemplative expression on his face. He is surrounded by various objects, including books, artwork, and other furnishings, which suggest this may be the man's study or personal space. The overall atmosphere of the painting is one of introspection and quiet contemplation.
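
The Claude caption above is the kind of response returned by the Anthropic Messages API when the image is attached as a base64 content block. A hedged sketch; the model id and prompt are assumptions:

    import base64
    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    with open("self_portrait.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    message = client.messages.create(
        model="claude-3-sonnet-20240229",  # assumed model; any vision-capable Claude model works
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64", "media_type": "image/jpeg", "data": image_b64}},
                {"type": "text", "text": "Describe this painting in a short paragraph."},
            ],
        }],
    )
    print(message.content[0].text)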

Text analysis

Google

LEGO
LEGO
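
The detected text comes from OCR-style text detection; in Google Vision the first annotation is the full detected string and later annotations are its individual words, which is likely why "LEGO" appears twice. A minimal sketch using the same setup as the earlier Google Vision examples:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("self_portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # text_annotations[0] is the full text; the rest are word-level detections.
    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)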