Human Generated Data

Title

Untitled (seated man in overalls, flanked by two girls, and a standing woman, full-length, painted backdrop)

Date

c. 1920

People

Artist: Michael Disfarmer, American 1884 - 1959

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2008.281

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Face 100
Head 100
Photography 100
Portrait 100
People 99.8
Person 98.4
Clothing 98.4
Pants 98.4
Person 97.4
Coat 96
Art 93.5
Painting 93.5
Formal Wear 79.9
Dress 79.9
Suit 66.5
Outdoors 65.3
Chair 63.3
Furniture 63.3
Jacket 57.7
Text 56.6
Blouse 56.1
Accessories 55.8
Glasses 55.8
Hat 55.2
Fashion 55.2
Gown 55.2
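
The Amazon tag list above pairs each label with a confidence score on a 0-100 scale. A minimal sketch of how such tags can be pulled from the AWS Rekognition DetectLabels API with boto3 follows; the file name, region, and MinConfidence threshold are illustrative assumptions, not the museum's actual pipeline.

```python
import boto3

# Assumed region and a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("disfarmer_portrait.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 roughly matches the lowest scores shown in the tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

# Each label carries a name and a 0-100 confidence, e.g. "Portrait 100.0".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```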

Clarifai
created on 2019-02-18

people 99.5
adult 99
wear 98.7
group 97.6
two 97
woman 95.5
art 95
man 93.6
one 93.6
facial expression 92.9
illustration 90.1
portrait 89.4
painting 88.8
outfit 86.9
veil 85
print 84.9
three 82.6
leader 81.5
child 81.2
furniture 80.5

Imagga
created on 2019-02-18

sketch 28.5
sculpture 25.4
old 23
drawing 22.5
statue 21.8
art 21.3
representation 17.9
antique 17.5
ancient 17.3
religion 17
symbol 16.2
stone 15.5
culture 15.4
stamp 14.6
architecture 14.4
history 14.3
religious 14
monument 14
face 13.5
vintage 13.2
church 12.9
historical 12.2
travel 12
marble 11.9
historic 11.9
paper 11.1
decoration 11
tourism 10.7
carving 10.7
figure 10.5
design 10.2
sign 9.8
holy 9.6
envelope 9.6
traditional 9.1
bust 8.8
detail 8.8
wall 8.8
die 8.7
saint 8.7
spiritual 8.6
god 8.6
golden 8.6
blank 8.6
model 8.6
frame 8.4
famous 8.4
stucco 8.3
tourist 8.2
currency 8.1
container 7.9
portrait 7.8
book jacket 7.7
city 7.5
retro 7.4
dress 7.2
landmark 7.2
body 7.2
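
The Imagga tags come from a REST tagging endpoint rather than an SDK. A hedged sketch using the requests library is below; the endpoint path, Basic-auth credential pair, and image URL are assumptions based on Imagga's documented v2 tagging API and are not confirmed by this record.

```python
import requests

# Hypothetical credentials and image URL; Imagga's v2 tagging endpoint is
# assumed to take HTTP Basic auth with an API key/secret pair.
IMAGGA_API_KEY = "your_api_key"
IMAGGA_API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/disfarmer_portrait.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, e.g. "sketch 28.5".
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```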

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

room 100
scene 100
gallery 100
white 63.6
old 51
art 51
museum 50.5

Color Analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 99.5%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Happy 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 30-40
Gender Male, 95.6%
Calm 97.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.7%
Confused 0.6%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 9-17
Gender Female, 99.4%
Calm 98%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 1.6%
Confused 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 1-7
Gender Female, 99.9%
Angry 95.3%
Surprised 6.3%
Fear 5.9%
Calm 4.2%
Sad 2.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
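
The four AWS Rekognition face records above (age range, gender, and emotion confidences) correspond to a DetectFaces call with all attributes requested. A minimal sketch with boto3, again assuming a local copy of the image and an arbitrary region:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("disfarmer_portrait.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] is required to get age range, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

# Summarize each detected face the way the records above do.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Age {age['Low']}-{age['High']}, "
          f"{gender['Value']} {gender['Confidence']:.1f}%, "
          f"{top_emotion['Type'].title()} {top_emotion['Confidence']:.1f}%")
```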

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
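
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) instead of percentages. A minimal sketch of the corresponding face-detection call with the google-cloud-vision client, assuming application-default credentials and a local image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("disfarmer_portrait.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes likelihood enums such as VERY_UNLIKELY,
# matching the "Very unlikely" rows above.
for face in response.face_annotations:
    print(
        "Joy:", vision.Likelihood(face.joy_likelihood).name,
        "| Sorrow:", vision.Likelihood(face.sorrow_likelihood).name,
        "| Anger:", vision.Likelihood(face.anger_likelihood).name,
        "| Surprise:", vision.Likelihood(face.surprise_likelihood).name,
        "| Headwear:", vision.Likelihood(face.headwear_likelihood).name,
        "| Blurred:", vision.Likelihood(face.blurred_likelihood).name,
    )
```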

Feature analysis

Amazon

Person 98.4%

Categories

Imagga

paintings art 100%

Captions