Human Generated Data

Title

Unity (mylar/separation)

Date

1995

People

Artist: Louis Delsarte, American, 1944–2020

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Brandywine Workshop and Archives, Philadelphia, Pennsylvania, 2018.33.6.3

Copyright

© Estate of Louis Delsarte

Machine Generated Data

Tags

Amazon
created on 2020-02-04

Advertisement 99.8
Collage 99.8
Art 90.7
Person 85.3
Human 85.3
Poster 82.4
Modern Art 70.8
Painting 59.6
Text 57.6
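
The label/confidence pairs above match the shape of output from the AWS Rekognition DetectLabels operation. A minimal sketch of how such tags could be regenerated is below; the image file name, region, and thresholds are illustrative assumptions, not part of this record.

```python
import boto3

# Assumed file name and region; substitute the actual image of the print.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("unity_mylar_separation.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores on a 0-100 scale.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```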

Clarifai
created on 2020-02-04

people 98.5
paper 98.3
vintage 97.7
art 96.8
retro 96.4
wear 95.5
old 95.4
illustration 95
group 94
antique 93.4
print 93.4
desktop 91.3
texture 91.3
adult 90.9
page 89.7
sepia pigment 88.8
dirty 88.6
man 87.9
bill 87.8
war 87.3

Imagga
created on 2020-02-04

old 44.6
stone 40.8
memorial 39.1
ancient 33.8
architecture 33.7
gravestone 33.5
wall 31.4
texture 29.2
structure 29.1
grunge 29
pattern 21.9
textured 21.1
rough 21
building 21
dirty 20.8
history 20.6
art 20.5
antique 20
aged 19.9
weathered 18.1
detail 17.7
material 17
travel 16.9
surface 16.8
grungy 16.1
monument 15.9
vintage 15.7
culture 15.4
tourism 14.9
historic 14.7
sculpture 14.2
brown 14
construction 13.7
paint 13.6
carving 13.4
wood 13.4
door 13.3
city 13.3
temple 13.3
black 13.2
exterior 12.9
close 12.6
design 12.4
decoration 12.4
wooden 12.3
historical 12.2
urban 12.2
famous 12.1
cemetery 12
brick 11.9
religion 11.7
landmark 10.8
retro 10.7
rust 10.6
window 10.5
damaged 10.5
device 10.5
rusty 10.5
metal 10.5
artistic 10.4
padlock 10.1
lock 10
gray 9.9
statue 9.8
rock 9.6
color 9.5
house 9.4
brass 8.9
century 8.8
stucco 8.8
graffito 8.8
ruins 8.8
messy 8.7
entrance 8.7
obsolete 8.6
column 8.6
worn 8.6
frame 8.3
fastener 8.3
street 8.3
border 8.1
cement 7.9
support 7.9
abandoned 7.8
past 7.7
decay 7.7
traditional 7.5
place 7.5
town 7.4

Google
created on 2020-02-04

Microsoft
created on 2020-02-04

text 97.9
drawing 96
old 95.5
sketch 91.1
painting 76
black 68.9
vintage 32.5

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 74.1%
Angry 5.3%
Sad 8.4%
Confused 4.1%
Fear 36.8%
Calm 21.9%
Happy 11.8%
Surprised 9%
Disgusted 2.6%

AWS Rekognition

Age 17-29
Gender Female, 50.2%
Calm 49.6%
Angry 50.2%
Fear 49.5%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Sad 49.7%
Confused 49.5%

AWS Rekognition

Age 21-33
Gender Male, 50.1%
Surprised 49.5%
Disgusted 49.5%
Fear 49.5%
Sad 49.8%
Happy 49.6%
Calm 50%
Angry 49.5%
Confused 49.5%
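
The three face records above follow the shape of AWS Rekognition DetectFaces output (age range, gender with confidence, and per-emotion scores). A minimal sketch, assuming the same illustrative image file and region as above:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("unity_mylar_separation.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion scores to each face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```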

Feature analysis

Amazon

Person 85.3%
Poster 82.4%
Painting 59.6%

Captions

Microsoft

a vintage photo of a person 66.9%
an old photo of a person 66.8%
old photo of a person 66.7%
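
The ranked captions above have the shape of an Azure Computer Vision Describe Image response. A hedged sketch against the v3.2 REST endpoint follows; the resource endpoint and key are placeholders, not values from this record.

```python
import requests

# Placeholder endpoint and key; substitute a real Computer Vision resource.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<subscription-key>"

with open("unity_mylar_separation.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Each caption carries a confidence in [0, 1]; the record above shows percentages.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```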

Text analysis

Amazon

Pawtms
Bhe
belma
Pawtms 0sm (ongredD
froen
D belma
0sm (ongredD
3n9
D
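
The strings above read like raw OCR of the painted lettering in the print, in the shape of AWS Rekognition DetectText output (line detections followed by their component words). A minimal sketch, assuming the same illustrative image file as above:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("unity_mylar_separation.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# DetectText returns LINE entries and the WORD entries that make them up.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```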

Google

Blue Pantime frocem
Blue
Pantime
frocem