Human Generated Data

Title

Untitled (woman seated under a flowering tree by a body of water)

Date

c. 1910-1930

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Richard E. Bennink, 2.2002.5079

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Person 99.1
Clothing 99
Dress 99
Person 97.8
Art 95.5
Painting 95.5
Formal Wear 94.7
Fashion 93.8
Gown 93.8
Computer Hardware 82.2
Electronics 82.2
Hardware 82.2
Monitor 82.2
Screen 82.2
Head 81.3
Outdoors 65.9
Plant 63.7
Tree 63.7
Collage 57.4
Vegetation 57.2
Window 56.6
Nature 55.6
Door 55.5
Herbal 55.3
Herbs 55.3
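
The Amazon tags above are label names with confidence scores (0-100) of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such tags could be regenerated with the boto3 client follows; the image path and region are illustrative assumptions, not part of the museum record.

import boto3

# Hypothetical local path to the digitized photograph (not part of the record).
IMAGE_PATH = "untitled_woman_seated.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the weakest tag listed above is about 55
    )

# Each label pairs a name with a confidence percentage, e.g. "Person 99.1".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")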

Clarifai
created on 2018-06-30

landscape 98.9
people 98.9
tree 98.2
mammal 97.3
adult 96.9
dog 96.5
window 95.6
water 95.2
painting 95.1
cat 94.2
park 94.2
weather 94
wear 93.8
bird 92.5
art 92.4
light 91.4
no person 91.3
two 91.3
one 91.2
rain 90.8

Imagga
created on 2018-06-30

window screen 100
screen 100
protective covering 100
covering 72.2
old 52.3
grunge 46.8
vintage 45.5
texture 44.5
frame 44.1
antique 37.2
retro 35.2
aged 32.6
material 32.1
dirty 28.9
empty 28.3
textured 28
border 28
wall 27.4
rough 27.3
pattern 26.7
blank 24.9
damaged 24.8
grungy 23.7
black 22.2
paper 22
design 21.4
rusty 21
space 20.9
ancient 20.8
art 19.5
surface 18.5
backdrop 18.1
weathered 18
messy 17.4
structure 17.4
chalkboard 16.7
board 16.3
old fashioned 16.2
blackboard 15.9
rust 15.4
stain 15.4
film 15.4
graphic 14.6
wallpaper 14.6
wood 14.2
chalk 13.6
brown 13.3
nobody 13.2
wooden 13.2
text 13.1
backgrounds 13
edge 12.5
concrete 12.4
decay 11.6
obsolete 11.5
fire screen 11.4
dark 10.9
decoration 10.9
gray 10.8
element 10.7
frames 10.7
detailed 10.6
digital 10.5
close 10.3
grain 10.1
highly 9.8
photographic 9.8
scratched 9.8
scratch 9.8
copy 9.7
stains 9.7
faded 9.7
crumpled 9.7
your 9.7
computer 9.6
parchment 9.6
spot 9.6
sheet 9.4
paint 9.1
school 9
style 8.9
color 8.9
layered 8.8
broad 8.8
grime 8.8
mottled 8.8
album 8.8
fracture 8.8
building 8.7
movie 8.7
crack 8.7
layer 8.7
education 8.7
aging 8.6
worn 8.6
textures 8.5
stone 8.4
note 8.3
historic 8.2
interior 8
scratches 7.9
designed 7.9
mess 7.9
crease 7.8
noise 7.8
tracery 7.8
succulent 7.8
announcement 7.8
spotted 7.7
collage 7.7
photograph 7.7
mask 7.7
great 7.7
dirt 7.6
age 7.6
photography 7.6
television 7.6
writing 7.5
striped 7.5
iron 7.5
closeup 7.4
object 7.3
book 7.3
metal 7.2
detail 7.2
history 7.2

Google
created on 2018-06-30

green 94.6
picture frame 71.4
tree 70.1
window 65.6
painting 63
glass 62.8
grass 53.3
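
The Google tags are label-detection scores of the sort produced by the Cloud Vision API. A minimal sketch, assuming the google-cloud-vision Python client and a local copy of the image (the filename is illustrative):

from google.cloud import vision

# Hypothetical local file name (not part of the museum record).
with open("untitled_woman_seated.jpg", "rb") as f:
    content = f.read()

client = vision.ImageAnnotatorClient()
response = client.label_detection(image=vision.Image(content=content))

# Scores come back as 0-1 floats; printed as percentages to match the list above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")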

Microsoft
created on 2018-06-30

window 98.8
tree 97.9
aquarium 61.4
day 14.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 67.4%
Sad 84.1%
Happy 28%
Calm 21.9%
Surprised 8.7%
Fear 6.6%
Angry 3.1%
Disgusted 1.7%
Confused 0.9%

AWS Rekognition

Age 14-22
Gender Male, 60.1%
Calm 86%
Happy 8.8%
Surprised 6.4%
Fear 6%
Sad 3.5%
Confused 0.7%
Disgusted 0.3%
Angry 0.3%
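
The two face records above (age range, gender, and per-emotion confidences) are the kind of output the AWS Rekognition DetectFaces API returns when all facial attributes are requested. A minimal sketch with boto3, again using an illustrative image path:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local path to the digitized photograph (not part of the record).
with open("untitled_woman_seated.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")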

Feature analysis

Amazon

Person 99.1%
Monitor 82.2%