Human Generated Data

Title

Eva and Topsy

Date

2000

People

Artist: David Levinthal, American, born 1949

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M24388

Machine Generated Data

Tags

Amazon
created on 2019-10-30

Indoors 98.3
Fireplace 98.3
Human 93.3
Person 93.3
Art 84.2
Hearth 83.1
Screen 76
Electronics 76
Painting 69.5
Monitor 62
Display 62
Furniture 61.7
Text 56.8
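The Amazon tags above are the shape of output produced by AWS Rekognition's DetectLabels operation: each label carries a name and a confidence score. A minimal sketch of flattening such a response into the name/confidence pairs listed here; the sample response dict is illustrative (a real call would go through `boto3.client("rekognition").detect_labels(...)`), and the threshold value is an assumption:

```python
# Illustrative sketch: flatten a Rekognition DetectLabels-style response
# into "Name Confidence" pairs like the tag list above. The sample
# response is hypothetical, not an actual API call.

sample_response = {
    "Labels": [
        {"Name": "Indoors", "Confidence": 98.3},
        {"Name": "Fireplace", "Confidence": 98.3},
        {"Name": "Human", "Confidence": 93.3},
    ]
}

def flatten_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs at or above min_confidence,
    sorted by descending confidence."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: -p[1])

for name, conf in flatten_labels(sample_response):
    print(f"{name} {conf}")
```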

Clarifai
created on 2019-10-30

picture frame 98.8
illustration 98.5
vector 97.4
design 96.7
art 96.6
silhouette 95.2
image 93.9
people 93.5
retro 92.7
desktop 92.4
painting 91.9
old 91.7
vintage 91.7
margin 91.6
graphic 90.9
decoration 90.8
woman 90.1
style 88.8
element 87.9
symbol 85.8

Imagga
created on 2019-10-30

blackboard 67.1
frame 33.7
chalkboard 31.4
black 29.2
blank 29.2
chalk 27.3
board 24.6
vintage 23.2
texture 20.8
education 19.9
paper 19.7
design 19.3
message 19.2
school 18.9
old 18.8
empty 18
grunge 17.9
note 17.5
card 17.2
container 17.1
drawing 16.5
business 16.4
text 15.7
retro 15.6
classroom 15.5
symbol 14.8
space 14.7
art 14.5
border 13.6
wooden 13.2
write 13.2
sign 12.8
aged 12.7
teach 12.7
notice 12.6
envelope 12.5
object 11.7
wood 11.7
pattern 11.6
learn 11.3
letter 11
lesson 10.7
billboard 10.7
idea 10.7
antique 10.5
copy 9.7
class 9.6
college 9.5
word 9.4
writing 9.4
study 9.3
communication 9.2
decorative 9.2
photograph 9
decoration 9
reminder 8.7
announcement 8.7
university 8.6
nobody 8.6
money 8.5
finance 8.4
element 8.3
gold 8.2
dirty 8.1
wallpaper 7.7
head 7.6
horizontal 7.5
film 7.5
technology 7.4
graphic 7.3
present 7.3
office 7.2

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

gallery 99.7
scene 99.4
room 99.3
art 99.1
drawing 98.2
sketch 96.9
illustration 80.2
painting 62.7
cartoon 60.3
human face 56
black and white 54
picture frame 32.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Female, 50.1%
Sad 50.6%
Disgusted 45%
Angry 45.3%
Fear 45.5%
Surprised 45.1%
Confused 45.1%
Happy 45.3%
Calm 47.9%
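The emotion scores above are the form returned by Rekognition's DetectFaces operation, where each detected face carries a list of emotion types with confidences; here "Sad" is the top-scoring emotion. A small sketch of selecting the dominant emotion from such a list; the dict layout mirrors the figures above, but is an assumption rather than a captured API response:

```python
# Illustrative sketch: pick the dominant emotion from a
# Rekognition DetectFaces-style emotion list. Values mirror the
# face-analysis figures above; the structure is an assumption.

emotions = [
    {"Type": "SAD", "Confidence": 50.6},
    {"Type": "DISGUSTED", "Confidence": 45.0},
    {"Type": "ANGRY", "Confidence": 45.3},
    {"Type": "FEAR", "Confidence": 45.5},
    {"Type": "SURPRISED", "Confidence": 45.1},
    {"Type": "CONFUSED", "Confidence": 45.1},
    {"Type": "HAPPY", "Confidence": 45.3},
    {"Type": "CALM", "Confidence": 47.9},
]

def dominant_emotion(emotion_list):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(emotion_list, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(emotions))
```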

Feature analysis

Amazon

Person 93.3%
Painting 69.5%

Categories

Imagga

paintings art 98.6%

Captions

Microsoft
created on 2019-10-30

a room with art on the wall 57%
a close up of a screen 39%