Human Generated Data

Title

Untitled

Date

2007

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.284

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Text 91.9
Skin 87.4
Symbol 85.9
Human 85.4
Person 85.4
Logo 84
Trademark 84
Arrow 71.2
People 71.2
Emblem 67.1
Person 62.5
Beverage 59.8
Drink 59.8
Alcohol 57.9
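The Amazon tags above are the kind of output AWS Rekognition's `detect_labels` API returns. As a minimal sketch, the snippet below formats a label response into the "Name score" lines shown; the response here is a hypothetical stub so the formatting logic runs without AWS credentials, and a real call would go through `boto3.client("rekognition").detect_labels(...)`.

```python
# Sketch: turn a Rekognition detect_labels-style response into the
# "Name score" lines listed above. The stub response is hypothetical;
# a live call needs AWS credentials and an image reference.

def format_labels(response, min_confidence=55.0):
    """Format labels at or above min_confidence as 'Name score' lines."""
    lines = []
    for label in response["Labels"]:
        if label["Confidence"] >= min_confidence:
            lines.append(f"{label['Name']} {label['Confidence']:.1f}")
    return lines

# Hypothetical stub mimicking the structure detect_labels returns.
stub_response = {
    "Labels": [
        {"Name": "Text", "Confidence": 91.9},
        {"Name": "Symbol", "Confidence": 85.9},
        {"Name": "Alcohol", "Confidence": 57.9},
    ]
}

for line in format_labels(stub_response):
    print(line)
```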

Clarifai
created on 2018-11-05

no person 98.4
retro 95.6
business 93
old 92.2
paper 91.7
art 91.2
technology 91.1
wood 91
antique 90.5
painting 90
ancient 87.6
still life 86.9
indoors 86.4
desktop 85.9
display 85.7
vintage 84.1
design 84
traditional 82.7
one 81.1
travel 80.5

Imagga
created on 2018-11-05

pen 100
fountain pen 100
writing implement 89
paper 19.6
office 18.5
business 18.2
close 17.1
finance 15.2
pencil 14.9
document 14.8
writing 14.1
ballpoint 13.9
black 13.2
education 13
work 11.8
wine 11.4
notebook 11.4
plan 11.3
write 11.3
note 11
bottle 10.6
success 10.5
school 9.9
old 9.7
sign 9
closeup 8.8
blank 8.6
gold 8.2
design 8.1
object 8.1
detail 8
light 8
financial 8
text 7.8
pole 7.7
golden 7.7
empty 7.7
ink 7.7
alcohol 7.7
money 7.6
drink 7.5
stock 7.5
study 7.5
technology 7.4
clothespin 7.3
letter 7.3
new 7.3
market 7.1
information 7.1
life 7
tool 7

Google
created on 2018-11-05

footwear 90.6
shoe 78.7
font 55.5

Microsoft
created on 2018-11-05

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 61.1%
Happy 1.8%
Confused 1.5%
Calm 11.6%
Surprised 0.8%
Disgusted 0.4%
Sad 82.6%
Angry 1.2%
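Figures like the age range, gender, and emotion scores above are the kind of per-face attributes AWS Rekognition's `detect_faces` API (with `Attributes=["ALL"]`) returns. A minimal sketch of summarizing one such face record follows; the record here is a hypothetical stub so the code runs without AWS credentials.

```python
# Sketch: summarize a Rekognition FaceDetail-style record into the
# "Field value" lines shown above. The stub record is hypothetical.

def summarize_face(face):
    """Format age range, gender, and emotion confidences as text lines."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    for emotion in face["Emotions"]:
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    return lines

# Hypothetical stub mimicking the structure detect_faces returns.
stub_face = {
    "AgeRange": {"Low": 35, "High": 52},
    "Gender": {"Value": "Female", "Confidence": 61.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 82.6},
        {"Type": "CALM", "Confidence": 11.6},
    ],
}

for line in summarize_face(stub_face):
    print(line)
```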

Feature analysis

Amazon

Person 85.4%

Captions

Microsoft

a close up of a screen 70.1%
a close up of a computer screen 58.4%
close up of a screen 58.3%

Text analysis

Amazon

3/7
G