Human Generated Data

Title

Untitled

Date

1975

People

Artist: Vincent Sharp (Canadian, born 1937)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.170

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Imagga
created on 2022-01-23

grand piano 100
piano 100
percussion instrument 100
stringed instrument 100
keyboard instrument 100
musical instrument 84.8
upright 42.4
music 37.9
keyboard 35.7
instrument 31.6
musical 27.8
key 25.2
black 24.6
keys 23.4
play 21.5
sound 21.5
classical 19.1
grand 16.5
entertainment 15.7
musician 15.6
equipment 14.7
playing 14.6
ivory 13.8
business 12.7
song 12.7
concert 12.6
home 12
chord 11.8
jazz 11.8
performance 11.5
education 11.3
office 11.2
light 10.7
old 10.4
art 10.4
classic 10.2
closeup 10.1
3d 10.1
melody 9.8
lesson 9.7
digital 9.7
audio 9.6
empty 9.4
nobody 9.3
professional 9.3
note 9.2
wood 9.2
hand 9.1
modern 9.1
furniture 9
pianist 8.9
ebony 8.9
interior 8.8
close 8.6
people 8.4
child 8.3
data 8.2
technology 8.2
indoors 7.9
performing 7.9
work 7.8
antique 7.8
table 7.8
stage 7.8
practice 7.7
learn 7.6
harmony 7.5
object 7.3
detail 7.2
board 7.2
male 7.1
job 7.1
wooden 7

Microsoft
created on 2022-01-23

text 96.3
music 95.9
piano 93.4
black and white 89.9
musical keyboard 85.8
black 71.2
shadow 68.7
electric organ 12.3

Feature analysis

Amazon

Piano 95.8%
Person 67.8%

Captions

Microsoft

a piano in a room 81.5%
a close up of a piano 80.1%
a person sitting on a piano 55.1%