Human Generated Data

Title

CUP

Date

17th century

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Mrs. Nicholas Brown, 1950.125.50

Machine Generated Data

Tags

Amazon
created on 2022-06-17

Milk 90.9
Drink 90.9
Beverage 90.9
Electronics 75.9
Blow Dryer 61.2
Appliance 61.2
Hair Drier 61.2
Dryer 61.2

Imagga
created on 2022-06-17

lens cap 93.3
cap 74.6
protective covering 56
covering 38.3
push button 26.3
close 24
black 20.4
business 19.4
equipment 17
closeup 14.8
focus 14.8
selective focus 14.7
nobody 14
clear 14
magnifier 13.9
antique 13.9
searching 13.7
target 13.6
through 13.6
selective 13.6
copy space 13.5
music 13.5
technology 13.4
direction 13.3
key 13.2
sound 13.1
metal 12.9
geographical location 12.9
national border 12.9
eyeglass 12.9
capital city 12.9
information medium 12.8
spectacle 12.8
topography 12.8
cartography 12.8
examining 12.8
blueprint 12.7
guide 12.7
exploration 12.6
horizontal 12.6
pointing 12.6
geography 12.5
journey 12.2
map 12.1
globe 12
earth 11.9
desktop globe 11.9
directly above 11.9
business travel 11.8
magnifying glass 11.8
travel destination 11.8
high angle 11.8
paper 11.8
device 11.7
speaker 11.6
button 11.5
vacations 11.3
world map 10.8
city 10.8
tourism 10.7
travel 10.6
audio 10.5
circle 10.4
computer 10.4
keyboard 10.3
stereo 9.9
round 8.6
system 8.6
control 8.5
modern 8.4
color 8.3
against 8.3
loud 7.8
bass 7.8
dutch oven 7.8
lock 7.7
lens 7.7
type 7.7
accessory 7.6
number 7.5
object 7.3
data 7.3
kitchen 7.2
loudspeaker 7.1
silver 7.1

Google
created on 2022-06-17

Microsoft
created on 2022-06-17

text 79.6
black and white 72.8
monochrome 60.6
circle 57.4

Feature analysis

Amazon

Milk 90.9%

Captions

Microsoft

a cup of coffee 28.9%
a close up of a cup 28.8%
close up of a cup 28.7%

Text analysis

Amazon

50

Google

1950.125.50 3O
3O
1950.125.50