Human Generated Data
Title
Untitled
Date
1994
People
Artist: David Levinthal, American, born 1949
Classification
Photographs
Credit Line
Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.278
Machine Generated Data
Tags
Amazon
created on 2019-04-10
Human 81.2%
Figurine 70.2%
Electronics 69%
Monitor 67.2%
Display 67.2%
Screen 67.2%
Phone 65.7%
Face 61.9%
Text 58.2%
Clothing 57.7%
Apparel 57.7%
Person 54%
Person 43.2%
Clarifai
created on 2018-11-05
blur 95.6%
wear 95.1%
people 94%
picture frame 93.7%
exhibition 93%
no person 92.3%
indoors 91.8%
museum 90.5%
margin 89.7%
one 89.4%
man 89.1%
woman 88.6%
desktop 88.4%
outdoors 84.1%
painting 84%
group 84%
portrait 83.4%
adult 83.4%
screen 83.4%
landscape 83.1%
Imagga
created on 2018-11-05
glass 15.1%
device 14.9%
celebration 13.5%
bottle 12.7%
black 12.6%
light 12%
person 11.9%
people 11.7%
holiday 11.5%
color 11.1%
wine 10.4%
close 10.3%
hand 9.9%
lifestyle 9.4%
dark 9.2%
drink 9.2%
detail 8.8%
luxury 8.6%
design 8.6%
restaurant 8.5%
fire 8.4%
adult 8.4%
alcohol 8.1%
closeup 8.1%
interior 8%
love 7.9%
art 7.9%
equipment 7.6%
hot 7.5%
flame 7.5%
rose 7.5%
indoor 7.3%
celebrate 7.2%
home 7.2%
face 7.1%
Google
created on 2018-11-05
picture frame 68.6%
art 66.5%
Microsoft
created on 2018-11-05
monitor 99.7%
electronics 99.3%
indoor 96.9%
display 96.7%
television 91.7%
screen 91.4%
computer 91.3%
desk 75.2%
screenshot 50.3%
flat 36.6%
picture frame 25.5%
entertainment center 12.3%
Color Analysis
Feature analysis
Amazon
Monitor 67.2%
Person 54%
Categories
Imagga
macro flowers 38.4%
pets animals 25.8%
food drinks 14.7%
events parties 7.2%
text visuals 4.9%
paintings art 3.8%
interior objects 2.7%
Captions
Microsoft
created on 2018-11-05
a flat screen tv sitting on top of a television 67.6%
a flat screen tv sitting in front of a television 67.5%
a flat screen television 67.4%
Text analysis
Amazon
O
AP
194 AP
194