Human Generated Data
Title
Untitled (Dr. Herman M. Juergens, driving car; walking on sidewalk)
Date
1965-1968
People
Artist: Gordon W. Gahan, American 1945 - 1984
Classification
Photographs
Credit Line
Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.484
Machine Generated Data
Tags
Amazon
created on 2023-10-25
Art (98.4%)
Collage (98.4%)
Adult (97.8%)
Male (97.8%)
Man (97.8%)
Person (97.8%)
Person (97.4%)
Chart (94%)
Plot (94%)
Person (85.7%)
Adult (83.5%)
Person (83.5%)
Female (83.5%)
Woman (83.5%)
Home Decor (81.8%)
Linen (81.8%)
Face (72.8%)
Head (72.8%)
Diagram (56.3%)
Plan (56.3%)
Indoors (56%)
Closet (55.8%)
Cupboard (55.8%)
Furniture (55.8%)
Page (55.6%)
Text (55.6%)
Shelf (55.4%)
Clarifai
created on 2018-10-06
man (96.9%)
people (96.7%)
vertical (91.5%)
illustration (91.2%)
art (90.9%)
woman (90.6%)
adult (90%)
old (89.9%)
desktop (89.7%)
no person (87.7%)
retro (86.5%)
one (86.4%)
vector (85.8%)
vintage (83.9%)
monochrome (81.4%)
design (81.1%)
wear (80.9%)
picture frame (80.5%)
paper (79.8%)
bill (78.7%)
Imagga
created on 2018-10-06
film (23.6%)
negative (17.3%)
black (15.3%)
clothing (14.1%)
education (13.8%)
photographic paper (13.8%)
business (13.3%)
adult (12.9%)
man (12.8%)
design (11.8%)
cleaver (11.7%)
object (11%)
work (11%)
paper (10.4%)
symbol (10.1%)
book (10.1%)
people (10%)
male (9.9%)
sleeve (9.8%)
success (9.6%)
pad (9.4%)
empty (9.4%)
men (9.4%)
shirt (9.4%)
photographic equipment (9.3%)
knife (9.1%)
person (9%)
blank (8.7%)
texture (8.3%)
template (8.2%)
holding (8.2%)
edge tool (8.2%)
yellow (7.9%)
smile (7.8%)
old (7.7%)
tool (7.6%)
hand (7.6%)
outline (7.6%)
garment (7.4%)
safety (7.4%)
cap (7.3%)
student (7.2%)
office (7.2%)
school (7.2%)
worker (7.1%)
Google
created on 2018-10-06
black (95.2%)
black and white (93.3%)
monochrome photography (83%)
photography (82.8%)
monochrome (66.5%)
shelving (61.3%)
shelf (53.8%)
Feature analysis
Amazon
Adult (97.8%)
Male (97.8%)
Man (97.8%)
Person (97.8%)
Female (83.5%)
Woman (83.5%)
Categories
Imagga
interior objects (50.4%)
paintings art (41.8%)
streetview architecture (7%)
pets animals (0.5%)
nature landscape (0.1%)
text visuals (0.1%)
food drinks (0.1%)
Captions
Microsoft
created on 2018-10-06
a person in a white room (47.1%)
a person standing next to a fireplace (28.7%)
a person standing in a room (28.6%)
Text analysis
Amazon
29, 28, 30, KODAK, SAFETY, SAFETY FILM, FILM, 27, PAN, PAN FILM, H, J H, J, 26A, 27A, 28A, 1, 25A, 25 1, 25, 201, CAFE, SC 1, x, KODAK TAXI x, SC, TAXI
Google
KODA H25 26A -> 27A → 28 -28A → 29 →30