Human Generated Data
Title
Untitled (Susanna Shahn)
Date
Fall 1936
People
Artist: Ben Shahn, American 1898 - 1969
Classification
Photographs
Credit Line
Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4372.6
Machine Generated Data
Tags
Amazon (created on 2023-10-07)
Baby 98.7%, Person 98.7%, Furniture 97.1%, Face 93.2%, Head 93.2%, Bed 81.1%, Crib 71.2%, Infant Bed 71.2%, Body Part 60.7%, Finger 60.7%, Hand 60.7%, Cradle 56.4%, Funeral 55.8%, Photography 55.4%, Portrait 55.4%
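The label-and-confidence pairs above are the kind of output an image-labeling API returns for a single photograph. As a minimal sketch only (the museum's actual tagging pipeline is not documented here), assuming AWS Rekognition's DetectLabels called through boto3, with a hypothetical local file name and region:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Read the photograph as raw bytes (file name is hypothetical).
with open("P1970_4372_6.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels above a 50% confidence floor.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

# Print each label with its confidence, matching the "Baby 98.7%" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")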
Clarifai (created on 2018-05-09)
people 99.5%, one 98.3%, adult 94.8%, man 93.1%, vehicle 91.4%, offense 90%, wear 89.6%, monochrome 86.2%, child 83.8%, portrait 82.8%, two 80.9%, indoors 79.9%, music 77.5%, war 77.3%, woman 76.5%, street 75.9%, furniture 75.3%, room 75%, transportation system 73.6%, retro 71.7%
Imagga (created on 2023-10-07)
dishwasher 27.8%, white goods 23.6%, home appliance 23%, technology 21.5%, equipment 21.4%, device 18.3%, car 18.1%, appliance 17.7%, television 17.6%, computer 16.1%, business 15.8%, black 15%, vehicle 14.8%, monitor 14.6%, metal 13.7%, object 13.2%, money 12.7%, electronic equipment 12.4%, finance 11%, transportation 10.7%, digital 10.5%, old 10.4%, close 10.3%, power 10.1%, currency 9.9%, steel 9.7%, laptop 9.6%, savings 9.3%, dollar 9.3%, cash 9.1%, transport 9.1%, bank 8.9%, freight car 8.8%, media 8.6%, box 8.5%, banking 8.3%, plastic 8.3%, speed 8.2%, data 8.2%, music 8.1%, wealth 8.1%, man 8.1%, drive 8%, machine 8%, work 7.8%, travel 7.7%, film 7.7%, automobile 7.7%, durables 7.6%, storage 7.6%, electronic 7.5%, retro 7.4%, entertainment 7.4%, telecommunication system 7.3%, container 7.3%, open 7.2%, financial 7.1%, working 7.1%, paper 7%
Google (created on 2018-05-09)
black 95.5%, photograph 95.3%, black and white 94.7%, monochrome photography 89.9%, photography 83.1%, monochrome 75.9%, product design 63.8%, still life photography 55.4%, font 51.2%, product 50.8%
Microsoft (created on 2018-05-09)
appliance 58%, kitchen appliance 16.4%, oven 15.3%
Color Analysis
Face analysis
Amazon (AWS Rekognition)
Age: 10-18
Gender: Female (69%)
Emotions: Calm 97.3%, Surprised 6.3%, Fear 5.9%, Sad 2.8%, Angry 0.3%, Confused 0.2%, Disgusted 0.2%, Happy 0.1%
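The age range, gender, and emotion scores above follow the shape of AWS Rekognition's DetectFaces output. A minimal sketch under that assumption (client setup and file name are hypothetical, as in the earlier sketch):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("P1970_4372_6.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, emotions, and other facial attributes.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age: {age['Low']}-{age['High']}")
    print(f"Gender: {gender['Value']} ({gender['Confidence']:.0f}%)")
    # Emotions come back unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")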
Feature analysis
Amazon
Baby 98.7%, Person 98.7%, Crib 71.2%
Categories
Imagga: cars vehicles 57.2%, interior objects 38.6%, food drinks 2%
Captions
Microsoft (created on 2018-05-09)
"a black and white photo of a person" (34.6%), "a black and white photo of an oven" (27.1%)
Text analysis
Amazon: "President and Fellows of Harvard College (Harvard University Art Museums)", "P1970.4372.0006"
Google: "@ President and Fellows of Harvard College (Harvard University Art Museums) P1970.4372.0006"
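The detected credit-line text above is typical OCR output, returned as line-level hits built up from individual words. A minimal sketch, again assuming AWS Rekognition (DetectText) with a hypothetical file name, printing only the line-level detections:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("P1970_4372_6.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; keep the lines only.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")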