Human Generated Data

Title

Untitled (model window, open)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19046

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 88.8
Aluminium 64.6
Window 62.1
Appliance 59.3
Dishwasher 59.3

Imagga
created on 2022-03-05

piano 100
grand piano 100
keyboard instrument 92.4
stringed instrument 91.3
percussion instrument 89.1
musical instrument 60.6
technology 29.7
laptop 27.5
business 26.1
computer 24.1
electronic 23.3
screen 21.9
modern 21
notebook 19.8
keyboard 19.7
equipment 18.7
work 18
communication 17.6
office 16.9
information 15
digital 14.6
black 13.2
object 13.2
car 13.1
reflection 13
data 12.8
silver 12.4
mobile 12.2
monitor 11.7
design 11.2
network 11.1
open 10.8
transportation 10.7
metal 10.5
device 10.3
travel 9.9
interior 9.7
portable 9.7
building 9.5
construction 9.4
industry 9.4
tool 9.1
home 8.8
light 8.7
education 8.7
display 8.4
futuristic 8.1
success 8
paper 7.8
architecture 7.8
color 7.8
3d 7.7
corporate 7.7
blank 7.7
automobile 7.7
system 7.6
finance 7.6
vehicle 7.5
key 7.5
close 7.4
transport 7.3
structure 7.2
table 7.1
steel 7.1

Microsoft
created on 2022-03-05

black and white 97.9
monochrome 93.5
indoor 88.9
text 88.1

Text analysis

Amazon

oor

Google

Oor MJI3-YT 33A°2-XAGON
33A°2-XAGON
Oor
MJI3-YT