Human Generated Data

Title

Untitled (large machinery)

Date

1940s

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2102

Machine Generated Data

Tags (each label is followed by its confidence score, 0-100)

Amazon
created on 2021-12-14

Building 99
Factory 98
Wheel 82.5
Machine 82.5
Workshop 74.1
Assembly Line 69.8
Nature 60.1
Manufacturing 58.1
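
The Amazon labels above are machine-generated image tags with confidence scores. A minimal sketch of how comparable labels can be requested from Amazon Rekognition via boto3 follows; the filename, region, and thresholds are placeholder assumptions, not details recorded in this entry.

    import boto3

    # Placeholder region and local file; Rekognition also accepts images stored in S3.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_large_machinery.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Each label carries a name and a confidence percentage, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")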

Imagga
created on 2021-12-14

modern 30.8
architecture 28.1
supermarket 25.7
interior 25.6
business 23.1
design 22.5
3d 21.7
construction 21.4
technology 20.8
grocery store 19.3
office 18.6
building 17.5
sketch 16.7
drawing 16.5
plan 16.1
house 15.9
glass 15.9
urban 15.7
digital 15.4
city 15
marketplace 14.4
negative 14.3
mercantile establishment 14.2
engineering 13.3
architect 12.5
hall 12.5
film 12
light 12
room 12
industry 11.9
development 11.8
structure 11.6
science 11.6
project 11.5
perspective 11.3
inside 11
container 11
futuristic 10.8
blueprint 10.8
floor 10.2
window 10.2
reflection 10.1
equipment 10
wagon 9.8
metal 9.6
apartment 9.6
web 9.3
photographic paper 9.3
industrial 9.1
art 8.9
computer 8.8
empty 8.7
architectural 8.6
corporate 8.6
shop 8.6
roof 8.6
wall 8.5
wheeled vehicle 8.4
power 8.4
data 8.2
global 8.2
new 8.1
home 8.1
work 8
medical 7.9
place of business 7.9
space 7.8
table 7.7
residential 7.7
research 7.6
skyline 7.6
buildings 7.6
sign 7.5
network 7.4
beaker 7.4
lines 7.2
paper 7.2
idea 7.1
frame 7
indoors 7
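
The Imagga tags above come from an automatic image-tagging service. Below is a minimal sketch of a request against Imagga's public v2 tagging endpoint, assuming placeholder credentials and an image URL; the actual request that produced this entry is not documented here.

    import requests

    # Placeholder API credentials and image URL.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.com/untitled_large_machinery.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP Basic auth with the key/secret pair
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a confidence score (0-100).
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")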

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 95
drawing 82.4
sketch 81.5
appliance 78.7
black and white 74.7
white 68.1
white goods 56.6
building 56.3
cluttered 12.6
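
The Microsoft tags here, and the caption listed further below, are typical outputs of Azure Computer Vision image analysis. A minimal sketch using the Python SDK is shown with a placeholder endpoint, key, and image URL; the SDK reports confidences on a 0-1 scale, so they are multiplied by 100 to match the figures above.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder Azure resource endpoint and subscription key.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    IMAGE_URL = "https://example.com/untitled_large_machinery.jpg"
    analysis = client.analyze_image(
        IMAGE_URL,
        visual_features=[VisualFeatureTypes.tags, VisualFeatureTypes.description],
    )

    # Tags with confidence scores, comparable to the list above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")

    # The description feature also yields short captions like the one under "Captions".
    for caption in analysis.description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")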

Feature analysis

Amazon

Wheel 82.5%

Captions

Microsoft

a group of people in a room 55.6%

Text analysis

Amazon

FENN
K
123RF
2019
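
The Amazon text analysis above lists strings detected in the image. A minimal sketch of a comparable call to Amazon Rekognition's text detection follows, again with a placeholder filename; it is illustrative only.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_large_machinery.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Detections are returned as whole lines and individual words, each with a confidence.
    for detection in response["TextDetections"]:
        print(f"{detection['Type']}: {detection['DetectedText']} ({detection['Confidence']:.1f})")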