Human Generated Data

Title

Untitled (close-up view of machinery)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17773

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Machine 85.7
Building 74.1
Car 61.9
Vehicle 61.9
Automobile 61.9
Transportation 61.9
Lathe 55.2

Imagga
created on 2022-02-26

equipment 54.4
device 38.7
metal 28.1
electronic equipment 27.5
technology 27.4
radio receiver 24.8
receiver 23.6
projector 23
old 22.3
digital 19.4
computer 18.4
electrical 18.2
antique 16.4
black 16.2
set 16.1
radio 15.7
machine 15.4
industry 14.5
film 14.2
lock 14.2
apparatus 14.1
retro 13.9
data 13.7
sequencer 13.4
camera 13.3
steel 13.3
amplifier 13.2
object 13.2
board 12.6
video 12.6
vintage 12.4
electric 12.2
music 11.7
system 11.4
power 10.9
close 10.8
electronics 10.4
key 10.4
sound 10.3
electronic 10.3
display 10.2
safe 10.2
safety 10.1
security 10.1
box 9.7
lens 9.7
media 9.5
door 9.5
optical instrument 9.4
light 9.3
dial 9.3
business 9.1
control 9
instrument 9
silver 8.8
hardware 8.6
musical 8.6
engineering 8.6
storage 8.6
design 8.4
modern 8.4
part 8.3
entertainment 8.3
room 8.2
protection 8.2
open 8.1
button 7.9
chip 7.9
secure 7.7
card 7.6
audio 7.6
recorder 7.6
communication 7.6
mechanism 7.5
closeup 7.4
industrial 7.3
optical device 7.2
science 7.1
information 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.4
electronics 75.2
black and white 74.2
clock 64.4
telephone 51.2
watch 50.1

Feature analysis

Amazon

Car 61.9%

Captions

Microsoft

an old photo of a person 45.1%

Text analysis

Amazon

20
801

Google

20 801 YT37A°2- XAGON
YT37A°2-
20
801
XAGON