From: tmaddox@netcom.com (Tom Maddox)
Newsgroups: alt.cyberpunk
Subject: After the Deluge (an essay on cyberpunk)
Date: 13 Jul 92 09:42:14 GMT
Organization: Netcom - Online Communication Services (408 241-9760 guest)
Lines: 262
(The following essay was printed in the volume _Thinking Robots,
an Aware Internet, and Cyberpunk Librarians_, edited by R. Bruce Miller and
Milton T. Wolf, distributed at the Library and Information Technology
Association meeting in San Francisco, during the 1992 American Library
Association Conference. An expanded version of the volume will be published
later this year.)
After the Deluge: Cyberpunk in the '80s and '90s
Tom Maddox
In the mid-'80s cyberpunk emerged as a new way of
doing science fiction in both literature and film. The
primary book was William Gibson's _Neuromancer_; the
most important film, _Blade Runner_. Both featured a
hard-boiled style, were intensely sensuous in their
rendering of detail, and engaged technology in a manner
unusual in science fiction: neither technophiliac (like
so much of "Golden Age" sf) nor technophobic (like the
sf "New Wave"), cyberpunk did not so much embrace
technology as go along for the ride.
However, this was just the beginning: during the '80s
cyberpunk _spawned_, and in a very contemporary mode.
It was cloned; it underwent mutations; it was the
subject of various experiments in recombining its
semiotic DNA. If you were hip in the '80s, you at least
heard about cyberpunk, and if in addition you were even
marginally literate, you knew about Gibson.
To understand how this odd process came about, we have
to look more closely at cyberpunk's beginnings--more
particularly, at the technological and cultural context.
At the same time, I want to acknowledge what seems to me
an essential principle: when we define or describe a
literary or artistic style, we are suddenly in contested
territory, where no one owns the truth. This principle
applies with special force to the style (if it is a
style) or movement (if it is a movement) called
cyberpunk, which has been the occasion for an
extraordinary number of debates, polemics, and fights
for critical and literary terrain. So let me remind you
that I am speaking from my own premises, interests, even
prejudices.
By 1984, the year of _Neuromancer_'s publication,
personal computers were starting to appear on desks all
over the country; computerized videogames had become
commonplace; networks of larger computers, mainframes
and minis, were becoming more extensive and accessible
to people in universities and corporations; computer
graphics and sound were getting interesting; huge stores
of information had gone online; and some hackers were
changing from nerds to sinister system crackers. And of
course the rate of technological change continued to be
rapid--which in the world of computers has meant better
and cheaper equipment available all the time. So
computers became at once invisible, as they disappeared
into carburetors, toasters, televisions, and wrist
watches; and ubiquitous, as they became an essential part
first of business and the professions, then of personal
life.
Meanwhile the global media circus, well underway for
decades, continued apace, quite often feeding off the
products of the computer revolution, or at least
celebrating them. The boundaries between entertainment
and politics, or between the simulated and the real,
first became more permeable and then--at least according
to some theorists of these events--collapsed entirely.
Whether we were ready or not, the postmodern age was
upon us.
In the literary ghetto known as science fiction,
things were not exactly moribund, but sf certainly was
ready for some new and interesting trend. Like all
forms of popular culture, sf thrives on labels, trends,
and combinations of them--labeled trends and trendy
labels. Marketers need all these like a vampire needs
blood.
This was the context in which _Neuromancer_ emerged.
Anyone who was watching the field carefully had already
noticed stories such as "Johnny Mnemonic" and "Burning
Chrome," and some of us thought that Gibson was writing
the most exciting new work in the field, but no one--
least of all Gibson himself--was ready for what happened
next. _Neuromancer_ won the Hugo, the Nebula, the
Philip K. Dick Award, Australia's Ditmar; it contributed
a central concept to the emerging computer culture
("cyberspace"); it defined an emerging literary style,
cyberpunk; and it made that new literary style famous,
and (remarkably, given that we're talking about science
fiction here) even hip.
Also, as I've said, there was the film _Blade Runner_,
Ridley Scott's unlikely adaptation of Philip K. Dick's
_Do Androids Dream of Electric Sheep?_ The film didn't
have the success _Neuromancer_ did; in fact, I heard its
producer remark wryly when the film was given the Hugo
that perhaps someone would now go to see it. Despite
this, _Blade Runner_ and _Neuromancer_ together
set the boundary conditions for emerging cyberpunk: a
hard-boiled combination of high tech and low life. As
the famous Gibson phrase puts it, "The street has its
own uses for technology." So compelling were these two
narratives that many people then and now refuse to
regard as cyberpunk anything stylistically and
thematically different from them.
Meanwhile, down in Texas a writer named Bruce Sterling
had been publishing a fanzine (a rigorously postmodern
medium) called _Cheap Truth_; all articles were written
under pseudonyms, and taken together, they amounted to a
series of guerrilla raids on sf. Accuracy of aim and
incisiveness varied, of course; these raids were
polemical, occasional, essentially temperamental.
Altogether, _Cheap Truth_ stirred up some action, riled
some people, made others aware of each other.
Gibson and Sterling were already friends, and other
writers were becoming acquainted with one or both: Lew
Shiner, Sterling's right-hand on _Cheap Truth_ under the
name "Sue Denim," Rudy Rucker, John Shirley, Pat
Cadigan, Richard Kadrey, others, me included. Some
became friends, and at the very least, everyone became
aware of everyone else.

The question of a central controller has not been raised. This is
because one is not necessary in an object-centred view. Each object
must be capable of taking care of itself.
Object Definition
=========================================================================
An object is defined as being comprised of a number of processes. Each
process sends messages to other processes. Inter-object links are simply
extensions of inter-process links that occur between objects.
The grouping of certain processes into an object is really arbitrary, and
one could supposedly do away with the idea of objects entirely (which in
fact don't really exist outside of the definition that they are composed of
a number of processes) except that they are a convenient method of grouping
processes into functional units.
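To make the grouping concrete, here is a minimal sketch of mine, with
invented names such as vr_process and vr_object: an object is little
more than a table of the processes it happens to contain and the links
they use.

    /* Sketch only: an object is just a convenient grouping of
       processes; the processes do the actual work. */
    #include <stdio.h>
    #include <stddef.h>

    typedef struct vr_process {
        int         id;         /* identifier of the underlying task */
        const char *language;   /* language the task is written in   */
        const int  *links;      /* ids of the processes it exchanges
                                   messages with                     */
        size_t      n_links;
    } vr_process;

    typedef struct vr_object {
        const char       *name;     /* arbitrary label for the group */
        const vr_process *members;
        size_t            n_members;
    } vr_object;

    int main(void)
    {
        /* two processes grouped, arbitrarily, into one object; a link
           from process 1 to a process in another grouping would be an
           "inter-object" link, i.e. an ordinary inter-process link
           that happens to cross the boundary */
        static const int links1[] = { 2, 3 };
        static const vr_process procs[] = {
            { 1, "C", links1, 2 },
            { 2, "C", NULL,   0 },
        };
        vr_object door = { "door", procs, 2 };
        printf("object '%s' groups %zu processes\n",
               door.name, door.n_members);
        return 0;
    }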
Process Definition
=========================================================================
A process is basically a task, executing in some named language. In fact, a
task could be defined in any language as long as a few criteria are met:
1. Each process is capable of establishing communication links
with other processes.
2. Each process is capable of starting another object/process.
3. Each process can be moved, mid-execution, onto another processor.
The first two requirements are simple. The last one is not, and to my
knowledge, has not been investigated before.
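Put as an interface, the three criteria might look something like the
declarations below. This is only a guess at a shape, not a
specification, and every name in it is invented.

    /* Sketch of the three required capabilities, independent of the
       language a process is written in.  Declarations only. */
    #include <stddef.h>

    typedef struct vr_proc vr_proc;   /* opaque handle to a process */

    typedef struct vr_proc_ops {
        /* 1. establish a communication link with another process */
        int (*link_to)(vr_proc *self, const char *peer);

        /* 2. start another object/process, passing initial data */
        int (*spawn)(vr_proc *self, const char *definition,
                     const void *init, size_t init_len);

        /* 3. capture execution state so the process can be resumed,
              mid-execution, on a different kind of processor */
        int (*checkpoint)(vr_proc *self, void *buf, size_t *buf_len);
        int (*resume)(vr_proc *self, const void *buf, size_t buf_len);
    } vr_proc_ops;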
Can a compiled language be shifted, mid-execution, to another processor?
And I don't necessarily mean a processor of the same type. I am talking
from an IBM to a SparcStation. In a multiprocessor environment, an object
may be copied, or moved, onto another processor. While copying the object
could involve re-compiling and restarting the process, moving it requires
that all internal structures/variables, and execution state be preserved.
Example: If each process was a Unix process running in C:
In this example, each process would be capable of establishing a
communication link with other processes via library functions and standard
unix calls. If the definition of a process extended to include the source
code, then a process could start another process by issuing a command to
compile it and start the process, passing it some initial information. It
would even be capable of copying and starting itself on another processor
but it would fail the last requirement of being able to be moved, while
running, to any other processor.
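A bare-bones version of that Unix reading might look like the
following. It is meant only to show requirements 1 and 2 being met
with standard calls (socketpair, fork); as noted above, it does
nothing for requirement 3.

    /* A parent establishes a message link to a child over a
       socketpair and "starts another process" with fork; this meets
       requirements 1 and 2, but the running state still cannot be
       moved to a different kind of processor. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <sys/wait.h>

    int main(void)
    {
        int link[2];                      /* the inter-process link */
        if (socketpair(AF_UNIX, SOCK_STREAM, 0, link) == -1) {
            perror("socketpair");
            return 1;
        }

        pid_t pid = fork();
        if (pid == 0) {                   /* the spawned process */
            close(link[0]);
            const char *msg = "hello from the spawned process";
            write(link[1], msg, strlen(msg) + 1);
            /* a real system would execve() a freshly compiled task
               here, handing it link[1] as its initial information */
            _exit(0);
        }

        close(link[1]);
        char buf[64];
        if (read(link[0], buf, sizeof buf) > 0)
            printf("parent received: %s\n", buf);
        waitpid(pid, NULL, 0);
        return 0;
    }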
This requirement of being moved also creates a few other problems, which
can be solved with a little work. First, during the move, a process will be
inactive, and incapable of responding to messages. A queue will have to be
set up to store these messages. But also, once the object has moved, then
messages passed to the original processor have to be passed on to the new
processor, and all other sending processes should be notified that the
process has been moved. It is the same situation as when someone moves
house, and something has to happen to their mail.
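One way to picture the mail-forwarding problem, as a sketch with
invented names rather than a worked design: the old node keeps a slot
for the process which is either local, in transit, or gone, and treats
incoming messages accordingly.

    /* Hypothetical sketch: while a process is being moved its
       messages are queued; once it has arrived, the old location
       only forwards them and tells senders the new address. */
    #include <stdio.h>

    enum proc_state { LOCAL, MOVING, MOVED };

    struct message { const char *body; };

    struct proc_slot {
        enum proc_state state;
        struct message  queue[16];   /* held while state == MOVING    */
        int             queued;
        int             new_node;    /* forwarding address once MOVED */
    };

    /* Deliver a message to a slot on the old node. */
    void deliver(struct proc_slot *slot, struct message m, int sender)
    {
        switch (slot->state) {
        case LOCAL:                  /* normal case: hand it over      */
            printf("delivered: %s\n", m.body);
            break;
        case MOVING:                 /* hold it until the move is done */
            if (slot->queued < 16)
                slot->queue[slot->queued++] = m;
            break;
        case MOVED:                  /* pass it on and notify sender   */
            printf("forwarding '%s' to node %d; telling sender %d\n",
                   m.body, slot->new_node, sender);
            break;
        }
    }

    int main(void)
    {
        struct proc_slot slot = { MOVED, {{0}}, 0, 7 };
        deliver(&slot, (struct message){ "position update" }, 3);
        return 0;
    }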
Why do we need this third requirement, which is making life so difficult?
In a distributed system, each object is comprised of many processes. To
distribute the load evenly, processes have to be shifted and moved among
the processors available to the network. This allocation is done
transparently to the individual processes, and separately from them, so that
distribution algorithms can be improved and site-specific parameters can be
taken into account.
If processes cannot be dynamically distributed amongst the multiple
processors available, then processes will get "stuck" on certain
processors, resulting in an uneven distribution and wasted processing time.
And what happens when a processing node has to be shut down? The object has
to be moved then.
Distributed Processing and Networking
=========================================================================
The fundamentals for networking are already in place, and no doubt
specialist networks will evolve for VR in time. Any networking details
will be taken for granted.
Likewise, since processes are defined as concurrent tasks, which
communicate in a very OCCAM like way, the basics for fine-grained
multiprocessing are in place. The complex tasks of process-processor
allocation and linking are left to the site/machine-dependent VR operating
system that individual sites are using. This area has already been
extensively researched, and an appropriate method is no doubt currently
sitting on someone's shelf waiting to be dusted off and put into place.
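As a single-machine illustration of that OCCAM-like style (my sketch,
using POSIX threads; nothing in it is specific to VR), a channel can
be a rendezvous: the sender blocks until the receiver has taken the
value.

    /* A single-slot, blocking channel: the sender waits until the
       receiver has accepted the value, roughly in the spirit of an
       OCCAM channel. */
    #include <pthread.h>
    #include <stdio.h>

    struct channel {
        pthread_mutex_t lock;
        pthread_cond_t  cond;
        int             value;
        int             full;        /* 1 while a value is waiting */
    };

    void chan_send(struct channel *c, int v)
    {
        pthread_mutex_lock(&c->lock);
        while (c->full)                    /* wait for an empty slot */
            pthread_cond_wait(&c->cond, &c->lock);
        c->value = v;
        c->full = 1;
        pthread_cond_broadcast(&c->cond);  /* wake the receiver      */
        while (c->full)                    /* rendezvous: wait until
                                              the value is taken     */
            pthread_cond_wait(&c->cond, &c->lock);
        pthread_mutex_unlock(&c->lock);
    }

    int chan_recv(struct channel *c)
    {
        pthread_mutex_lock(&c->lock);
        while (!c->full)
            pthread_cond_wait(&c->cond, &c->lock);
        int v = c->value;
        c->full = 0;
        pthread_cond_broadcast(&c->cond);  /* release the sender     */
        pthread_mutex_unlock(&c->lock);
        return v;
    }

    static struct channel ch = { PTHREAD_MUTEX_INITIALIZER,
                                 PTHREAD_COND_INITIALIZER, 0, 0 };

    static void *worker(void *arg)
    {
        (void)arg;
        chan_send(&ch, 42);                /* blocks until received  */
        return NULL;
    }

    int main(void)
    {
        pthread_t t;
        pthread_create(&t, NULL, worker, NULL);
        printf("received %d\n", chan_recv(&ch));
        pthread_join(t, NULL);
        return 0;
    }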
Languages
=========================================================================
Any language that the VR supports must be capable of performing the above
mentioned tasks. I can only guess at what such a VR language will look like.
It will probably not be a current language. The requirements are beyond the
specifications for any current language. The saving grace of the system is
that multiple languages can be defined and implemented, as long as they can
communicate, start other encapsulated processes, and be encapsulated. Of
course, if one machine cannot handle a process defined in a particular
language, then it will have to be run on another that can support it.
Eventually one or two languages should emerge.
As long as they adhere to the rules that the OS lays down, there should be
no problem.
Processes
=========================================================================
As mentioned earlier, processes are defined in some language, and must
perform a few basic tasks such as linking to other processes, starting new
processes, and being encapsulated for transmission.
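Encapsulation for transmission could be as simple as packing the
process's language, resume point, and serialized variables into a flat
buffer. The layout below is invented purely for illustration; a real
system would also have to fix byte order and word sizes.

    /* Hypothetical sketch of "encapsulation for transmission": the
       process's identity, language, and execution state are packed
       into a flat form so another node can reconstruct it. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    struct proc_image {
        char     language[16];    /* e.g. "C", or a future VR language */
        uint32_t program_counter; /* abstract resume point             */
        uint32_t state_len;       /* bytes of serialized variables     */
        unsigned char state[256];
    };

    /* Pack the image into a byte stream with fixed-width fields. */
    size_t encapsulate(const struct proc_image *p, unsigned char *out)
    {
        size_t off = 0;
        memcpy(out + off, p->language, sizeof p->language);
        off += sizeof p->language;
        memcpy(out + off, &p->program_counter, sizeof p->program_counter);
        off += sizeof p->program_counter;
        memcpy(out + off, &p->state_len, sizeof p->state_len);
        off += sizeof p->state_len;
        memcpy(out + off, p->state, p->state_len);
        return off + p->state_len;
    }

    int main(void)
    {
        struct proc_image p = { "C", 128, 4, { 1, 2, 3, 4 } };
        unsigned char wire[512];
        size_t n = encapsulate(&p, wire);
        printf("encapsulated %zu bytes for transmission\n", n);
        return 0;
    }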
Whatever language it is, it must be available on all VR machines that
are going to want to support the object.

And so long as the process remains problematic--
for instance, so long as it threatens to redefine us--
the voice will be heard.
--
Tom Maddox
tmaddox@netcom.com
"I swear I never heard the first shot"
Wm. Gibson, "Agrippa: a book of the dead"