www.t0.or.at/delanda/meshwork.htm

MESHWORKS, HIERARCHIES AND INTERFACES
by Manuel De Landa

The world of interface design is today undergoing dramatic
changes which in their impact promise to rival those brought about by the
use of the point-and-click graphical interfaces popularized by the Macintosh
in the early 1980's. The new concepts and metaphors which are aiming to
replace the familiar desk-top metaphor all revolve around the notion of
semi-autonomous, semi-intelligent software agents. To be sure, different
researchers and commercial companies have divergent conceptions of what
these agents should be capable of, and how they should interact with computer
users. But whether one aims to give these software creatures the ability
to learn about the user's habits, as in the non-commercial research performed at the MIT autonomous agents group, or to endow them with the ability to perform transactions in the user's name, as in the commercial products pioneered
by General Magic, the basic thrust seems to be in the direction of giving
software programs more autonomy in their decision-making capabilities.
For a philosopher there are several interesting issues involved in this
new interface paradigm. The first one has to do with the history of the
software infrastructure that has made this proliferation of agents possible.
From the point of view of the conceptual history of software, the creation
of worlds populated by semi-autonomous virtual creatures, as well as the
more familiar world of mice, windows and pull-down menus, have been made
possible by certain advances in programming language design. Specifically,
programming languages needed to be transformed from the rigid hierarchies
which they were for many years, to the more flexible and decentralized
structure which they gradually adopted as they became more "object-oriented".
One useful way to picture this transformation is as a migration of control
from a master program (which contains the general task to be performed)
to the software modules which perform all the individual tasks. Indeed,
to grasp just what is at stake in this dispersal of control, I find it
useful to view this change as a part of a larger migration of control from
the human body, to the hardware of the machine, then to the software, then
to the data and finally to the world outside the machine. Since this is
a crucial part of my argument let me develop it in some detail. The first
part of this migration, when control of machine-aided processes moved from
the human body to the hardware, may be said to have taken place in the
eighteenth century when a series of inventors and builders of automata
created the elements which later came together in the famous Jacquard loom,
a machine which automated some of the tasks involved in weaving patterns
in textiles. Jacquard's loom used a primitive form of software, in which
holes punched into cards coded for some of the operations behind the creation
of patterned designs. {1} This software, however, contained only data and
not control structures. In other words, all that was coded in the punched
cards was the patterns to be woven and not any directions to alter the
reading of the cards or the performance of the operations, such as the
lifting of the warp threads. Control of the process therefore resided in the machine's hardware component, which "read" the cards and translated the data into motion. Textile workers at the time were fully aware that they had lost some control to Jacquard's loom, and they manifested their outrage by destroying the machines on several occasions. The idea of coding
data into punched cards spread slowly during the 1800's, and by the beginning
of our century it had found its way into computing machinery, first the
tabulators used by Hollerith to process the 1890 United States census,
then into other tabulators and calculators. In all these cases control
remained embodied in the machine's hardware. One may go as far as saying
that even the first modern computer, the imaginary computer created by
Alan Turing in the 1930's still kept control in the hardware, the scanning
head of the Turing machine. The tape that his machine scanned held nothing
but data. But this abstract computer already had the seed of the next step,
since as Turing himself understood, the actions of the scanning head could
themselves be represented by a table of behavior, and the table itself
could now be coded into the tape. Even though people may not have realized
this at the time, coding both numbers and operations on numbers side by
side on the tape, was the beginning of computer software as we know it.
{2} When in the 1950's Turing created the notion of a subroutine, that
is, the notion that the tasks that a computer must perform can be embodied
into separate sub-programs all controlled by a master program residing
in the tape, the migration of control from hardware to software became
fully realized. From then on, computer hardware became an abstract mesh
of logical gates, its operations fully controlled by the software. The
next step in this migration took place when control of a given computational
process moved from the software to the very data that the software operates
on. For as long as computer languages such as FORTRAN or Pascal dominated
the computer industry, control remained hierarchically embedded in the
software. A master program would surrender control to a subroutine whenever
that sub-task needed to be performed, and the subroutine itself might
pass control to an even more basic subroutine. But the moment the specific
task was completed, control would move up the hierarchy until it reached
the master program again. Although this arrangement remained satisfactory
for many years, and indeed, many computer programs are still written that
way, more flexible schemes were needed for some specific, and at the time,
esoteric applications of computers, mostly in Artificial Intelligence.
Trying to build a robot using a hierarchy of subroutines meant that researchers
had to completely foresee all the tasks that a robot would need to do and
to centralize all decision-making into a master program. But this, of course,
would strongly limit the responsiveness of the robot to events occurring
in its surroundings, particularly if those events diverged from the predictions
made by the programmers. One solution to this was to decentralize control.
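The contrast is easy to sketch in code. The following toy example (written in Python here for concreteness; all names are hypothetical, and it is not any actual robot architecture) anticipates the decentralized scheme described next: in the first style a master program commands subroutines in a fixed, foreseen order, while in the second small autonomous modules watch a shared pool of data and fire only when a pattern they recognize appears.

```python
# Hypothetical sketch: centralized versus data-driven control (all names invented).

def avoid_obstacle(world):
    print("avoiding", world.get("obstacle"))

def move_forward(world):
    print("moving forward")

# Style 1: a master program commands subroutines in a fixed, foreseen order.
def master_program(world):
    if "obstacle" in world:      # every decision is taken here, at the top
        avoid_obstacle(world)
    move_forward(world)

# Style 2: each module owns a pattern; whichever patterns appear in the shared
# data decide which modules run. Control has migrated into the data itself.
modules = [
    {"pattern": "obstacle", "action": avoid_obstacle},
    {"pattern": "goal",     "action": lambda w: print("goal reached")},
]

def data_driven_loop(world):
    for module in modules:       # no master program: modules scan the data themselves
        if module["pattern"] in world:
            module["action"](world)

# If the dictionary fed to data_driven_loop comes from sensors, events outside
# the machine end up deciding what runs: the event-driven case described below.
master_program({"obstacle": "chair"})
data_driven_loop({"obstacle": "chair"})
```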
The basic tasks that a robot had to perform were still coded into programs,
but unlike subroutines these programs were not commanded into action by
a master program. Instead, these programs were given some autonomy and
the ability to scan the data base on their own. Whenever they found a specific
pattern in the data they would perform whatever task they were supposed
to do. In a very real sense, it was now the data itself that controlled
the process. And, more importantly, if the data base was connected to the
outside world via sensors, so that patterns of data reflected patterns
of events outside the robot, then the world itself was now controlling
the computational process, and it was this that gave the robot a degree
of responsiveness to its surroundings. Thus, machines went from being hardware-driven,
to being software-driven, then data-driven and finally event-driven.
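A toy event loop, not the actual Macintosh toolbox, makes this last step concrete; everything in the sketch below is a hypothetical simplification.

```python
# Hypothetical sketch of an event-driven program: handlers are registered for
# kinds of events, and the events themselves (faked here as a list) decide
# what runs and in what order.
handlers = {}

def on(event_type, handler):
    handlers[event_type] = handler

on("mouse_click", lambda e: print("clicked at", e["pos"]))
on("key_press",   lambda e: print("key pressed:", e["key"]))

incoming = [  # in a real interface these would arrive from the mouse and keyboard
    {"type": "mouse_click", "pos": (40, 12)},
    {"type": "key_press", "key": "a"},
]

for event in incoming:          # the event loop: control resides in the events
    handler = handlers.get(event["type"])
    if handler:
        handler(event)
```

Your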
typical Macintosh computer is indeed an event-driven machine even if the
class of real world events that it is responsive to is very limited, including
only events happening to the mouse (such as position changes and clicking)
as well as to other input devices. But regardless of the narrow class of
events that personal computers are responsive to, it is in these events
that much of the control of the processes now resides. Hence, behind the
innovative use of windows, icons, menus and the other familiar elements
of graphical interfaces, there is this deep conceptual shift in the location
of control which is embodied in object-oriented languages. Even the new
interface designs based on semi-autonomous agents were made possible by
this decentralization of control. Indeed, simplifying a little, we may
say that the new worlds of agents, whether those that inhabit computer
screens or more generally, those that inhabit any kind of virtual environment
(such as those used in Artificial Life), have been the result of pushing
the trend away from software command hierarchies ever further. The distinction
between centralized and decentralized control of a given process has come
to occupy center-stage in many different contemporary philosophies. It
will be useful to summarize some of these philosophical currents before
I continue my description of agent-based interfaces, since this will reveal
that the paradigm-shift is by no means confined to the area of software
design. Economist and Artificial Intelligence guru Herbert Simon views
bureaucracies and markets as the human institutions which best embody these
two conceptions of control.{3} Hierarchical institutions are the easiest
ones to analyze, since much of what happens within a bureaucracy is planned
by someone of higher rank, and the hierarchy as a whole has goals and behaves
in ways that are consistent with those goals. Markets, on the other hand,
are tricky. Indeed, the term "market" needs to be used with care because
it has been greatly abused over the last century by theorists on the left
and the right. As Simon remarks, the term does not refer to the world of
corporations, whether monopolies or oligopolies, since in these commercial
institutions decision-making is highly centralized, and prices are set
by command. I would indeed limit the sense of the term even more to refer
exclusively to those weekly gatherings of people at a predefined place in town, and not to a dispersed set of consumers catered to by a system of middlemen (as when one speaks of the "market" for personal computers).
The reason is that, as historian Fernand Braudel has made clear, it
is only in markets in the first sense that we have any idea of what the
dynamics of price formation are. In other words, it is only in peasant
and small town markets that decentralized decision-making leads to prices
setting themselves up in a way that we can understand. In any other type
of market economists simply assume that supply and demand connect to each
other in a functional way, but they do not give us any specific dynamics
through which this connection is effected. {4} Moreover, unlike the idealized
version of markets guided by an "invisible hand" to achieve an optimal
allocation of resources, real markets are not in any sense optimal. Indeed,
like most decentralized, self-organized structures, they are only viable,
and since they are not hierarchical they have no goals, and grow and develop
mostly by drift. {5} Herbert Simon's distinction between command hierarchies
and markets may turn out to be a special case of a more general dichotomy.
In the view of philosophers Gilles Deleuze and Felix Guattari, these more
abstract classes, which they call strata and self-consistent aggregates
(or trees and rhizomes), are defined not so much by the locus of control,
as by the nature of elements that are connected together. Strata are composed
of homogeneous elements, whereas self-consistent aggregates articulate heterogeneous
elements as such. {6} For example, a military hierarchy sorts people into
internally homogeneous ranks before joining them together through a chain
of command. Markets, on the other hand, allow for a set of heterogeneous
needs and offers to become articulated through the price mechanism, without
reducing this diversity. In biology, species are an example of strata,
particularly if selection pressures have operated without obstruction for long
periods of time allowing the homogenization of the species gene pool. On
the other hand, ecosystems are examples of self-consistent aggregates,
since they link together into complex food webs a wide variety of animals
and plants, without reducing their heterogeneity. I have developed this
theory in more detail elsewhere, but for our purposes here let's simply
keep the idea that besides centralization and decentralization of control,
what defines these two types of structure is the homogeneity or heterogeneity
of their component elements. Before returning to our discussion of agent-based
interfaces, there is one more point that needs to be stressed. As both
Simon and Deleuze and Guattari emphasize, the dichotomy between bureaucracies
and markets, or to use the terms that I prefer, between hierarchies and
meshworks, should be understood in purely relative terms. In the first
place, in reality it is hard to find pure cases of these two structures:
even the most goal-oriented organization will still show some drift in
its growth and development, and most markets even in small towns contain
some hierarchical elements, even if it is just the local wholesaler which
manipulates prices by dumping (or withdrawing) large amounts of a product
on (or from) the market. Moreover, hierarchies give rise to meshworks and
meshworks to hierarchies. Thus, when several bureaucracies coexist (governmental,
academic, ecclesiastic), and in the absence of a super-hierarchy to coordinate
their interactions, the whole set of institutions will tend to form a meshwork
of hierarchies, articulated mostly through local and temporary links. Similarly,
as local markets grow in size, as in those gigantic fairs which have taken
place periodically since the Middle Ages, they give rise to commercial
hierarchies, with a money market on top, a luxury goods market underneath
and, after several layers, a grain market at the bottom. A real society,
then, is made of complex and changing mixtures of these two types of structure,
and only in a few cases will it be easy to decide to what type a given
institution belongs. A similar point may be made about the worlds inhabited
by software agents. The Internet, to take the clearest example first, is
a meshwork which grew mostly by drift. No one planned either the extent
or the direction of its development, and indeed, no one is in charge of
it even today. The Internet, or rather its predecessor, the Arpanet, acquired
its decentralized structure because of the needs of U.S. military hierarchies
for a command and communications infrastructure which would be capable
of surviving a nuclear attack. As analysts from the Rand Corporation made clear, only if the routing of the messages was performed without the
need for a central computer could bottlenecks and delays be avoided, and
more importantly, could the meshwork put itself back together once a portion
of it had been vaporized in a nuclear strike. But in the Internet only the decision-making
behind routing is of the meshwork type. Decision-making regarding its two
main resources, computer (or CPU) time and memory, is still hierarchical.
Schemes to decentralize this aspect do exist, as in Drexler's Agoric Systems,
where the messages which flow through the meshwork have become autonomous
agents capable of trading among themselves both memory and CPU time. {7}
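The flavor of such a scheme can be conveyed with a toy model. The sketch below (Python; all names and numbers invented) is not Miller and Drexler's actual mechanism, only an illustration of the basic idea: tasks carry budgets and bid for each slice of processor time, so that the allocation of the resource emerges from the bids rather than from a central scheduler's plan.

```python
# Toy auction for CPU slices, loosely in the spirit of agoric schemes;
# not Miller and Drexler's actual algorithm, and all figures are invented.
tasks = [
    {"name": "mail_filter", "budget": 10.0, "bid": 1.0},
    {"name": "indexer",     "budget": 30.0, "bid": 2.5},
    {"name": "backup",      "budget": 5.0,  "bid": 0.5},
]

def run_slice(tasks):
    """Give the next slice of CPU time to the highest bidder that can still pay."""
    bidders = [t for t in tasks if t["budget"] >= t["bid"]]
    if not bidders:
        return None
    winner = max(bidders, key=lambda t: t["bid"])
    winner["budget"] -= winner["bid"]     # the winner pays its bid
    return winner["name"]

for _ in range(4):
    print(run_slice(tasks))               # the indexer wins while it outbids the others
```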
The creation by General Magic of its Telescript system, and of
agents able to perform transactions on behalf of users, is one of the first
real-life steps in the direction of a true decentralization of resources.
But in the meanwhile, the Internet will remain a hybrid of meshwork and
hierarchy components, and the imminent entry of big corporations into the
network business may in fact increase the amount of command components
in its mix. These ideas are today being hotly debated in the field of interface
design. The general consensus is that interfaces must become more intelligent
to be able to guide users in the tapping of computer resources, both the
informational wealth of the Internet, as well as the resources of ever
more elaborate software applications. But if the debaters agree that interfaces
must become smarter, and even that this intelligence will be embodied in
agents, they disagree on how the agents should acquire their new capabilities.
The debate pits two different traditions of Artificial Intelligence against
each other: Symbolic AI, in which hierarchical components predominate,
against Behavioral AI, where the meshwork elements are dominant. Basically,
while in the former discipline one attempts to endow machines with intelligence
by depositing a homogeneous set of rules and symbols into a robot's brain,
in
the latter one attempts to get intelligent behavior to emerge from the
interactions of a few simple task-specific modules in the robot's head,
and the heterogeneous affordances of its environment. Thus, to build a
robot that walks around a room, the first approach would give the robot
a map of the room, together with the ability to reason about possible walking
scenarios in that model of the room. The second approach, on the other
hand, endows the robot with a much simpler set of abilities, embodied in
modules that perform simple tasks such as collision-avoidance, and walking-around-the-room
behavior emerges from the interactions of these modules and the obstacles
and openings that the real room affords the robot as it moves.{8} Translated
to the case of interface agents, for instance, personal assistants in charge
of aiding the user to understand the complexities of particular software
applications, Symbolic AI would attempt to create a model of the application
as well as a model of the working environment, including a model of an
idealized user, and make these models available in the form of rules or
other symbols to the agent. Behavioral AI, on the other hand, gives the
agent only the ability to detect patterns of behavior in the actual user,
and to interact with the user in different ways so as to learn not only
from his or her actual behavior but also from feedback that the user gives
it. For example, the agent in question would be constantly looking over
the user's shoulder keeping track of whatever regular or repetitive patterns
it observes. It then attempts to establish statistical correlations between
certain pairs of actions that tend to occur together. At some point the
agent suggests to the user the possibility of automating these actions,
that is, that whenever the first occurs, the second should be automatically
performed. Whether the user accepts or refuses, this gives feedback to
the agent. The agent may also solicit feedback directly, and the user may
also teach the agent by giving some hypothetical examples. {9}
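A minimal sketch of this learning scheme (Python; the actions, the simple pair-counting and the threshold are hypothetical simplifications, not the actual MIT implementation) might look as follows:

```python
from collections import Counter

class WatchingAgent:
    """Toy 'over-the-shoulder' agent: it counts pairs of consecutive user actions
    and suggests automating a pair once it has been seen often enough."""
    def __init__(self, threshold=3):
        self.pair_counts = Counter()
        self.last_action = None
        self.threshold = threshold
        self.confirmed = set()           # pairs the user has agreed to automate

    def observe(self, action):
        if self.last_action is not None:
            pair = (self.last_action, action)
            self.pair_counts[pair] += 1
            if self.pair_counts[pair] >= self.threshold and pair not in self.confirmed:
                if self.ask_user(pair):  # the user's answer is feedback to the agent
                    self.confirmed.add(pair)
        self.last_action = action

    def ask_user(self, pair):
        print(f"Automate '{pair[1]}' whenever you do '{pair[0]}'?")
        return True                      # stand-in for a real yes/no dialog

agent = WatchingAgent()
for a in ["open_mail", "archive", "open_mail", "archive", "open_mail", "archive"]:
    agent.observe(a)
```

A real assistant would use richer statistics and a genuine dialog with the user, but the drift of the scheme is the same. In terms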
of the location of control, there is very little difference between the
agents that would result, and in this sense, both approaches are equally
decentralized. The rules that Symbolic AI would put in the agent's head,
most likely derived from interviews of users and programmers by a Knowledge
Engineer, are independent software objects. Indeed, in one of the most
widely used programming languages in this kind of approach (called a "production
system") the individual rules have even more of a meshwork structure that
many object-oriented systems, which still cling to a hierarchy of objects.
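For readers unfamiliar with the term, a production system is a collection of independent condition-action rules matched against a pool of facts (the "working memory"); the toy forward-chaining interpreter below, with made-up rules, shows how little hierarchy the rules themselves need:

```python
# Toy production system: independent rules fire whenever their condition
# matches the working memory; no rule ever commands another rule.
working_memory = {"user_opened": "spreadsheet"}

rules = [
    (lambda m: m.get("user_opened") == "spreadsheet",
     lambda m: m.update({"suggestion": "open last budget file"})),
    (lambda m: "suggestion" in m,
     lambda m: print("Agent suggests:", m["suggestion"])),
]

fired = set()
changed = True
while changed:                          # cycle until no new rule can fire
    changed = False
    for i, (condition, action) in enumerate(rules):
        if i not in fired and condition(working_memory):
            action(working_memory)      # each rule fires at most once here
            fired.add(i)
            changed = True
```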
But in terms of the overall human-machine system, the approach of Symbolic
AI is much more hierarchical. In particular, by assuming the existence
of an ideal user, with homogeneous and unchanging habits, and of a workplace
where all users are similar, agents created by this approach are not only
less adaptive and more commanding, they themselves promote homogeneity
in their environment. The second class of agents, on the other hand, are
not only sensitive to heterogeneities, since they adapt to individual users
and change as the habits of these users change; they also promote heterogeneity in the workplace by not subordinating every user to the demands of an
idealized model. One drawback of the approach of Behavioral AI is that,
given that the agent has very little knowledge at the beginning of a relationship
with a user, it will be of little assistance for a while until it learns
about his or her habits. Also, since the agent can only learn about situations
that have recurred in the past, it will be of little help when the user
encounters new problems. One possible solution is to increase the amount
of meshwork in the mix and allow agents from different users to interact
with each other in a decentralized way. {10} Thus, when a new agent begins
a relation with a user, it can consult with other agents and speed up the
learning process, assuming that is, that what other agents have learned
is applicable to the new user. This, of course, will depend on the existence
of some homogeneity of habits, but at least it does not assume a completely homogeneous situation from the outset, an assumption which in turn promotes
further uniformization. Besides, endowing agents with a static model of
the users makes them unable to cope with novel situations. This is also
a problem in the Behavioral AI approach but here agents may aid one another
in coping with novelty. Knowledge gained in one part of the workplace can
be shared with the rest, and new knowledge may be generated out of the
interactions among agents. In effect, a dynamic model of the workplace
would be constantly generated and improved by the collective of agents
in a decentralized way, instead of each one being a replica of the others,
operating on the basis of a static model centrally created by a knowledge
engineer. I would like to conclude this brief analysis of the issues raised
by agent-based interfaces with some general remarks. First of all, from
the previous comments it should be clear that the degree of hierarchical
and homogenizing components in a given interface is a question which affects
more than just events taking place on the computer's screen. In particular,
the very structure of the workplace, and the relative status of humans
and machines is what is at stake here. Western societies have undergone
at least two centuries of homogenization, of which the most visible element
is the assembly-line and related mass-production techniques, in which the
overall thrust was to let machines discipline and control humans. In these
circumstances, the arrival of the personal computer was a welcome antidote
to the development of increasingly more centralized computer machinery,
such as systems of Numerical Control in factories. But this is hardly a
victory. After two hundred years of constant homogenization, working skills
have been homogenized via routinization and Taylorization, building materials
have been given constant properties, the gene pools of our domestic species
homogenized through cloning, and our languages made uniform through standardization.
To make things worse, the solution to this is not simply to begin adding
meshwork components to the mix. Indeed, one must resist the temptation
to make hierarchies into villains and meshworks into heroes, not only because,
as I said, they are constantly turning into one another, but because in
real life we find only mixtures and hybrids, and the properties of these
cannot be established through theory alone but demand concrete experimentation.
Certain standardizations, say, of electric outlet designs or of data-structures
traveling through the Internet, may actually turn out to promote heterogenization
at another level, in terms of the appliances that may be designed around
the standard outlet, or of the services that a common data-structure may
make possible. On the other hand, the mere presence of increased heterogeneity
is no guarantee that a better state for society has been achieved. After
all, the territory occupied by former Yugoslavia is more heterogeneous
now than it was ten years ago, but the lack of uniformity at one level
simply hides an increase of homogeneity at the level of the warring ethnic
communities. But even if we managed to promote not only heterogeneity,
but diversity articulated into a meshwork, that still would not be a perfect
solution. After all, meshworks grow by drift and they may drift to places
where we do not want to go. The goal-directedness of hierarchies is the
kind of property that we may desire to keep at least for certain institutions.
Hence, demonizing centralization and glorifying decentralization as the
solution to all our problems would be wrong. An open and experimental attitude
towards the question of different hybrids and mixtures is what the complexity
of reality itself seems to call for. To paraphrase Deleuze and Guattari,
never believe that a meshwork will suffice to save us. {11}

Footnotes:

{1} Abbot Payson Usher. The Textile Industry, 1750-1830. In Technology in Western Civilization. Vol. 1. Melvin Kranzberg and Carrol W. Pursell eds. (Oxford University Press, New York, 1967). p. 243
{2} Andrew Hodges. Alan Turing: The Enigma. (Simon & Schuster, New York, 1983). Ch. 2
{3} Herbert Simon. The Sciences of the Artificial. (MIT Press, 1994). p. 43
{4} Fernand Braudel. The Wheels of Commerce. (Harper and Row, New York, 1986). Ch. I
{5} Humberto R. Maturana and Francisco J. Varela. The Tree of Knowledge: The Biological Roots of Human Understanding. (Shambhala, Boston, 1992). p. 47 and 115.
{6} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. (University of Minnesota Press, Minneapolis, 1987). p. 335
{7} M.S. Miller and K.E. Drexler. Markets and Computation: Agoric Open Systems. In The Ecology of Computation. Bernardo Huberman ed. (North-Holland, Amsterdam, 1988).
{8} Pattie Maes. Behaviour-Based Artificial Intelligence. In From Animals to Animats. Vol. 2. Jean-Arcady Meyer, Herbert L. Roitblat and Stewart W. Wilson eds. (MIT Press, Cambridge, Mass., 1993). p. 3
{9} Pattie Maes and Robyn Kozierok. Learning Interface Agents. In Proceedings of the AAAI '93 Conference. (AAAI Press, Seattle, WA, 1993). p. 459-465
{10} Yezdi Lashkari, Max Metral and Pattie Maes. Collaborative Interface Agents. In Proceedings of the 12th National Conference on AI. (AAAI Press, Seattle, WA, 1994). p. 444-449
{11} Deleuze and Guattari. op. cit. p. 500. (Their remark is framed in terms of "smooth spaces" but it may be argued that this is just another term for meshworks).

===============

Economics, Computers and the War Machine
by Manuel De Landa

When we "civilians" think about military questions
we tend to view the subject as encompassing a rather specialized subject
matter, dealing exclusively with war and its terrible consequences. It
seems fair to say that, in the absence of war (or at least the threat of
war, as in the case of government defense budget debates) civilians hardly
ever think about military matters. The problem is that, from a more objective
historical perspective, the most important effects of the military establishment
on the civilian world in the last four hundred years have been during peacetime,
and have had very little to do with specifically military subjects, such
as tactics or strategy. I would like to suggest that, starting in the 1500's, Western history has witnessed the slow militarisation of civilian
society, a process in which schools, hospitals and prisons slowly came
to adopt a form first pioneered in military camps and barracks, and factories
came to share a common destiny with arsenals and armories. I should immediately
add, however, that the influence was hardly unidirectional, and that what
needs to be considered in detail are the dynamics of complex "institutional
ecologies", in which a variety of organizations exert mutual influences
on one another. Nevertheless, much of the momentum of this process was
maintained by military institutions and so we may be justified in using
the term "militarisation". On one hand, there is nothing too surprising
about this. Ever since Napoleon changed warfare from the dynastic duels
of the eighteenth century to the total warfare with which we are familiar
in this century, war itself has come to rely on the complete mobilization
of a society's industrial and human resources. While the armies of Frederick
the Great were composed mostly of expensive mercenaries, who had to be
carefully used in the battlefield, the Napoleonic armies benefited from
the invention of new institutional means of converting the entire population
of a country into a vast reservoir of human resources. Although technically
speaking the French revolution did not invent compulsory military service,
its institutional innovations did allow its leaders to perform the first
modern mass conscription, involving the conversion of all men into soldiers,
and of all women into cheap laborers. As the famous proclamation of 1793
reads: "...all Frenchmen are permanently requisitioned for service into
the armies. Young men will go forth to battle; married men will forge weapons
and transport munitions; women will make tents and clothing and serve in
hospitals; children will make lint from old linen; and old men will be
brought to the public squares to arouse the courage of the soldiers, while
preaching the unity of the Republic and hatred against Kings." {1} This
proclamation, and the vast bureaucratic machinery needed to enforce it,
effectively transformed the civilian population of France into a resource
(for war, production, motivation) to be tapped into at will by the military
high command. A similar point applies to the industrial, mineral and agricultural
resources of France and many other nation states. Given the complete mobilization
of society's resources involved in total war it is therefore not surprising
that there has been a deepening of military involvement in civilian society
in the last two centuries. However, I would want to argue that, in addition
to the links between economic, political and military institutions brought
about by war time mobilizations, there are other links, which are older,
subtler but for the same reason more insidious, which represent a true
militarisation of society during peace time. To return to the French example,
some of the weapons that the Napoleonic armies used were the product of
a revolution in manufacturing techniques which took place in French armories
in the late eighteenth century. In French armories, the core concepts and
techniques of what would later become assembly-line mass production were first developed. The ideal of creating weapons with perfectly interchangeable parts, an ideal which could not be fulfilled
without standardization and routinization of production, was taken even
further in American arsenals in the early 19th century. And it was there
that military engineers first realized that in practice, standardization
went hand in hand with replacement of flexible individual skills with rigid
collective routines, enforced through constant discipline and monitoring.
Even before that, in the Dutch armies of the sixteenth century, this process
had already begun. Civilians tend to think of Frederick Taylor, the late
nineteenth-century creator of so-called "scientific management" techniques,
as the pioneer of labor process analysis, that is, the breaking down of
a given factory practice into micro-movements and the streamlining
of these movements for greater efficiency and centralized management control.
But Dutch commander Maurice of Nassau had already applied these methods
to the training of his soldiers beginning in the 1560's. Maurice analyzed
the motion needed to load, aim and fire a weapon into its micro-movements,
redesigned them for maximum efficiency and then imposed them on his soldiers
via continuous drill and discipline. {2} Yet, while the soldiers increased
their efficiency tremendously as a collective whole, each individual soldier
completely lost control of his actions in the battlefield. And a similar
point applies to the application of this idea to factory workers, before
and after Taylorism. Collectively they became more productive, generating
the economies of scale so characteristic of twentieth-century big business,
while simultaneously completely losing control of their individual actions.
This is but one example of the idea of militarisation of society. Recent
historians have rediscovered several other cases of the military origins
of what was once thought to be civilian innovations. In recent times it
has been Michel Foucault who has most forcefully articulated this view.
For him this intertwining of military and civilian institutions is constitutive
of the modern European nation-state. On one hand, the project of nation-building
was an integrative movement, forging bonds that went beyond the primordial
ties of family and locality, linking urban and rural populations under
a new social contract. On the other, and complementing this process of
unification, there was the less conscious project of uniformation, of submitting
the new population of free citizens to intense and continuous training,
testing and exercise to yield a more or less uniform mass of obedient individuals.
In Foucault's own words: "Historians of ideas usually attribute the dream
of a perfect society to the philosophers and jurists of the eighteenth
century; but there was also a military dream of society; its fundamental
reference was not to the state of nature, but to the meticulously subordinated
cogs of a machine, not to the primal social contract, but to permanent
coercions, not to fundamental rights, but to indefinitely progressive forms
of training, not to the general will but to automatic docility... The Napoleonic
regime was not far off and with it the form of state that was to survive
it and, we must not forget, the foundations of which were laid not only
by jurists, but also by soldiers, not only counselors of state, but also
junior officers, not only the men of the courts, but also the men of the
camps. The Roman reference that accompanied this formation certainly bears
with it this double index: citizens and legionnaires, law and maneuvers.
While jurists or philosophers were seeking in the pact a primal model for
the construction or reconstruction of the social body, the soldiers and
with them the technicians of discipline were elaborating procedures for
the individual and collective coercion of bodies." {3} Given that modern
technology has evolved in such a world of interacting economic, political
and military institutions, it should not come as a surprise that the history
of computers, computer networks, Artificial Intelligence and other components
of contemporary technology, is so thoroughly intertwined with military
history. Here, as before, we must carefully distinguish those influences
which occurred during wartime from those that took place in peacetime,
since the former can be easily dismissed as involving the military simply
as a catalyst or stimulant, that is, an accelerator of a process that would
have occurred more slowly without its direct influence. The computer itself
may be an example of indirect influence. The basic concept, as everyone
knows, originated in a most esoteric area of the civilian world. In the
1930's British mathematician Alan Turing created the basic concept of
the computer in an attempt to solve some highly abstract questions in metamathematics.
But for that reason, the Turing Machine, as his conceptual machine was
called, was a long way from an actual, working prototype. It was during
World War II, when Turing was mobilized as part of the war effort to crack the Nazis' Enigma code, that, in the course of his intense participation
in that operation, he was exposed to some of the practical obstacles blocking
the way towards the creation of a real Turing Machine. On the other side
of the Atlantic, John von Neumann also developed his own practical insights
as to how to bring the Turing Machine to life, in the course of his participation
in the Manhattan Project and other war related operations. In this case
we may easily dismiss the role that the military played, arguing that without
the intensification and concentration of effort brought about by the war,
the computer would have developed on its own, perhaps at a slower pace.
And I agree that this is correct. On the other hand, many of the uses to
which computers were put after the war illustrate the other side of the
story: a direct participation of military institutions in the development
of technology, a participation which actually shaped this technology in
the direction of uniformization, routinization and concentration of control.
Perhaps the best example of this other relation between the military and
technology is the system of machine-part production known as Numerical Control methods. While the methods developed in 19th century arsenals,
and later transferred to civilian enterprises, had already increased uniformity
and centralized control in the production of large quantities of the same
object (that is, mass production), this had left untouched those areas
of production which create relatively small batches of complex machine
parts. Here the skills of the machinist were still indispensable as late
as World War II. During the 1950's, the Air Force underwrote not only the
research and development of a new system to get rid of the machinist's
skills, but also the development of software, the actual purchase of machinery
by contractors, and the training of operators and programmers. In a contemporary
Numerical Control system, after the engineer draws the parts that need
to be produced, the drawings themselves are converted into data and stored
in cards or electronically. From then on, all the operations that need to be performed (drilling, milling, lathing, boring, and so on) are carried out automatically by computer-controlled machines. Unlike mass-production
techniques, where this automatism was achieved at the expense of flexibility,
in Numerical Control systems a relatively simple change in software (not
hardware) is all that is needed to adapt the system for the production
of a new set of parts. Yet, the effects on the population of workers were
very similar in both cases: the replacement of flexible skills by rigid
commands embodied in hardware or software, and over time, the loss of those
skills leading to a general process of worker deskilling, and consequently,
to the loss of individual control of the production process. The question
in both cases is not the influence that the objects produced in militarized
factories may have on the civilian world. One could, for instance, argue
that the support of the canned food industry by Napoleon had a beneficial
effect on society, and a similar argument may be made for many objects
developed under military influence. The question, however, is not the transfer
of objects, but the transfer of the production processes behind those objects
that matters, since these processes bring with them the entire control
and command structure of the military with them. To quote historian David
Noble: "The command imperative entailed direct control of production operations
not just with a single machine or within a single plant, but worldwide,
via data links. The vision of the architects of the [Numerical Control]
revolution entailed much more than the automatic machining of complex parts;
it meant the elimination of human intervention -- a shortening of the chain of command -- and the reduction of remaining people to unskilled,
routine, and closely regulated tasks." And he adds that Numerical Control
is a "giant step in the same direction [as the 19th. century drive for
uniformity]; here management has the capacity to bypass the worker and
communicate directly to the machine via tapes or direct computer link.
The machine itself can thereafter pace and discipline the worker." {4}
Let's pause for a moment and consider a possible objection to this analysis.
One may argue that the goal of withdrawing control from workers and transferring
it to machines is the essence of the capitalist system and that, if military
institutions happened to be involved, they did so by playing the role assigned
to them by the capitalist system. The problem with this reply is that,
although it may satisfy a convinced Marxist, it is at odds with much historical
data gathered by this century's best economic historians. This data shows
that European societies, far from having evolved through a unilinear progression
of "modes of production" (feudalism, capitalism, socialism), actually exhibited
a much more complex, more heterogeneous coexistence of processes. In other
words, as historian Fernand Braudel has shown, as far back as the fourteenth
and fifteenth centuries, institutions with the capability of exercising
economic power (large banks, wholesalers, long-distance trade companies) were already in operation, and fully coexisted with feudal institutions as well as with economic institutions that did not have economic power,
such as retailers and producers of humble goods. Indeed, Braudel shows
that these complex coexistences of institutions of different types existed
before and after the Industrial Revolution, and suggests that the concept
of a "capitalist system" (where every aspect of society is connected into
a functional whole) gives a misleading picture of the real processes. What
I am suggesting here is that we take Braudel seriously, forget about our
picture of history as divided into neat, internally homogeneous eras or
ages, and tackle the complex combinations of institutions involved in real
historical processes. The models we create of these complex "institutional
ecologies" should include military organizations playing a large, relatively
independent role, to reflect the historical data we now have on several
important cases, like fifteenth century Venice, whose famous Arsenal was
at the time the largest industrial complex in Europe, or eighteenth-century France and the nineteenth-century United States, with their military
standardization of weapon production. Another important example involves
the development of the modern corporation, particularly as it happened
in the United States in the last century. The first American big business
was the railroad industry, which developed the management techniques which
many other large enterprises would adopt later on. This much is well known.
What is not so well known is that military engineers were deeply involved
in the creation of the first railroads and that they developed many of
the features of management which later on came to characterize just about
every large commercial enterprise in the United States, Europe and elsewhere.
In the words of historian Charles O'Connell: "As the railroads evolved
and expanded, they began to exhibit structural and procedural characteristics
that bore a remarkable resemblance to those of the Army. Both organizations
erected complicated management hierarchies to coordinate and control a
variety of functionally diverse, geographically separated corporate activities.
Both created specialized staff bureaus to provide a range of technical
and logistical support services. Both divided corporate authority and responsibility
between line and staff agencies and officers and then adopted elaborate
written regulations that codified the relationship between them. Both established
formal guidelines to govern routine activities and instituted standardized
reporting and accounting procedures and forms to provide corporate headquarters
with detailed financial and operational information which flowed along
carefully defined lines of communication. As the railroads assumed these
characteristics, they became America's first 'big business'." {5} Thus,
the transfer of military practices to the civilian world influenced the
lives not only of workers, but of the managers themselves. And the influence
did not stop with the development of railroads. The "management science"
which is today taught in business schools is a development of military
"operations research", a discipline created during World War 11 to tackle
a variety of tactical, strategic and logistic problems. And it was the
combination of this "science of centralization" and the availability of
large computers that, in turn, allowed the proliferation of transnational
corporations and the consequent internationalization of the standardization
and routinization of production processes. Much as skills were replaced
by commands on the shop floor, so were prices replaced by commands at the
management level. (This is one reason not to use the term "markets" when
theorizing big business. Not only do they rely on commands instead of prices,
they manipulate demand and supply rather than being governed by them. Hence,
Braudel has suggested calling big business "antimarkets"). {6} Keeping
in mind the actual complexity of historical processes, as opposed to explaining
everything by the "laws of capitalist development", is crucial not only
to understand the past, but also to intervene in the present and speculate
about the future. This is particularly clear when analyzing the role which
computers and computer networks may play in the shaping of the economic
world in the coming century. It is easy to attribute many of the problems
we have today, particularly those related to centralized surveillance and
control, to computer technology. But to do this would not only artificially
homogenize the history of computers (there are large differences between
the development of mainframes and minicomputers, on one hand, and the personal
computer, on the other) but it would obscure the fact that, if computers
have come to play the "disciplinarian" roles they play today it is as part
of a historical process which is several centuries old, a process which
computers have only intensified. Another advantage of confronting the actual
heterogeneity of historical processes, and of throwing to the garbage the
concept of "the capitalist system", is that we free ourselves to look around
for combinations of economic institutions which coexist with disciplinarian
antimarkets but do not play by the same rules. Historically, as Braudel
has shown, economic power since the 14th century has always been associated
with large size enterprises and their associated "economies of scale".
Although technically this term only applies to mass-produced objects (economies of scale meaning the spreading of production costs among many identical products), we may use it in an extended way to define any economic
benefits to managers, merchants and financiers stemming from the scale
of any economic resource. Coexisting with economies of scale there are what are called "economies of agglomeration". These are economic benefits which small businesses enjoy from the concentration of many of them in a large city. These economies stem from the benefits of shop talk, from unplanned connections and mutual enhancements, as well as from the services which grow around these concentrations, services which small businesses could not afford on their own. To conclude this talk I would like to give
one example, from the world of computers, of two American industrial hinterlands
which illustrate the difference between economies of scale and of agglomeration:
Silicon Valley in Northern California, and Route 128 near Boston: "Silicon
Valley has a decentralized industrial system that is organized around regional
networks. Like firms in Japan, and parts of Germany and Italy, Silicon
Valley companies tend to draw on local knowledge and relationships to create
new markets, products, and applications. These specialist firms compete
intensely while at the same time learning from one another about changing
markets and technologies. The region's dense social networks and open labor
markets encourage experimentation and entrepreneurship. The boundaries
within firms are porous, as are those between firms themselves and between
firms and local institutions such as trade associations and universities."
{7} The growth of this region owed very little to large financial flows
from governmental and military institutions. Silicon Valley did not develop
so much by economies of scale, as by the benefits derived from an agglomeration
of visionary engineers, specialist consultants and financial entrepreneurs.
Engineers moved often from one firm to another, developing loyalties to
the craft and region's networks, not to the corporation. This constant
migration, plus an unusual practice of information sharing among the local
producers, ensured that new formal and informal knowledge diffused rapidly
through the entire region. Business associations fostered collaboration
between small and medium-sized companies. Risk-taking and innovation
were preferred to stability and routinization. This, of course, does not
mean that there were not large, routinized firms in Silicon Valley, only
that they did not dominate the mix. Not so in Route 128: "While Silicon
Valley producers of the 1970's were embedded in, and inseparable from,
intricate social and technical networks, the Route 128 region came to be
dominated by a small number of highly self-sufficient corporations.
Consonant with New England's two-century-old manufacturing tradition, Route
128 firms sought to preserve their independence by internalizing a wide
range of activities. As a result, secrecy and corporate loyalty govern
relations between firms and their customers, suppliers, and competitors,
reinforcing a regional culture of stability and self-reliance. Corporate
hierarchies insured that authority remains centralized and information
flows vertically. The boundaries between and within firms and between firms
and local institutions thus remain far more distinct." {8} While before
the recession of the 1980's both regions had been continuously expanding,
one on economies of scale and the other on economies of agglomeration (or
rather, mixtures dominated by one or the other), they both felt the full
impact of the downturn. At that point some large Silicon Valley firms,
unaware of the dynamics behind the region's success, began to switch to
economies of scale, sending parts of their production to other areas, and
internalizing activities previously performed by smaller firms. Yet, unlike
Route 128, the intensification of routinization and internalization in
Silicon Valley was not a constitutive part of the region, which meant that
the old meshwork system could be revived. And this is, in fact, what happened.
Silicon Valley's regional networks were re-energized through the birth
of new firms in the old pattern, and the region has now returned to its
former dynamic state, unlike the command-heavy Route 128 which continues
to stagnate. What this shows is that, while both scale and agglomeration
economies, as forms of positive feedback, promote growth, only the latter
endows firms with the flexibility needed to cope with adverse economic
conditions. In conclusion I would like to repeat my call for more realistic
models of economic history, models involving the full complexity of the
institutional ecologies involved, including markets, antimarkets,
military and bureaucratic institutions, and if we are to believe Michel
Foucault, schools, hospitals, prisons and many others. It is only through
an honest philosophical confrontation with our complex past that we can
expect to understand it and derive the lessons we may use when intervening
in the present and speculating about the future.

References:

{1} Excerpt from the text of the levée en masse of 1793, quoted in William H. McNeill. The Pursuit of Power: Technology, Armed Force and Society since A.D. 1000. (University of Chicago Press, 1982). p. 192
{2} ibid. p. 129
{3} Michel Foucault. Discipline and Punish: The Birth of the Prison. (Vintage Books, New York, 1979). p. 169
{4} David Noble. Command Performance: A Perspective on Military Enterprise and Technological Change. In Merritt Roe Smith ed. Military Enterprise. (MIT Press, 1987). p. 341 and 342.
{5} Charles F. O'Connell, Jr. The Corps of Engineers and the Rise of Modern Management. In ibid. p. 88
{6} Fernand Braudel. The Wheels of Commerce. (Harper and Row, New York, 1986). p. 379
{7} Annalee Saxenian. Lessons from Silicon Valley. In Technology Review, Vol. 97, no. 5. p. 44
{8} ibid. p. 47

===================

MARKETS AND ANTIMARKETS IN THE WORLD ECONOMY
by Manuel De Landa

One of the most significant epistemological events in recent
years is the growing importance of historical questions in the ongoing
reconceptualization of the hard sciences. I believe it is not an exaggeration
to say that in the last two or three decades, history has almost completely
infiltrated physics, chemistry and biology. It is true that nineteenth
century thermodynamics had already introduced an arrow of time into physics,
and hence the idea of irreversible historical processes. It is also true
that the theory of evolution had already shown that animals and plants
were not embodiments of eternal essences but piecemeal historical constructions,
slow accumulations of adaptive traits cemented together via reproductive
isolation. However, the classical versions of these two theories incorporated
a rather weak notion of history into their conceptual machinery: both thermodynamics
and Darwinism admitted only one possible historical outcome, the reaching
of thermal equilibrium or of the fittest design. In both cases, once this
point was reached, historical processes ceased to count. For these theories,
optimal design or optimal distribution of energy represented, in a sense,
an end of history. Hence, it should come as no surprise that the current
penetration of science by history has been the result of advances in these
two disciplines. Ilya Prigogine revolutionized thermodynamics in the 1960's
by showing that the classical results were only valid for closed systems
where the overall amounts of energy are always conserved. If one allows
energy to flow in and out of a system, the number and type of possible
historical outcomes greatly increases. Instead of a unique and simple equilibrium,
we now have multiple ones of varying complexity (static, periodic and chaotic
attractors); and moreover, when a system switches from one to another form
of stability (at a so-called bifurcation), minor fluctuations can be crucial
in deciding the actual form of the outcome. Hence, when we study a given
physical system, we need to know the specific nature of the fluctuations
that have been present at each of its bifurcations, in other words, we
need to know its exact history to understand its current dynamical form.
{1}
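A textbook one-line system makes the point concrete. In the logistic map sketched below, the parameter r plays the role of the external flow, and small changes in it carry the system across bifurcations into qualitatively different long-run behaviors (the values used are the standard illustrative ones, not a model of any particular physical system):

```python
def long_run(r, x=0.2, warmup=500, keep=8):
    """Iterate the logistic map x -> r*x*(1-x) and report its late-time behavior."""
    for _ in range(warmup):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 3))
    return tail

print(long_run(2.8))   # a single fixed point: one static equilibrium
print(long_run(3.2))   # past a bifurcation: a two-valued oscillation (periodic attractor)
print(long_run(3.9))   # a chaotic attractor: no simple repetition
```

And what is true of physical systems is all the more so for biological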
ones. Attractors and bifurcations are features of any system in which the
dynamics are nonlinear, that is, in which there are strong interactions
between variables. As biology begins to include these nonlinear dynamical
phenomena in its models, for example, in the case of evolutionary arms-races
between predators and prey, the notion of a "fittest design" loses its
meaning. In an arms-race there is no optimal solution fixed once and for
all, since the criterion of fitness itself changes with the dynamics. This
is also true for any adaptive trait whose value depends on how frequently it occurs in a given population, as well as in cases like migration, where
animal behavior interacts nonlinearly with selection pressures. As the
belief in a fixed criterion of optimality disappears from biology, real
historical processes come to reassert themselves once more. {2} Computers
have played a crucial role in this process of infiltration. The nonlinear
equations that go into these new historical models cannot be solved by
analytical methods alone, and so scientists need computers to perform numerical
simulations and discover the behavior of the solutions. But perhaps the
most crucial role of digital technology has been to allow a switch from
a purely analytic, top-down style of modeling, to a more synthetic, bottom-up
approach. In the growing discipline of Artificial Life, for instance, an
ecosystem is not modeled starting from the whole and dissecting it into
its component parts, but the other way around: one begins at the bottom,
with a population of virtual animals and plants and their local interactions,
and the ecosystem needs to emerge spontaneously from these local dynamics.
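A caricature of such a bottom-up model, with entirely made-up local rules, fits in a few lines of Python; note that nothing in it mentions the ecosystem as a whole, only individual plants and grazers and their chance encounters, so whatever population pattern shows up in the output is an emergent result:

```python
import random
random.seed(1)

# Made-up local rules: plants spread, grazers eat what they find, starve or breed.
plants, grazers = 200, 30
history = []
for step in range(20):
    plants += int(0.1 * plants)                  # plants regrow locally
    survivors, offspring = 0, 0
    for _ in range(grazers):
        if plants > 0 and random.random() < 0.8: # this grazer finds a plant nearby
            plants -= 1
            survivors += 1
            if random.random() < 0.3:            # well-fed grazers sometimes reproduce
                offspring += 1
        elif random.random() < 0.5:              # a hungry grazer may still survive
            survivors += 1
    grazers = survivors + offspring
    history.append((step, plants, grazers))

print(history[-5:])  # population levels that no line above programmed directly
```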
The basic idea is that the systematic properties of an ecosystem arise
from the interactions between its animal and plant components, so that
when one dissects the whole into parts the first thing we lose is any property
due to these interactions. Analytical techniques, by their very nature,
tend to kill emergent properties, that is, properties of the whole that
are more than the sum of its parts. Hence the need for a more synthetic
approach, in which everything systematic about a given whole is modeled
as a historically emergent result of local interactions. {3} These new
ideas are all the more important when we move on to the social sciences,
particularly economics. In this discipline, we tend to uncritically assume
systematicity, as when one talks of the "capitalist system", instead of
showing exactly how such systematic properties of the whole emerge from
concrete historical processes. Worse yet, we then tend to reify such unaccounted-for
systematicity, ascribing all kinds of causal powers to capitalism, to the
extent that a clever writer can make it seem as if anything at all (from
nonlinear dynamics itself to postmodernism or cyberculture) is the product
of late capitalism. This basic mistake, which is, I believe, a major obstacle
to a correct understanding of the nature of economic power, is partly the
result of the purely top-down, analytical style that has dominated economic
modeling from the eighteenth century. Both macroeconomics, which begins
at the top with concepts like gross national product, and microeconomics,
in which a system of preferences guides individual choice, are purely analytical
in approach. Neither the properties of a national economy nor the ranked
preferences of consumers are shown to emerge from historical dynamics.
Marxism, it is true, added to these models intermediate-scale phenomena, like
class struggle, and with it conflictive dynamics. But the specific way
in which it introduced conflict, via the labor theory of value, has now
been shown by Sraffa to be redundant, added from the top, so to speak,
and not emerging from the bottom, from real struggles over wages, or the
length of the working day, or for control over the production process.
{4} Besides a switch to a synthetic approach, as is happening, for instance,
in the evolutionary economics of Nelson and Winter in which the emphasis
is on populations of organizations interacting nonlinearly, what we need
here is a return to the actual details of economic history. Much has been
learned in recent decades about these details, thanks to the work of materialist
historians like Fernand Braudel, and it is to this historical data that
we must turn to know what we need to model synthetically. Nowhere is this
need for real history more evident than in the subject of the dynamics
of economic power, defined as the capability to manipulate the prices of
inputs and outputs of the production process as well as their supply and
demand. In a peasant market, or even in a small town local market, everybody
involved is a price taker: one shows up with merchandise, and sells it
at the going prices which reflect demand and supply. But monopolies and
oligopolies are price setters: the prices of their products need not reflect
demand/supply dynamics, but rather their own power to control a given market
share. {5} When approaching the subject of economic power, one can safely
ignore the entire field of linear mathematical economics (so-called competitive
equilibrium economics), since there monopolies and oligopolies are basically
ignored. Yet, even those thinkers who make economic power the center of
their models, introduce it in a way that ignores historical facts. Authors
writing in the Marxist tradition place real history in a straitjacket
by subordinating it to a model of a progressive succession of modes of
production. Capitalism itself is seen as maturing through a series of stages,
the latest one of which is the monopolistic stage in this century. Even
non-Marxist economists like Galbraith agree that capitalism began as
a competitive pursuit and stayed that way till the end of the nineteenth
century, and only then reached the monopolistic stage, at which point
a planning system replaced market dynamics. However, Fernand Braudel has
recently shown, with a wealth of historical data, that this picture is
inherently wrong. Capitalism was, from its beginnings in the Italy of the
thirteenth century, always monopolistic and oligopolistic. That is to say,
the power of capitalism has always been associated with large enterprises,
large that is, relative to the size of the markets where they operate.
{6} Also, it has always been associated with the ability to plan economic
strategies and to control market dynamics, and therefore, with a certain
degree of centralization and hierarchy. Within the limits of this presentation,
I will not be able to review the historical evidence that supports this
extremely important hypothesis, but allow me at least to extract some of
the consequences that would follow if it turns out to be true. First of
all, if capitalism has always relied on non-competitive practices, if the
prices for its commodities have never been objectively set by demand/supply
dynamics, but imposed from above by powerful economic decision-makers,
then capitalism and the market have always been different entities. To
use a term introduced by Braudel, capitalism has always been an "antimarket".
This, of course, would seem to go against the very meaning of the word
"capitalism", regardless of whether the word is used by Karl Marx or Ronald
Reagan. For both nineteenth century radicals and twentieth century conservatives,
capitalism is identified with an economy driven by market forces, whether
one finds this desirable or not. Today, for example, one speaks of the
former Soviet Union's "transition to a market economy", even though what
was really supposed to happen was a transition to an antimarket: to large
scale enterprises, with several layers of managerial strata, in which prices
are set, not taken. This conceptual confusion is so entrenched that I believe
the only solution is to abandon the term "capitalism" completely, and to
begin speaking of markets and antimarkets and their dynamics. This would
have the added advantage that it would allow us to get rid of historical
theories framed in terms of stages of progress, and to recognize the fact
that antimarkets could have arisen anywhere, not just Europe, the moment
the flows of goods through markets reach a certain critical level of intensity,
so that organizations bent on manipulating these flows can emerge. Hence,
the birth of antimarkets in Europe has absolutely nothing to do with a
peculiarly European trait, such as rationality or a religious ethic of
thrift. As is well known today, Europe borrowed most of its economic and
accounting techniques, those techniques that are supposed to distinguish
her as uniquely rational, from Islam. {8} Finally, and before we take a
look at what a synthetic, bottom-up approach to the study of economic dynamics
would be like, let me meet a possible objection to these remarks: the idea
that "real" capitalism did not emerge till the nineteenth century industrial
revolution, and hence that it could not have arisen anywhere else where
these specific conditions did not exist. To criticize this position, Fernand
Braudel has also shown that the idea that capitalism goes through stages,
first commercial, then industrial and finally financial, is not supported
by the available historical evidence. Venice in the fourteenth century
and Amsterdam in the seventeenth, to cite only two examples, already show
the coexistence of the three modes of capital in interaction. Moreover,
other historians have recently shown that that specific form of industrial
production which we tend to identify as "truly capitalist", that is, assembly-line
mass production, was not born in economic organizations, but in military
ones, beginning in France in the eighteenth century, and then in the United
States in the nineteenth. It was military arsenals and armories that gave
birth to these particularly oppressive control techniques of the production
process, at least a hundred years before Henry Ford and his Model-T cars. {10}
Hence, the large firms that make up the antimarket can be seen as
replicators, much as animals and plants are. And in populations of such
replicators we should be able to observe the emergence of the different
commercial forms, from the family firm, to the limited liability partnership
to the joint stock company. These three forms, which had already emerged
by the fifteenth century, must be seen as arising, like those of animals
and plants, from slow accumulations of traits which later become consolidated
into more or less permanent structures, and not, of course, as the manifestation
of some pre-existing essence. In short, both animal and plant species as
well as "institutional species" are historical constructions, the emergence
of which bottom-up models can help us study. It is important to emphasize
that we are not dealing with biological metaphors here. Any kind of replicating
system which produces variable copies of itself, coupled with any kind
of sorting device, is capable of evolving new forms. This basic insight
is now exploited technologically in the so-called "genetic algorithm",
which allows programmers to breed computer software instead of painstakingly
coding it by hand. A population of computer programs is allowed to reproduce
with some variation, and the programmer plays the role of sorting device,
steering the population towards the desired form. The same idea is what
makes Artificial Life projects work. Hence, when we say that the forms
the antimarket has taken are evolved historical constructions we do not
mean to say that they are metaphorically like organic forms, but that they
are produced by a process which embodies the same engineering diagram as
the one which generates organic forms. Another example may help clarify
this. When one says, as leftists used to say, that "class-struggle is the
motor of history", one is using the word "motor" in a metaphorical way.
On the other hand, to say that a hurricane is a steam motor is not to use
the term metaphorically, but literally: one is saying that the hurricane
embodies the same engineering diagram as a steam motor: it uses a reservoir
of heat and operates via differences of temperature circulated through
a Carnot cycle. The same is true of the genetic algorithm. Anything that
replicates, such as patterns of behavior transmitted by imitation, or rules
and norms transmitted by enforced repetition can give rise to novel forms,
when populations of them are subjected to selection pressures. And the
traits that are thus accumulated can become consolidated into a permanent
structure by codification, as when informal routines become written rules. {11}
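A minimal sketch of this scheme (purely illustrative: the bit-strings, the target pattern and all parameters are arbitrary stand-ins for programs, routines or habits) shows a population of replicators copied with variation and steered by a sorting device:

    import random

    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # the "desired form"

    def fitness(ind):
        return sum(1 for a, b in zip(ind, TARGET) if a == b)

    def evolve(pop_size=60, generations=80, mutation_rate=0.02):
        pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)        # the sorting device
            parents = pop[:pop_size // 2]              # selection pressure
            children = [[1 - bit if random.random() < mutation_rate else bit
                         for bit in p]                 # replication with variation
                        for p in parents]
            pop = parents + children
        return max(pop, key=fitness)

    print(evolve())   # typically converges on, or near, the target pattern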
In this case, we have the diagram of a process which generates hierarchical
structures, whether large institutions rigidly controlled by their rules
or organic structures rigidly controlled by their genes. There are, however,
other structure-generating processes which result in decentralized assemblages
of heterogeneous components. Unlike a species, an ecosystem is not controlled
by a genetic program: it integrates a variety of animals and plants in
a food web, interlocking them together into what has been called a "meshwork
structure". The dynamics of such meshworks are currently under intense
investigation and something like their abstract diagram is beginning to
emerge. {12} From this research, it is becoming increasingly clear that
small markets, that is, local markets without too many middlemen, embody
this diagram: they allow the assemblage of human beings by interlocking
complementary demands. These markets are indeed, self-organized decentralized
structures: they arise spontaneously without the need for central planning.
As dynamic entities they have absolutely nothing to do with an "invisible
hand", since models based on Adam Smith's concept operate in a frictionless
environment in which agents have perfect rationality and all information
flows freely. Yet, by eliminating nonlinearities, these models preclude
the spontaneous emergence of order, which depends crucially on friction:
delays, bottlenecks, imperfect decision-making and so on. The concept of
a meshwork can be applied not only to the area of exchange, but also to
that of industrial production. Jane Jacobs has created a theory of the
dynamics of networks of small producers meshed together by their interdependent
functions, and has collected some historical evidence to support her claims.
The basic idea is that certain relatively backward cities in the past,
Venice when it was still subordinated to Byzantium, or the network New
York-Boston-Philadelphia when still a supply zone for the British empire,
engage in what she calls import-substitution dynamics. Because of their
subordinated position, they must import most manufactured products, and
export raw materials. Yet meshworks of small producers within the city,
by interlocking their skills, can begin to replace those imports with local
production, which can then be exchanged with other backward cities. In
the process, new skills and new knowledge are generated, and new products begin
to be imported, which in turn become the raw materials for a new round
of import-substitution. Nonlinear computer simulations have been created
of this process, and they confirm Jacobs' intuition: a growing meshwork
of skills is a necessary condition for urban morphodynamics. The meshwork
as a whole is decentralized, and it does not grow by planning, but by a
kind of creative drift. {13} Of course, this dichotomy between command
hierarchies and meshworks should not be taken too rigidly: in reality,
once a market grows beyond a certain size, it spontaneously generates a
hierarchy of exchange, with prestige goods at the top and elementary goods,
like food, at the bottom. Command structures, in turn, generate meshworks,
as when hierarchical organizations created the automobile and then a meshwork
of services (repair shops, gas stations, motels and so on) grew around
it. {14} More importantly, one should not romantically identify meshworks
with that which is "desirable" or "revolutionary", since there are situations
when they increase the power of hierarchies. For instance, oligopolistic
competition between large firms is sometimes kept away from price wars
by the system of interlocking directorates, in which representatives of
large banks or insurance companies sit on the boards of directors of these
oligopolies. In this case, a meshwork of hierarchies is almost equivalent
to a monopoly. {15} And yet, however complex the interaction between hierarchies
and meshworks, the distinction is real: the former create structures out
of elements sorted out into homogeneous ranks, while the latter articulate heterogeneous
elements as such, without homogenization. A bottom-up approach to economic
modeling should represent institutions as varying mixtures of command and
market components, perhaps in the form of combinations of negative feedback
loops, which are homogenizing, and positive feedback, which generates heterogeneity.
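The contrast between the two kinds of feedback can itself be sketched in a few lines (a toy model with arbitrary coefficients, standing for no particular institution):

    def step(values, neg=0.0, pos=0.0):
        # negative feedback pulls every value toward the common mean (homogenizing);
        # positive feedback amplifies whatever deviations already exist
        mean = sum(values) / len(values)
        return [v + neg * (mean - v) + pos * (v - mean) for v in values]

    values = [1.0, 1.1, 0.9, 1.05]
    for _ in range(50):
        values = step(values, neg=0.2)    # pure negative feedback
    print(values)                         # differences shrink: homogenization

    values = [1.0, 1.1, 0.9, 1.05]
    for _ in range(50):
        values = step(values, pos=0.05)   # pure positive feedback
    print(values)                         # differences grow: heterogeneity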
What would one expect to emerge from such populations of more or less centralized
organizations and more or less decentralized markets? The answer is, a
world-economy, or a large zone of economic coherence. The term, which should
not be confused with that of a global economy, was coined by Immanuel Wallerstein,
and later adapted by Braudel so as not to depend on a conception of history
in terms of a unilineal progression of modes of production. From Wallerstein
Braudel takes the spatial definition of a world-economy: an economically
autonomous portion of the planet, perhaps coexisting with other such regions,
with a definite geographical structure: a core of cities which dominate
it, surrounded by yet other economically active cities subordinated to
the core and forming a middle zone, and finally a periphery of completely
exploited supply zones. The role of core of the European world-economy
has been historically played by several cities: first Venice in the fourteenth
century, followed by Antwerp and Genoa in the fifteenth and sixteenth.
Amsterdam then dominated it for the next two centuries, followed by London
and then New York. Today, we may be witnessing the end of American supremacy
and the role of core seems to be moving to Tokyo. {16} Interestingly, those
cities which play the role of core seem to generate, in their populations
of firms, very few large ones. For instance, when Venice played this role,
no large organizations emerged in it, even though they already existed
in nearby Florence. Does this contradict the thesis that capitalism has
always been monopolistic? I think not. What happens is that, in this case,
Venice as a whole played the role of a monopoly: it completely controlled
access to the spice and luxury markets in the Levant. Within Venice, everything
seemed like "free competition", and yet its rich merchants enjoyed tremendous
advantages over any foreign rival, whatever its size. Perhaps this can
help explain the impression classical economists had of a competitive stage
of capitalism: when the Dutch or the British advocated "free competition"
internally is precisely when their cities as a whole held a virtual monopoly
on world trade. World-economies, then, present a pattern of concentric
circles around a center, defined by relations of subordination. Besides
this spatial structure, Wallerstein and Braudel add a temporal one: a world-economy
expands and contracts in a variety of rhythms of different lengths: from
short term business cycles to longer term Kondratiev cycles which last
approximately fifty years. While the domination by core cities gives a
world-economy its spatial unity, these cycles give it a temporal coherence:
prices and wages move in unison over the entire area. Prices are, of course,
much higher at the center than at the periphery, and this makes everything
flow towards the core: Venice, Amsterdam, London and New York, as they
took their turn as dominant centers, became "universal warehouses" where
one could find any product from anywhere in the world. And yet, while respecting
these differences, all prices moved up and down following these nonlinear
rhythms, affecting even those firms belonging to the antimarket, which
needed to consider those fluctuations when setting their own prices. These
self-organized patterns in time and space which define world-economies
were first discovered in analytical studies of historical data. The next
step is to use synthetic techniques and create the conditions under which
they can emerge in our models. In fact, bottom-up computer simulations
of urban economics where spatial and temporal patterns spontaneously emerge
already exist. For example, Peter Allen has created simulations of nonlinear
urban dynamics as meshworks of interdependent economic functions. Unlike
earlier mathematical models of the distribution of urban centers, which
assumed perfect rationality on the part of economic agents, and where spatial
patterns resulted from the optimal use of some resource such as transportation,
here patterns emerge from a dynamic of conflict and cooperation. As the
flows of goods, services and people in and out of these cities change,
some urban centers grow while others decay. Stable patterns of coexisting
centers arise as bifurcations occur in the growing city networks taking
them from attractor to attractor. {17} Something like Allen's approach
would be useful to model one of the two things that stitch world-economies
together, according to Braudel: trade circuits. However, to generate the
actual spatial patterns that we observe in the history of Europe, we need
to include the creation of chains of subordination among these cities,
of hierarchies of dependencies besides the meshworks of interdependencies.
This would need the inclusion of monopolies and oligopolies, growing out
of each city's meshworks of small producers and traders. We would also
need to model the extensive networks of merchants and bankers with which
dominant cities invaded their surrounding urban centers, converting them
into a middle zone at the service of the core. A dynamical system of trade
circuits, animated by import-substitution dynamics within each city, and
networks of merchants extending the reach of large firms of each city,
may be able to give us some insight into the real historical dynamics of
the European economy. {18} Bottom-up economic models which generate temporal
patterns have also been created. One of the most complex simulations in
this area is the System Dynamics National Model at MIT. Unlike econometric
simulations, where one begins at the macroeconomic level, this one is built
up from the operating structure within corporations. Production processes
within each industrial sector are modeled in detail. The decision-making
behind price setting, for instance, is modeled using the know-how from
real managers. The model includes many nonlinearities normally dismissed
in classical economic models, like delays, bottlenecks and the inevitable
friction due to bounded rationality. The simulation was not created with
the purpose of confirming the existence of the Kondratiev wave, the fifty-two
year cycle that can be observed in the history of wholesale prices for
at least two centuries. In fact, the designers of the model were unaware
of the literature on the subject. Yet, when the simulation began to unfold,
it reached a bifurcation and a periodic attractor emerged in the system,
which began pulsing to a fifty year beat. The crucial element in this dynamics
seems to be the capital goods sector, the part of the industry that creates
the machines that the rest of the economy uses. Whenever an intense rise
in global demand occurs, firms need to expand and so need to order new
machines. But when the capital goods sector in turn expands to meet this
demand it needs to order from itself. This creates a positive feedback
loop that pushes the system towards a bifurcation. {19} Insights coming
from running simulations like these can, in turn, be used to build other
simulations and to suggest directions for historical research to follow.
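By way of illustration only (this is not the MIT model but a textbook toy, a Samuelson-style multiplier-accelerator recursion with arbitrary coefficients), one can see how investment that responds to changes in demand, the kind of self-reinforcing ordering just described, generates endogenous swings on its own:

    def simulate(periods=60, c=0.75, v=1.2, autonomous=10.0):
        Y = [100.0, 105.0]                               # two initial output levels
        for t in range(2, periods):
            consumption = c * Y[t - 1]
            investment = v * c * (Y[t - 1] - Y[t - 2])   # orders respond to changes in demand
            Y.append(autonomous + consumption + investment)
        return Y

    for t, y in enumerate(simulate()):
        if t % 5 == 0:
            print(t, round(y, 1))   # output overshoots and undershoots in recurring swings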
We can imagine parallel computers in the near future running simulations
combining all the insights from the ones we just discussed: spatial networks
of cities, breathing at different rhythms, and housing evolving populations
of organizations and meshworks of interdependent skills. If power relations
are included, monopolies and oligopolies will emerge and we will be able
to explore the genesis and evolution of the antimarket. If we include the
interactions between different forms of organizations, then the relationships
between economic and military institutions may be studied. As Galbraith
has pointed out, in today's economy nothing goes more against the market, nothing
is a better representative of the planning system, as he calls it, than
the military-industrial complex. {20} But we would be wrong in thinking
that this is a modern phenomenon, something caused by "late capitalism".
In the first core of the European world-economy, thirteenth century Venice,
the alliance between monopoly power and military might was already in evidence.
The Venetian arsenal, where all the merchant ships were built, was the
largest industrial complex of its time. We can think of these ships as
the fixed capital, the productive machinery of Venice, since they were
used to do all the trade that kept her powerful; but at the same time,
they were military machines used to enforce her monopolistic practices.
{21} When the turn of Amsterdam and London came to be the core, the famous
East India Companies with which they conquered the Asian world-economy,
transforming it into a periphery of Europe, were also hybrid military-economic
institutions. We have already mentioned the role that French armories and
arsenals in the eighteenth century, and American ones in the nineteenth,
played in the birth of mass production techniques. Frederick Taylor, the
creator of the modern system for the control of the labor process, learned
his craft in military arsenals. That nineteenth century radical economists
did not understand this hybrid nature of the antimarket can be seen from
the fact that Lenin himself welcomed Taylorism into revolutionary Russia
as a progressive force, instead of seeing it for what it was: the imposition
of a rigid command-hierarchy on the workplace. {22} Unlike these thinkers,
we should include in our simulations all the institutional interactions
that historians have uncovered, to correctly model the hybrid economic-military
structure of the antimarket. Perhaps by using these synthetic models as
tools of exploration, as intuition synthesizers, so to speak, we will also
be able to study the feasibility of counteracting the growth of the antimarket
by a proliferation of meshworks of small producers. Multinational corporations,
according to the influential theory of "transaction-costs", grow by swallowing
up meshworks, by internalizing markets either through vertical or horizontal
integration. {23} They can do this thanks to their enormous economic power
(most of them are oligopolies), and to their having access to intense economies
of scale. However, meshworks of small producers interconnected via computer
networks could have access to different, yet equally intense, economies of scale.
A well studied example is the symbiotic collection of small textile firms
that has emerged in an Italian region between Bologna and Venice. The operation
of a few centralized textile corporations was broken down into a decentralized
network of firms, in which entrepreneurs replace managers and short runs
of specialized products replace large runs of mass-produced ones. Computer
networks allow these small firms to react flexibly to sudden shifts in
demand, so that no firm becomes overloaded while others sit idly with spare
capacity. {24} But more importantly, a growing pool of skills is thereby
created, and because this pool has not been internalized by a large corporation,
it can not be taken away. Hence this region will not suffer the fate of
so many American company towns, which die after the corporation that feeds
them moves elsewhere. These self-organized reservoirs of skills also explain
why economic development cannot be exported to the third world via large
transfers of capital invested in dams or other large structures. Economic
development must emerge from within as meshworks of skills grow and proliferate.
{25} Computer networks are an important element here, since the savings
in coordination costs that multinational corporations achieve by internalizing
markets can be enjoyed by small firms through the use of decentralizing
technology. Computers may also help us to create a new approach to control
within these small firms. The management approach used by large corporations
was in fact developed during World War II under the name of Operations
Research. Much as mass production techniques effected a transfer of a command
hierarchy from military arsenals to civilian factories, management practices
based on linear analysis carry with them the centralizing tendencies of
the military institutions where they were born. Fresh approaches to these
questions are now under development by nonlinear scientists, in which the
role of managers is not to impose preconceived plans on workers, but to
catalyze the emergence of meshworks of decision-making processes among
them. {26} Computers, in the form of embedded intelligence in the buildings
that house small firms, can aid this catalytic process, allowing the firm's
members to reach some measure of self-organization. Although these efforts
are in their infancy, they may one day play a crucial role in adding some
heterogeneity to a world-economy that's becoming increasingly homogenized.
FOOTNOTES: {1} Ilya Prigogine and Isabelle Stengers. Order out of Chaos.
(Bantam Books, New York 1984). p.169. {2} Stuart A. Kauffman. The Origins
of Order. Self Organization and Selection in Evolution. (Oxford Univ. Press,
New York 1993) p.280 {3} Christopher G. Langton. Artificial Life. In C.G.
Langton ed. Artificial Life. (Addison-Wesley, 1989) p.2 {4} Geoff Hodgson.
Critique of Wright 1: Labour and Profits. In Ian Steedman ed. The Value
Controversy. (Verso, London 1981). p.93 {5} John Kenneth Galbraith. The
New Industrial State. (Houghton Mifflin, Boston 1978) p.24 {6} Fernand
Braudel.
Civilization and Capitalism, 15th-18th Century. Vol 2. (Harper and Row,
New York 1982) p.229 {7} ibid. p.559-561 {8} William H. McNeill. The Pursuit
of Power. (University of Chicago Press, 1982) p.49 {9} Merritt Roe Smith.
Army Ordnance and the "American system" of Manufacturing, 1815-1861. In
M.R.Smith ed. Military Enterprise and Technological Change. (MIT Press,
1987) p.47 {10} Richard Nelson and Sidney Winter. An Evolutionary Theory
of Economic Change. (Belknap Press, Cambridge Mass 1982) p.98 {11} Richard
Dawkins. The Selfish Gene. (Oxford University Press, New York 1989) ch.11
{12} Stuart Kauffman. The Evolution of Economic Webs. In Philip Anderson,
Kenneth Arrow and David Pines eds. The Economy as an Evolving Complex System.
(Addison-Wesley, 1988) {13} Jane Jacobs. Cities and the Wealth of Nations.
(Random House, New York 1984) p.133 {14} The dichotomy Meshwork/Hierarchy
is a special case of what Deleuze and Guattari call Smooth/Striated or
Rhizome/Tree. Gilles Deleuze and Felix Guattari. 1440: The Smooth and the
Striated. In A Thousand Plateaus. (University of Minnesota Press, Minneapolis
1987) ch.14 {15} John R. Munkirs and James I. Sturgeon. Oligopolistic Cooperation:
Conceptual and Empirical Evidence of Market Structure Evolution. In Marc.
R. Tool and Warren J. Samuels eds. The Economy as a System of Power. (Transaction
Press, New Brunswick 1989). p.343 {16} Fernand Braudel. op. cit. Vol 3.
p.25-38 {17} Peter M. Allen. Self-Organization in the Urban System. In
William C. Schieve and P.M.Allen eds. Self-Organization and Dissipative
Structures: Applications in the Physical and the Social Sciences. (University
of Texas, Austin 1982) p.136 {18} Fernand Braudel. op. cit. Vol 3. p.140-167
{19} J.D. Sterman. Nonlinear Dynamics in the World Economy: the Economic
Long Wave. In Peter Christiansen and R.D. Parmentier eds. Structure, Coherence
and Chaos in Dynamical Systems. (Manchester Univ. Press, Manchester 1989)
{20} John Galbraith. op. cit. p. 321 {21} Fernand Braudel. op. cit. Vol
2 p. 444 {22} Vladimir Lenin. The Immediate Tasks of the Soviet Government.
Collected Works, Vol 27 (Moscow 1965). {23} Jean-Francois Hennart. The
Transaction Cost Theory of the Multinational Enterprise. In Christos Pitelis
and Roger Sugden eds. The Nature of the Transnational Firm. (Routledge,
London 1991). {24} Thomas W. Malone and John F. Rockart. Computers, Networks
and the Corporation. In Scientific American Vol 265 Number 3 p.131 Also:
Jane Jacobs, op. cit. p.40 Fernand Braudel, op cit Vol 3 p. 630 {25} Jane
Jacobs. op. cit. p.148 {26} F. Malik and G. Probst. Evolutionary Management.
In H.Ulrich and G. Probst eds. Self-Organization and Management of Social
Systems. (Springer Verlag, Berlin 1984) p. 113 =============== Manuel DeLanda,
writer and artist, has published, among other works, War in the Age of
Intelligent Machines and A Thousand Years of Nonlinear History.
by Manuel DeLanda
One constant in the history of Western philosophy seems to be a
certain conception of matter as an inert receptacle for forms that come
from the outside. In other words, the genesis of form and structure seems
to always involve resources that go beyond the capabilities of the material
substratum of these forms and structures. In some cases, these resources
are explicitly transcendental, eternal essences defining forms which are
imposed on infertile materials. The clearest example of this theory of
form is, of course, religious Creationism, in which form begins as an idea
in God's mind, and is then imposed by a command on an obedient and docile
matter. But more serious examples also exist. In ancient philosophies Aristotle's
essences seem to fit this pattern, as do those that inhabit Platonist heavens.
And although classical physics began with a clean break with Aristotelian
philosophy, and did endow matter with some spontaneous behavior (e.g. inertia),
it reduced the variability and richness of material expression to the concept
of mass, and studied only the simplest material systems (frictionless planetary
dynamics, ideal gases) where spontaneous self-generation of form does not
occur, thus always keeping some transcendental agency hidden in the background.
Yet, as Gilles Deleuze has shown in his work on Spinoza, not every Western
philosopher has taken this stance. In Spinoza, Deleuze discovers another
possibility: that the resources involved in the genesis of form are not
transcendental but immanent to matter itself. A simple example should suffice
to illustrate this point. The simplest type of immanent resource for morphogenesis
seems to be endogenously-generated stable states. Historically, the first
such states to be discovered by scientists studying the behavior of matter
(gases) were energy minima (or correspondingly, entropy maxima). The spherical
form of a soap bubble, for instance, emerges out of the interactions among
its constituent molecules as these are constrained energetically to "seek"
the point at which surface tension is minimized. In this case, there is
no question of an essence of "soap-bubbleness" somehow imposing itself
from the outside, an ideal geometric form (a sphere) shaping an inert collection
of molecules. Rather, an endogenous topological form (a point in the space
of energetic possibilities for this molecular assemblage) governs the collective
behavior of the individual soap molecules, and results in the emergence
of a spherical shape. Moreover, the same topological form, the same minimal
point, can guide the processes that generate many other geometrical forms.
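The idea can be sketched numerically (with a made-up one-dimensional "energy" function standing in for surface tension or bonding energy): the minimum acts as an endogenous attractor, and very different initial configurations all end up at the same point.

    def descend(energy_gradient, x, steps=5000, rate=0.01):
        # follow the energy downhill until a minimum is reached
        for _ in range(steps):
            x -= rate * energy_gradient(x)
        return x

    grad = lambda x: 2 * (x - 2.0)    # gradient of the energy E(x) = (x - 2)**2
    print([round(descend(grad, x0), 3) for x0 in (-10.0, 0.5, 7.0)])   # all reach 2.0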
For example, if instead of molecules of soap we have the atomic components
of an ordinary salt crystal, the form that emerges from minimizing energy
(bonding energy in this case) is a cube. In other words, one and the same
topological form can guide the morphogenesis of a variety of geometrical
forms. A similar point applies to other topological forms which inhabit
these spaces of energetic possibilities. For example, these spaces may
contain closed loops (technically called "limit cycles" or "periodic attractors").
In this case the several possible physical instantiations of this space
will all display isomorphic behavior: an endogenously generated tendency
to oscillate in a stable way. Whether one is dealing with a socio-technological
structure (such as a radio transmitter or a radar machine), a biological
one (a cyclic metabolism), or a physical one (a convection cell in the
atmosphere), it is one and the same immanent resource that is involved
in their different oscillating behavior. Since this is a crucial issue
in Deleuze's philosophy let me explain this point in a little more detail.
Deleuze calls this ability of topological forms to give rise to many different
physical instantiations, a process of "divergent actualization", taking
the idea from French philosopher Henri Bergson who, at the turn of the
century, wrote a series of texts where he criticized the inability of the
science of his time to think the new, the truly novel. The first obstacle
was, according to Bergson, a mechanical and linear view of causality and
the rigid determinism that it implied. Clearly, if all the future is already
given in the past, if the future is merely that modality of time where
previously determined possibilities become realized, then true innovation
is impossible. To avoid this mistake, he thought, we must struggle to model
the future as truly open ended, and the past and the present as pregnant
not only with possibilities which become real, but with virtualities which
become actual. The distinction between the possible and the real assumes
a set of predefined forms (or essences) which acquire physical reality as
material forms that resemble them. From the morphogenetic point of view,
realizing a possibility does not add anything to a predefined form, except
reality. The distinction between the virtual and the actual, on the other
hand, does not involve resemblance of any kind (e.g. our example above,
in which a topological point becomes a geometrical sphere) and far from
constituting the essential identity of a form, it subverts identity, since
now forms as different as spheres and cubes emerge from the same topological
point. To quote from what is probably his most important book, "Difference
and Repetition": "Actualization breaks with resemblance as a process no
less than it does with identity as a principle. In this sense, actualization
or differenciation is always a genuine creation." And Deleuze goes on to
discuss processes of actualization more complex than bubbles or crystals,
processes such as embryogenesis, the development of a fully differenciated
organism starting from a single cell. In this case, the space of energetic
possibilities is more elaborate, involving many topological forms governing
complex spatio-temporal dynamisms: "How does actualization occur in things
themselves?...Beneath the actual qualities and extensities [of things themselves]
there are spatio-temporal dynamisms. They must be surveyed in every domain,
even though they are ordinarily hidden by the constituted qualities and
extensities. Embryology shows that the division of the egg is secondary
in relation to more significant morphogenetic movements: the augmentation
of free surfaces, stretching of cellular layers, invagination by folding,
regional displacement of groups. A whole kinematics of the egg appears
which implies a dynamic". In "Difference and Repetition", Deleuze repeatedly
makes use of these "spaces of energetic possibilities" (technically referred
to as "state spaces" or "phase spaces"), and of the topological forms (or
"singularities") that shape these spaces. Since these ideas reappear in
his later work, and since both the concept of "phase space" and that of
"singularity" belong to mathematics, it is safe to say that a crucial component
of Deleuzian thought comes from the philosophy of mathematics. And, indeed,
chapter four of "Difference and Repetition" is a meditation on the metaphysics
of the differential and integral calculus. On the other hand, given that
"phase spaces" and "singularities" become physically significant only in
relation to material systems which are traversed by a strong flow of energy,
Deleuze's philosophy is also intimately related to that branch of physics
which deals with material and energetic flows, that is, with thermodynamics.
And, indeed, chapter five of "Difference and Repetition" is a philosophical
critique of nineteenth century thermodynamics, an attempt to recover from
that discipline some of the key concepts needed for a theory of immanent
morphogenesis. At the beginning of that chapter, Deleuze introduces some
key distinctions that will figure prominently in his later work, specifically
the concept of "intensity", but more importantly, he reveals in the very
first page his ontological commitments. It is traditional since Kant to
distinguish between the world as it appears to us humans, that is, the
world of phenomena or appearances, and the world as it exists by itself,
regardless of whether there is a human observer to interact with it. This
world "in itself" is referred to as "noumena". A large number of contemporary
thinkers, particularly those that call themselves "postmodernists", do
not believe in noumena. For them the world is socially constructed, hence,
all it contains is linguistically-defined phenomena. Notice that even though
many of these thinkers declare themselves "anti-essentialist", they share
with essentialism a view of matter as an inert material, only in their
case form does not come from a Platonic heaven, or from the mind of God,
but from the minds of humans (or from cultural conventions expressed linguistically).
The world is amorphous, and we cut it out into forms using language. Nothing
could be further from Deleuzian thought than this postmodern linguistic
relativism. Deleuze is indeed a realist philosopher, who not only believes
in the autonomous existence of actual forms (the forms of rocks, plants,
animals and so on) but also in the existence of virtual forms. In the first
few lines of chapter five of "Difference and Repetition", where Deleuze
introduces the notion of "intensity" as a key to understand the actualization
of virtual forms, he writes: "Difference is not diversity. Diversity is
given, but difference is that by which the given is given...Difference
is not phenomenon but the noumenon closest to the phenomenon...Every phenomenon
refers to an inequality by which it is conditioned...Everything which happens
and everything which appears is correlated with orders of differences:
differences of level, temperature, pressure, tension, potential, difference
of intensity". {2} Let me illustrate this idea with a familiar example
from thermodynamics. If one creates a container separated into two compartments,
and one fills one compartment with cold air and the other with hot air,
one thereby creates a system embodying a difference in intensity, the intensity
in this case being temperature. If one then opens a small hole in the wall
dividing the compartments, the intensity difference causes the onset of
a spontaneous flow of air from one side to the other. It is in this sense
that intensity differences are morphogenetic, even if in this case the
form that emerges is too simple. The examples above of the soap bubble
and the salt crystal, as well as the more complex foldings and stretchings
undergone by an embryo, are generated by similar principles. However, in
the page following the quote above, Deleuze argues that, despite this important
insight, nineteenth century thermodynamics cannot provide the foundation
he needs for a philosophy of matter. Why? Because that branch of physics
became obsessed with the final equilibrium forms, at the expense of the
difference-driven morphogenetic process which gives rise to those forms.
But as Deleuze argues, the role of virtual singularities can only be grasped
during the process of morphogenesis, that is, before the final form is
actualized, before the difference disappears. This shortcoming of nineteenth
century thermodynamics, to overlook the role of intensity differences in
morphogenesis, to concentrate on the equilibrium form that emerges only
once the original difference has been cancelled, has today been repaired
in the latest version of this branch of physics, appropriately labeled
"far-from-equilibrium thermodynamics". Although Deleuze does not explicitly
refer to this new branch of science, it is clear that far-from-equilibrium
thermodynamics meets all the objections which he raises against its nineteenth
century counterpart. In particular, the systems studied in this new discipline
are continuously traversed by a strong flow of energy and matter, a flow
which does not allow the differences in intensity to be cancelled, that
is, maintains these differences and keeps them from cancelling themselves.
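A minimal numerical sketch of this contrast (the parameters are assumed, nothing more): two compartments exchange heat at a rate proportional to their temperature difference; left alone the intensity difference cancels itself, while a continuous through-flow of energy keeps the system away from equilibrium.

    def run(drive=0.0, steps=2000, dt=0.01, k=0.5):
        hot, cold = 80.0, 20.0
        for _ in range(steps):
            flow = k * (hot - cold)        # heat flows down the intensity gradient
            hot += dt * (-flow + drive)    # 'drive' keeps heating one side...
            cold += dt * (flow - drive)    # ...and cooling the other
        return hot - cold

    print(run(drive=0.0))    # difference near zero: at equilibrium the difference cancels itself
    print(run(drive=15.0))   # difference maintained near 30: far from equilibrium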
It is only in these far-from-equilibrium conditions that the full variety
of immanent topological forms appears (steady state, cyclic or chaotic
attractors). It is only in this zone of intensity that difference-driven
morphogenesis comes into its own, and that matter becomes an active material
agent, one which does not need form to come and impose itself from the
outside. To return once more to the example of the developing embryo, the
DNA that governs the process does not contain, as was once believed,
a blueprint for the generation of the final form of the organism, an idea
that implies an inert matter to which genes give form from the outside.
The modern understanding of the process, on the other hand, pictures
genes as teasing a form out of an active matter; that is, the function
of genes and their products is now seen as merely constraining and channeling
a variety of material processes, occurring in that far-from-equilibrium zone,
in which form emerges spontaneously. To complete my characterization of
Deleuze's theory of the genesis of form, I would like to explore the way
in which his more recent work (in collaboration with Felix Guattari) has
extended these basic ideas, greatly increasing the kind of immanent resources
that are available to matter for the creation of form. In particular, in
their joint book "A Thousand Plateaus", they develop theories of the genesis
of two very important types of structures, to which they refer with the
terms "strata" and "self-consistent aggregates" (or alternatively "trees"
and "rhizomes"). Basically, strata emerge from the articulation of homogeneous
elements, whereas self-consistent aggregates emerge from the articulation
of heterogeneous elements as such. {3} Both processes display the same
"divergent actualization" which characterized the simpler processes behind
the formation of soap bubbles and salt crystals. In other words, in both
processes we have a virtual form (or abstract machine, as they now call
it) underlying the isomorphism of the resultant actual forms. Let's begin
by briefly describing the process behind the genesis of geological strata,
or more specifically, of sedimentary rock, such as sandstone or limestone.
When one looks closely at the layers of rock in an exposed mountain side,
one striking characteristic is that each layer contains further layers,
each composed of small pebbles which are nearly homogeneous with respect
to size, shape and chemical composition. It is these layers that are referred
to as "strata". Now, given that pebbles in nature do not come in standard
sizes and shapes, some kind of sorting mechanism seems to be needed to
explain this highly improbable distribution, some specific device which
takes a multiplicity of pebbles of heterogeneous qualities and distributes
them into more or less uniform layers. One possibility uncovered by geologists
involves rivers acting as sorting machines. Rivers transport rocky materials
from their point of origin to the place in the ocean where these materials
will accumulate. In this process, pebbles of variable size, weight and
shape tend to react differently to the water transporting them. These different
reactions to moving water are what sorts out the pebbles, with the small
ones reaching the ocean sooner than the large ones. This process is called
"sedimentation". Besides sedimentation, a second operation is necessary
to transform these loose collections of pebbles into a larger scale entity:
a sedimentary rock. This operation consists in cementing the sorted components,
an operation carried out by certain substances dissolved in water which
penetrate the sediment through the pores between pebbles. As this percolating
solution crystallizes, it consolidates the pebbles' temporary spatial relations
into a more or less permanent "architectonic" structure. This double articulation,
sorting and consolidation, can also be found in biological species. Species
form through the slow accumulation of genetic materials. Genes, of course,
do not merely deposit at random but are sorted out by a variety of selection
pressures which include climate, the action of predators and parasites
and the effects of male or female choice during mating. Thus, in a very
real sense, genetic materials "sediment" just as pebbles do. Furthermore,
these loose collections of genes can (like sedimented pebbles) be lost
under some drastically changed conditions (such as the onset of an Ice
age) unless they become consolidated together. This second operation is
performed by "reproductive isolation", that is, by the closure of a gene
pool which occurs when a given subset of a reproductive community becomes
incapable of mating with the rest. Through selective accumulation and isolative
consolidation, a population of individual organisms comes to form a larger
scale entity: a new individual species. We can also find these two operations
(and hence, this virtual diagram) in the formation of social classes. Roughly,
we speak of "social strata" whenever a given society presents a variety
of differentiated roles to which not everyone has equal access, and when
a subset of those roles (i.e. those to which a ruling elite alone has access)
involves the control of key energetic and material resources. In most societies
roles tend to "sediment" through a variety of sorting or ranking mechanisms,
yet not in all of them do ranks become an autonomous dimension of social organization.
In many societies differentiation of the elites is not extensive (they
do not form a center while the rest of the population forms an excluded
periphery), surpluses do not accumulate (they may be destroyed in ritual
feasts), and primordial relations (of kin and local alliances) tend to
prevail. Hence a second operation is necessary: the informal sorting criteria
need to be given a theological interpretation and a legal definition. In
short, to transform a loose, ranked accumulation of traditional roles into
a social class, the social sediment needs to become consolidated via
theological and legal codification. {8} Is there also a virtual diagram
behind the genesis of meshworks? In the model proposed by Deleuze and Guattari,
there are three elements in this other virtual diagram, of which two are
particularly important. First, a set of heterogeneous elements is brought
together via an articulation of superpositions, that is, an interconnection
of diverse but overlapping elements. And second, a special class of operators,
or intercalary elements, is needed to effect this interlock via local
connections. Is it possible to find instances of this diagram in geology,
biology and sociology? Perhaps the clearest example is that of an ecosystem.
While a species may be a very homogeneous structure, an ecosystem links
together a wide variety of heterogeneous elements (animals and plants of
different species) which are articulated through interlock, that is, by
their functional complementarities. Since one of the main features of ecosystems
is the circulation of energy and matter in the form of food, the complementarities
in question are alimentary: prey-predator or parasite-host being two of
the most common. In this situation, symbiotic relations can act as intercalary
elements aiding the process of building food webs by establishing local
couplings. Examples include the bacteria that live in the guts of many
animals allowing them to digest their food, or the fungi and other microorganisms
which form the "rhizosphere", the underground food chains which interconnect
plant roots and soil. The world of geology also has actualizations of these
virtual operations, a good example being that of igneous rocks. Unlike
sandstone, igneous rocks such as granite are not the result of sedimentation
and cementation, but the product of a very different construction process
forming directly out of cooling magma. As magma cools down its different
elements begin to separate as they crystallize in sequence, those that
solidify earlier serving as containers for those which acquire a crystal
form later. In these circumstances the result is a complex set of heterogeneous
crystals which interlock with one another, and this is what gives granite
its superior strength. Here the intercalary elements include anything
which brings about local articulations from within the crystals, including
nucleation centers and certain line defects called dislocations, as well
as local articulation between crystals, such as events occurring at the
interface between liquids and solids. Thus, granite may be said to be an
instance of a meshwork. In the socio-economic sphere, pre-capitalist markets
may be considered examples of cultural meshworks. In many cultures weekly
markets have traditionally been the meeting place for people with heterogeneous
needs and offers. Markets connect people by matching complementary demands,
that is, by interlocking them on the basis of their needs and offers. Money
(even primitive money such as salt blocks or cowry shells) may be said
to perform the function of intercalary element: while with pure barter
the possibility of two exactly matching demands meeting by chance is very
low, when money is present those chance encounters become unnecessary,
and complementary demands may find each other at a distance, so to speak.
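A toy sketch of this last point (the goods, the number of agents and the matching rules are invented for the purpose): under pure barter a trade requires a double coincidence of complementary demands, while a monetary intermediary lets an agent sell to one partner and buy from another.

    import random

    GOODS = ["salt", "grain", "cloth", "pots", "fish"]

    def make_agents(n=30):
        return [dict(zip(("offer", "need"), random.sample(GOODS, 2))) for _ in range(n)]

    def barter_pairs(agents):
        # pairs whose offers and needs interlock directly
        return sum(1 for a in agents for b in agents
                   if a is not b and a["offer"] == b["need"]
                   and b["offer"] == a["need"]) // 2

    def monetary_matches(agents):
        # an agent is satisfiable if someone needs its offer and someone offers its need
        offered = {a["offer"] for a in agents}
        needed = {a["need"] for a in agents}
        return sum(1 for a in agents if a["offer"] in needed and a["need"] in offered)

    agents = make_agents()
    print("direct barter pairs:", barter_pairs(agents))
    print("agents satisfiable through money:", monetary_matches(agents))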
Thus, much as sandstone, animal species and social classes may be said
to be divergent actualizations of a virtual process of "double articulation"
which brings homogeneous components together, granite, ecosystems and markets
are actualizations of a virtual process which links heterogeneous elements
through interlock and intercalation. These virtual processes are, according
to Deleuze, perfectly real, a real virtuality which has nothing to do with
what we call virtual reality. And yet, because this real virtuality constitutes
the noumenal machinery behind the phenomena, that is, behind reality as
it appears to us humans, because this real virtuality governs the genesis
of all real forms, it cannot help but be related to virtual realities,
not only those created by computer simulations, but also by novelists,
filmmakers, painters and musicians. Deleuze's work is, from the beginning,
concerned as much with physics and mathematics as it is with art. But,
it seems to me, only when we understand the Deleuzian world of material
and energetic flows, and the forms that emerge spontaneously in these flows,
can we begin to ask "what is a novel or a painting or a piece of music"
in this world? In other words, the movement should be from a rich material
world pregnant with virtualities, to literature or art, and not from literature
(and texts, discourses, metaphors) to a socially constructed world where
matter has, once again, become an inert receptacle for external forms. It
is in this sense that Deleuze's work constitutes a true challenge to language-obsessed
postmodernism, a neomaterialism which promises to enrich the conceptual
reservoirs of both science and art and that one day could lead to a complete
reconceptualization of our history as well as of our alternatives for the
future. ========== and ../popefuller.htm WARNING: This Computer Has Multiple
Personality Disorder by Simon Pope and Matthew Fuller Pandemonium is the
complete system of Lemurian demonism and time sorcery. It consists of two
principal components: Numogram (time-map) and Matrix (listing the names,
numbers and attributes of the demons). The system is constructed according
to immanent criteria latent in decimal numeracy, and involves only basic
arithmetical operations (assembled from additions and subtractions). The
Numogram, or Decimal Labyrinth, is composed of ten zones (numbered 0-9)
and their interconnections. These zones are grouped into five pairs (syzygies)
by nine-sum twinning [zygonovism]. The arithmetical difference of each
syzygy defines a current (or connection to a tractor zone). Currents constitute
the primary flows of the numogram. Each zone number when digitally cumulated
defines the value of a gate, whose reduction sets the course of a corresponding
channel. Channels constitute the secondary flows, time-holes, or secret
interconnections of the numogram. The arrangement of currents divides the
Maze into three basic time-systems. Firstly, the currents of the three
central syzygies mutually compose a cycle, rotating in anticlockwise steps.
Lemurian sorcery calls this inner loop the Time-Circuit. Secondly, and
thirdly, in both the Upper and the Lower syzygies the currents produced
fold back into (a half of) themselves, constituting autonomous loops: the
Warp (upper), and Plex (lower). Warp and Plex circuitries are of an intrinsically
cryptic nature, which is compounded by the enigmas of their interconnection.
They are variously considered to be Outside- or Outer-time. The gates and
their channels knit the Maze together, providing connections between otherwise
incompatible time-systems. They open and close the ways of sorcerous traffic.
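These constructions can be restated as a small computation, under two assumed readings which the text leaves open: that "digital cumulation" of a zone-number n means the running sum 0 + 1 + ... + n, and that "reduction" means summing decimal digits until a single digit remains; the nine-sum twinning and the currents as syzygetic differences are taken directly from the description above.

    def reduce_digits(n):
        while n > 9:
            n = sum(int(d) for d in str(n))
        return n

    def cumulate(n):
        return sum(range(n + 1))    # assumed reading of "digital cumulation"

    zones = range(10)
    syzygies = [(9 - z, z) for z in range(5)]                   # nine-sum twinning
    currents = [a - b for a, b in syzygies]                     # 9, 7, 5, 3, 1
    gates = {z: cumulate(z) for z in zones}                     # e.g. zone 9 -> 45, zone 7 -> 28
    channels = {z: reduce_digits(g) for z, g in gates.items()}  # e.g. zone 9 -> 9, zone 7 -> 1

    print(syzygies, currents, gates, channels)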
Although each gate deranges time in its own way, their operations vary
with a certain regional consistency. 1. Numogram and Otz Chaiim. To those
familiar with the Western Magical Tradition, it is likely that the Numogram
will initially evoke the Qabbalistic Tree of Life. Both are constructed
as decimal diagrams, involving webs of connectivity between ten basic zones,
mysteriously twisted into a cryptic ultra-cycle (that links upper and lower
regions). Both treat names as numbers, and numerize by digital reduction
and cumulation. Both include passages across abysmal waters and through
infernal regions. Both map zones onto spinal levels. Despite these manifold
interlinkages, there are compelling reasons to consider the Tree of Life
a scrambled variant of the Numogram, rather than a parallel system. During
its long passage through Atlantean and post-Atlantean hermetic traditions
the systematic distortions of the Numogram (introduced to confuse the uninitiated)
gradually hardened into erroneous doctrines, and a dogmatic image of the
Tree. Most evidently, a vulgar distribution of the numbers - in their exoteric
counting-order - was substituted (redundantly) for the now esoteric numogrammatical
distribution, which proceeds in accordance with immanent criteria (the
web emerging qabbalistically from the zone-numbers themselves). More devastatingly,
the original consistency of numeracy and language seems to have been fractured
at an early stage, introducing a division between the number of the Sephiroth
(10) and that of the Hebrew alphabet (22). The result was a break between
the nodes of the tree and the interconnecting paths, ruining all prospect
of decipherment. The Sephiroth -segmented over-aganist their connections
- become static and structural, whilst the paths lose any rigorous principle
of allocation. A strictly analogous outcome is evident in the segmentation
of the Tarot into Major and Minor Arcana. Increasingly desperate, arbitrary,
and mystifying attempts to re-unite the numbers and their linkages seem
to have bedevilled all succeeding occult traditions. 2. Numogram and I
Ching. There is considerable evidence, both immanent and historical, that
the Chinese I Ching and the Nma numogram share a hypercultural matrix.
Both are associated with intricate zygonomies, or double-numbering systems,
and process abstract problematics involving subdivisions of decimal arrays
(as suggested by the Ten Wings of traditional I Ching commentary). Digital
reduction of binary powers stabilizes in a six-step cycle (with the values
1, 2, 4, 8, 7, 5). These steps correspond to the lines of the hexagram,
and to the time-circuit zones of the Numogram, producing a binodecimal
6-Cycle (which is also generated in reverse by quintuplicative numbering).
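This six-step cycle, and the nine-sum pairing of its terms noted in the next sentence, can be verified directly; a minimal sketch (Python, illustrative only, again reading 'digital reduction' as the iterated digit-sum):

def digit_reduce(n):
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

cycle = [digit_reduce(2 ** k) for k in range(12)]
print(cycle)    # [1, 2, 4, 8, 7, 5, 1, 2, 4, 8, 7, 5]

# Zygonovic (nine-sum) twinning of the reduced values: 8:1, 7:2, 5:4.
print([(a, 9 - a) for a in cycle[:6] if a > 9 - a])    # [(8, 1), (7, 2), (5, 4)]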
In both cases a supplementary rule of pairing is followed, according to
a zygonovic criterion (9-twinning of reduced values: 8:1, 7:2, 5:4, mapping
the hexagram line pairs). The numogram time-circuit, or I Ching hexagram,
implicitly associates zero with the set of excluded triadic values. It is
intriguing in this respect that numerous indications point to an early
struggle between triadic and binary numbering practices in ancient Chinese
culture, suggesting that the binary domination of decimal numeracy systematically
produces a triadic residue consistent with nullity. The hexagram itself
exhibits obvious tension in this respect, since it reinserts a triadic
hyperfactor into the reduced binodigital set (compounded by its summation
to twenty-seven, or the third power of three). An ancient binotriadic parallel
to the I Ching, called the T'ai Hsuan Ching (or Book of the Great Dark)
consisted of eighty-one tetragrams, reversing the relation of foregrounded
and implicit numerical values. The division of Lao Tse's Tao Te Ching into
eighty-one sections suggests that this numerical conflict was an animating
factor in the early history of Taoism. 3. Ethnography of the Nma. Nma culture
cannot be decoded without the key provided by the Lemurian Time-Maze. The
influence of a hyper triadic criterion of time is evident in the relics
of Nma kinship organization, calendrics, and associated rituals. Prior
to the calamity of 1883, the Nma consisted of true tribes (tripartite macrosocial
divisions). They were distributed in a basic tridentity (interlocking large-scale
groupings into Tak- Mu- and Dib-Nma), supported by a triangular patrilocal
marriage-cycle. Each marriage identified a woman with a numogram current,
or time-passage. (Tak-Nma women marrying into the Mu-Nma, Mu-Nma ditto
Dib-Nma, Dib-Nma ditto Tak-Nma). The common calendar of all three tribes
was based upon a zygotriadic system (using 6 digits to divide a double-year
period of 729 days into fractional powers of three). The Mu-Nma still employ
such a calendar today. (The current Mu-Nma calendar is adjusted by regular
intercalations of three additional days every second cycle, or four years.
The earlier practice of intercalations is not easily recoverable). In the
rituals of the Nma the time-circuit is concretized as a hydro-cycle: a
division and recombination of the waters. The three stages of this recurrent
transmutation are, 1) the undivided waters (oceanic), 2) cloud-building
(evaporation), and 3) down-pour (precipitation, river-flow). These are
associated with the great sea-beast (Mur Mur), the lurker of steaming swamps
(Oddubb), and that which hunts amongst the raging storms (Katak). The cycle
is closed by a return to the abysmal waters, intrinsically linking the
order of time, and its recurrence, to an ultimate cataclysm (prior to any
opposition of cyclic and apocalyptic time). It is in this context that
the transcultural deluge-mythos can be restored to its aboriginal sense
(which also corresponds to the Hindu Trimurti, with its three stages of
creation, preservation and destruction). 4. The Numogram Zones The Zones.
Zone Zero. Zone One. Zone Two. Zone Three. Zone Four. Zone Five. Zone Six.
Zone Seven. Zone Eight. Zone Nine. Ccru is committed to an ongoing research
program into the numeracy of the 'lost lemurian polyculture' apparently
terminated by the KT missile of BCE 65 000 000. During the last century,
various aspects of this primordially ancient 'digital hyperstition,' 'mechanomics,'
'schizonumerics,' or 'numbo-jumbo' have been painstakingly re-assembled
through certain cryptic investigations, pre-eminently those associated
with the names Echidna Stillwell, Chaim Horovitz, and Daniel Barker. From
the Mu-Archive in Tibet Horovitz unearths an 'ultimate decimal qabbala'
oriented to the cultic exploration of the numerals zero-to-nine as cosmic
zones. In contradistinction to the late-Babylonian (or Judeo-Christian)
qabbala, the 'method of Mu' involves a rigorous collapse of transcendent
symbolism into intrinsic or immanent features, excavating the latent consistency
between the numerical figures, their arithmetic functions, and their cultural
associations. Horovitz describes these procedures as a diagonal path between
esoteric numerology and exoteric mathematics, and also defines them negatively
as a 'non-numerology' or 'ulterior-arithmetic.' Atlanto-Babylonian State-societies
preserved some of the most fully degraded late-Muvian conclusions, but
only by assimilating them to a 'Gnostic Arithmetic,' fossilizing the numbers
into spiritual beings, ideal individuals, and general concepts. Within
these familiar traditions the sense of the numbers as raw functions of
cosmic distribution was systematically subordinated to magical and religious
principles, whilst their intensive potentials as transmutational triggers
were drained off into geometrical structures and logical representations.
The productive synthesis of Stillwell's numogrammatic researches with Barker's
'tic-systemic' approach provides the requisite cutting-tools for re-opening
the virtual-numeric labyrinth. This involves the re-activation of those
'lemurian' cultural practices which traffick with numbers as techno-sorcerous
entities: the diagrammatic tokens, intensive thresholds, cosmic coincidences
and hyperstitional influences that populate the plane of Unlife. Ccru has
collated material from a series of occultural investigations that demonstrate
the virtual existence of a lost lemurian art of interplanetary communication,
or ‘planetwork.' This system maps the major bodies of the Solar-system
onto the ten digital labyrinth Zones (beginning from Sol = 0). The numerals
one to nine function as astronomical ordinals, designating the terms of
the planetary sequence in ascending order of orbital span (mean distance
from the sun), orbital period (local year length), and gravitational attenuation
(einsteinean spatial flatness). This heliocentrism (with its implicit repudiation
of terrestrial phenomenology) does not contradict the broad counter-solar
trend in lemurian culture, with its repulsion of centralization and gravitational
capture. There has never been a lemurian solar cult. Lemurian Planetwork
communicates with the substellar bodies as distributed hyper-intelligences
exerting singular influences (or ‘Barker-traces'). These planetary forces
of suggestion are propagated through contemporary mythologies, systematic
coincidences, and accidental scientific fictions (whether lunar seas, martian
canals, jovian monoliths, or life on Europa). Various cryptic records indicate
the existence of considerable calendrical lore based upon the Planetwork
system, yet little of this has been definitively reconstructed. What is
certain is that it takes the mercurian year for its basic unit, and uses
this regular beat in the calendrical interweaving of (nonmetric) speeds
and slownesses. The Zone-Sequence -by mercurian periods- with planetary
attributions:
Zn-0 [0000.00] Sun
Zn-1 [0001.00] Mercury
Zn-2 [0002.55] Venus
Zn-3 [0004.15] Earth
Zn-4 [0007.95] Mars
Zn-5 [0049.24] Jupiter
Zn-6 [0122.32] Saturn
Zn-7 [0348.78] Uranus
Zn-8 [0684.27] Neptune
Zn-9 [1028.48] Pluto
Many tales tell of a lemurian hyperstition composed of numbers that
function as interconnected zones, zone-fragments, and particles. With Stillwell's
epoch-switching discovery of the Numogram - and subsequent mapping of this
'digital labyrinth' - it became possible to compile cartographies of these
zones, in which numbers distribute themselves throughout tropics, clusters,
and regions. The zones thus function as diagrammatic components of flat
cosmic maps (variously charting systems of coincidence, nebular circulations,
spinal nestings, and the folds of inner/outer time). Amongst numerous systematizations
of occult cartography that of Chaim Horovitz (direct descendant of the
infamous 'mad rabbi of Kiev') is especially remarkable. Based upon lemurian
digital relics extracted from the Mu-Archive, it enables the conversion
of numogram-zones (and sub-zones) into cascade-phases, accessed through
numerical 'doors.' The Horovitzean phases constitute qabbalistic groupings
or cross-sections of the pandemonium population (simultaneously numbering
the impulse-entities and defining their collective 'tone'). Those critics
who seek to reduce Horovitz's work to an 'immensely indirect rediscovery
of Pascal's triangle' fail to appreciate either the true antiquity of 'Pascal's'
system or the machinic novelty of its Horovitzean reanimation. Systematic
issues concerning the Numogram Gates have been separated out from the other
interconnective features of the zones. It has been known since the dawn
of occult cartography that every Zone supports a Gate, and that their corresponding
channels spin the web of esoteric fibres. All sorcerous cultures involve
themselves in the exploration of these paths. A Sarkonian mesh-tag is provided
for each zone as a key to Axsys-format and Crypt-compatibility. The modern
figure zero (0) is a closed circle, oval, or ellipse (sometimes differentiated
from the letter ‘O' by a diagonal slash). Its archaic Hindu form was the
‘bindu' (or dot, retained in the modern system as the ‘decimal point').
Both of these ciphers are of such abstraction that no rapid summary can
be other than misleading. The figure ‘0' designates the number zero, anterior
to the distinction odd/even, and also to the determination of primes (zeroth
prime = 1). Zero is the only natural number that is indivisible by one.
The division of any number by zero produces an infinity (multiplication
of any number by zero = 0), in this respect zero treats itself as any other
number. Zero digitally cumulates to zero. Numeric Keypad direction: anomalous.
As an arithmetical function zero is strongly affined to place value - or
‘positional' - systems in which it typically operates as the designator
of empty magnitudes. The modern decimal and binary systems are the most
familiar examples of such modular numeracies. (The widespread assumption
that such a zero-function is indispensable to any possible place-value
numeracy is, however, a fallacious one). On the number line zero marks
the transition from negative to positive numbers. In modern binary code
zero is instantiated by electronic ‘off' (see One). In set theory zero
corresponds to the null (or empty) set. In coordinate geometry zero marks
the ‘origin' or intersection point of all dimensional axes, and is marked
uniquely across all dimensions. In game theory a zero-sum game is one in
which all gains and losses are mere redistributions (‘I win, you lose'
or inversely). In Boolean algebra zero symbolizes logical negation. Absolute
zero (or zero-degrees Kelvin) marks the cryonic limit of physically attainable
temperature. In schizoanalysis ‘zero-intensity' designates the planomenon,
plane of consistency, or body without organs (immanent to all intensities).
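One parenthetical claim above - that a zero-function is not indispensable to place-value numeracy - can be made concrete with so-called bijective decimal notation, which keeps positional weighting but uses the digits 1 to 10 (written here with 'A' for ten) and no zero at all. The following sketch is illustrative only and not from any source text:

DIGITS = "123456789A"    # bijective decimal: digits run 1..10, no zero

def to_bijective(n):
    """Positive integer -> zeroless place-value string."""
    out = []
    while n > 0:
        n, r = divmod(n, 10)
        if r == 0:       # no zero digit available: borrow one from the next place
            r, n = 10, n - 1
        out.append(DIGITS[r - 1])
    return "".join(reversed(out))

def from_bijective(s):
    value = 0
    for ch in s:
        value = value * 10 + DIGITS.index(ch) + 1
    return value

print(to_bijective(10), to_bijective(20), to_bijective(100))    # A 1A 9A
assert all(from_bijective(to_bijective(n)) == n for n in range(1, 2000))

Every positive integer gets exactly one such representation, so the empty-magnitude function of zero, however convenient, is not structurally required by positional numeracy.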
With no other number does arithmetical function cross so imperceptibly
into religious and philosophical abstraction. There is a mathematico-cosmic
continuum connecting the numeral zero to concepts of nullity, nihility,
nothingness, absence, emptiness, void (Hindu ‘Sunya'), vacuum, and neutrality.
Zero also has an initiatory sense, as indicated by the ‘year zero' of new
beginnings, or cycles (as in the case of the Y2K 00-date, and in that of
the zero-strings marking accumulations in the Hindu ‘Yugas'). A similar
function is that of ‘time-zero,' or ‘zero-hour' (synchronizing a distributed
operation, particularly one of a military nature). The Sun (Sol-0) is by
far the largest body in the Solar-system. It spins on its own axis at different
speeds (with a period of rotation varying from about 36 Earth-days at the
poles, to 25.4 at the equator). The sun-spot cycle - driven by periodic
reversals in the solar magnetoprocess - lasts for approximately twenty-two
years. The Sun is approximately 4.5 billion years old, roughly half-way
through its normal existence (or main-sequence of hydrogen-to-helium nucleosynthesis).
After the completion of this phase it is destined to expand into a Red
Giant (consuming the inner Solar-system). Its current temperature varies
enormously according to depth, from 5 800 K at the surface to 15 600 000 K
at the core. The pressure at the Sun's core is equivalent to 250
billion (terrestrial) atmospheres. The preponderance of the Sun's mass
within the Solar-system is such that the orbits of the planets (notably
Jupiter) only produce minor perturbations in its behavior. Solar radiation
sustains all photosynthetic activity on Earth, and thus all plant-based
terrestrial life. Its wide range of complex and ambivalent effects extend
from regulating circadian biorhythms to triggering skin-cancers. The Sun's
magnetic field (the heliosphere) is immensely powerful by planetary standards,
extending beyond the edge of the outer Solar-system (as defined by the
orbit of Pluto). Other Solar-influences - in addition to gravitational
and electromagnetic forces - include the solar-wind (which also extends,
on a declining gradient, beyond the edge of the solar-system). The two
predominant aspects of the Earth's mechanical relation to the Sun - the
day and the year - have been the basis of traditional human time-keeping.
The earliest known clocks were sun-dials. Sun-worship is extremely prevalent
within human religious history. The apparent rotation of the Sun around
the zodiac is the keystone of exoteric astrology (allocating Sun-signs).
Sunday is dedicated to Sol. Zone-0 is the first of two zones mutually composing
the Plex-region of the Numogram. Its Syzygetic-twin is Zone-9. This 9+0
Syzygy is carried by the demon Uttunul (see Zone-9). Zone-0 provides the
terminus for a single Plex-channel (the 0th ). Systematic consistency suggests
that Zone-0 envelops the Zeroth-Phase of Pandemonium, but as this includes
nothing beyond itself it constitutes a nominal or virtual multitude and
an "absolute abstraction." Zone-0 has no separable power of initiation,
and since it does not support imps (or impulse-entities) - even of the
first degree - there is no zeroth door. The Zeroth Gate (Gt-00) seems to
connect Zone-0 to itself, but its nature is peculiarly problematical, and
within the Mu-Archive texts its ultimate reality is fundamentally disputed.
Many versions of the Numogram delete it entirely. Horovitz says of this
Gate that "between its existence and nonexistence there is no difference."
Mu Tantrism plots Zone-0 intensities onto the Coccygeal level of the spine,
the vestigial remnant of a lost tail (and biotectonic link to the ancient
lemur-people). Zone-0 is allotted the Sarkonian Mesh-Tag 0000. -----------
The figure one (1) - elaborated from a simple vertical stroke - is at least
semi-ideographic as a relic tally-mark (basically identical in this respect
to the Roman numeral ‘I'). This figure has obvious phallic resonance (especially
in contrast to the sign for zero (0)). Its relation to the figure seven
(7) is supported by numerological analyses (since seven cumulated (28)
reduces to one). The figure ‘1' designates the number one, first odd number
(with odditude of aleph-null), and the zeroth prime (first prime = 2).
One is the lowest cardinal number, and the first ordinal. One digitally
cumulates to one. Numeric Keypad direction: South-West. In modulus-2 systems
the numeral one bears all (non-zero) values (corresponding to powers of
two). Binary informatic systems code electronic 'on' as 'one.' The number
one is exceptionally multivalent. It has two basic cardinal values - both
deriving from its status as the smallest, basic, or irreducible factor
defining the natural number series - that of the elementary, the atom,
the unit or module - 'one alone' - and also that of the whole, the complete,
unity as totality, the universe. Its ordinal value as first, primary, principal,
or initial is fractured by the ordinal function of zero, but retains much
of its ancient dignity as the beginning of the counting series. In addition
one bears a diversity of quasinumerical and logical associations, including
self-identity ('oneself,' 'one and the same'), nondifferentiation, uniqueness
('one of a kind'), logical universality, uniformity, and - at a further
remove, or more problematically - singularity (anomaly, exception), and
the unilateral ('one-sided,' unbalanced, disequilibriated). One also has
a complicated syntactical-linguistic usage that interlinks with its numerical
and logical functions. In particular it operates as a carrier of nominal
and indefinite reference ('the one that,' 'someone or anyone,' 'once upon
a time'), which extends also to relation ('one another'). Within monotheistic
cultures One attains a supreme dignity, identifying God directly with 'the
One' (or 'the Old One'). In this context one is bound to the 'I am that
I am' of YHVH, and to the absolute concentration of religion within the
assertion that 'there is no God but God.' H P Lovecraft upsets this exclusive
and definitive sense of the One by reintroducing the plural and multiple,
whether grammatically as in the case of 'the Old Ones,' or thematically,
as in that of Yog Sothoth, who is described as the 'all in one, and one
in all.' Mercury is the innermost planet of the Solar-system (with a mean
orbital distance from the sun of approx. 58 000 000 km). The mercurian
year (approx. 88 Earth-days in length) is also the swiftest, which accounts
for its use as the base calculative unit in planetwork calendrics. Due
to its long day (approx. 58.7 Earth-days in length) Mercury has semi-permanent
light and dark sides (with average temperatures of +430 and -180 degrees
celsius respectively). Mercury has a weak magnetic field (approx 1% the
strength of Earth's). In Roman mythology Mercury (a latinization of the
greek Hermes) is known as the messenger of the gods, associated with communication
and trade. The element Mercury (or 'quicksilver,' symbol Hg) has particular
alchemical importance, shared in Indian yogic traditions (where it is ritually
ingested to produce an anorganic cosmic body). In Lemurian Planetwork Mercury
is astrozygonomously paired with Neptune. Zone-1 is the first of the six
Torque-region Zones of the Numogram, and Tractor-Zone of the 5-4 (or 'Sink')
Current. Its Syzygetic-twin is Zone-8. This 8+1 Syzygy is carried by the
demon Murmur (see Zone-8). Zone-1 provides the terminus for three Torque-channels
(the 1st, 4th, and 7th). Zone-1 both initiates and envelops the First-Phase
of Pandemonium (including 2 impulse-entities). This phase consists of nothing
beyond the Zone (1) and the Door (1::0), thus tending to a highly 'idealized'
state. Zone-1 has a particularly powerful and manifest initiatory dimension.
The First Door - or 'Door of Doors'- is attributed by Muvian sorcery to
the amphidemon (and imp of the first degree) Lurgo (1::0) 'the Initiator,'
and widely related to Legba (the first and last Loa to be invoked in any
vudu ceremony). The First Gate (Gt-01) connects Zone-1 to itself, and its
corresponding channel provides a reduced microcosmic model of the Torque
as a whole, in which Zone-1 provides both beginning and end. In this respect
Horovitz describes Zone-1 as 'turning forever into itself.' The resulting metastability
of this channel accounts for its strong associations with all known variants
of the Bubble-Pod mythos. Mu Tantrism plots Zone-1 intensities onto the
Dorsal (or Thoracic) level of the spine, which maps onto the domain of
lunged creatures (and colonization of the land). Zone-1 is allotted the
Sarkonian Mesh-Tag 0001 (matching the primordial click of Tzikvik cipher-shamanism).
---------------- The figure two (2) is quasisymmetric with the figure five
(5). This pairing is echoed in the alphabet by the letters ‘Z' and ‘S'
(whose shared consistency across case and phonetic coherence has been taken
by figural grammarians as indicative of a zygophidian - or ‘forked-tongue'
- cultural source). The figure ‘2' designates the number two, the first
and definitive even number, and the first prime (second prime = 3). The
encounter with the irrationality of the square-root of two has special
importance in the disturbance of Hellenic (‘rationalistic') arithmetic.
It is rumoured that the Pythagoreans resorted to assassination in their
attempt to suppress this discovery. Two digitally cumulates to three. Numeric
Keypad direction: South. The mechanical importance of bi-stable (on/off)
micro-states within contemporary electronic data-systems has resulted in
a vast and diffuse cultural investment in modulus-2 numeracy (pure place-value
semiotics). ‘Digital' and ‘binary-coded' have now become almost synonymous
in their colloquial usage. Perhaps the supreme exemplar of a binary-numeric
system is that of the ancient Chinese I Ching (or ‘book of changes'), which
involves both binary numeracy (of broken and unbroken lines) and double-numbering
(of numeric hexagrams tagged by a series of ordinal numbers). It is Leibniz'
study of this text which elaborates the first Western example of modern
binary arithmetic. The syzygetic (or zygonomous) power of two is productive
of an entire series of subtly differentiated binary concepts, which include:
coupling, twinning, doubling, polarity, schism, contrast, balance, opposition,
and reflection. Binarity is multiply ambivalent. It conspires with both
the certainties of analytical reason in general, by way of two-value logics
(governed by the principle of the ‘excluded middle'), and also the uncertainties
of dialogue, or ‘two-way' communication. It is associated - equally or
unequally - with both justice (even-handedness, seeing both sides of a
‘dilemma'), and deceit (two-faced, two-timing, double-dealing ...). Duality
is particularly widespread within biological order, from the ‘base-pairs'
of (RNA and) DNA code, through the binary fission of bacterial propagation,
the (binary) sexual difference of meiotic reproduction, to the bilateral
symmetry of the typical vertebrate organism with consequent pairing of
limbs (arms, legs), sense-organs (eyes, ears), lungs, brain-hemispheres,
etc. ‘Dual-organization' provides a basic model for primordial human kinship
structure. Many aspects of binarity are prominent within religious systems,
whether gods with two heads or faces (such as the Roman Janus, and the
Deleuze-Guattari gods of the State), twin gods (the Dogon Nommo, or the
Zoroastrian couple Ahriman/Ormuzd), divine couples (god-goddess pairings
being widespread throughout many religions), and twice-born gods (both
Zeus and Dionysos amongst the Greek pantheon, for instance). Hindu culture
describes Brahmins as ‘twice-born.' Venus (or Sol-2) has a mean orbital
distance from sun of 108.2 million km. The Venusian year is approx. 224.4
Earth days in length. Since the rotation of Venus is very slow (and also
retrograde) a Venusian day (lasting 243 Earth-days) is longer than its
year. In recent times Venus has become the exemplary victim of a ‘runaway
greenhouse effect' which has rendered it infernal (with a uniform surface
temperature of +462 degrees celsius). Venus has no magnetic field. Venus
has been historically identified by two different names, known as the morning
star (Phosphorous or Lucifer) when seen in the East at sunrise, and the
evening star (Hesperus) when seen in the West at sunset. The Roman goddess
Venus (a latinization of the greek Aphrodite) was the deity associated
with female beauty and love (accounting in part, perhaps, for Burroughs'
hatred of Venusians). In Lemurian Planetwork Venus is astrozygonomously
paired with Uranus. Zone-2 is the second of the six Torque-region Zones
of the Numogram. Its Syzygetic-twin is Zone-7. This 7+2 Syzygy is carried
by the demon Oddubb (see Zone-7). Zone-2 both initiates and envelops the
Second-Phase of Pandemonium (including 4 impulse-entities). With cryptic
rigor Horovitz thus describes Zone-2 as "reduplicating its double-twinness
through its multitude." As initiator it functions as the Second Door, invoked
by K-goth cults as the "Main Lo-Way" into the Crypt. Muvian sorcery identifies
this door with the amphidemon (and imp of the first degree) Duoddod (2::0).
The Second Gate (Gt-3) connects Zone-2 to Zone-3, and its corresponding
channel draws an intense line of escape from the Torque to the Warp. This
passage is especially compelling, since it is multiply consolidated by
cumulation, prime-ordination, and mesh-tagging. Tzikvik shamanism both
honours and fears the Second Gate as the opening to the "way of the Storm-Worm."
Zone-2 is allotted the Sarkonian Mesh-Tag 0003. [M#01] Phase-2::0 Duoddod
[M#02] Phase-2::1 Doogu ============== The figure three (3) is semi-iconic
(incorporating a stack of three horizontal strokes). It is quasisymmetric
with the (upper-case) letter ‘E,' and partially echoed in the figure ‘8'
(designating the third power of two). Figural grammarians consider it to
involve a progression of compressive folding beyond ‘1' and ‘2.' The figure
‘3' designates the number three, the second odd number (with odditude of
1), and second prime (third prime = 5). Three is the square-root of nine
(relating it intimately to Barkerian arithmetic and Zygonovism). Three
digitally cumulates to six. Three is itself the sum of the three preceding
natural numbers (0 + 1 + 2 = 3), demonstrating a unique affinity with numerical
triangularity. Numeric Keypad direction: South-East. A peculiarly obsessive
triadic numeracy is evidenced in the vulgar (‘zygotriadic') calendar of
the Mu Nma. The number three is unique for both the intensity and diversity
of its cross-cutting hyperstitious investments. It is associated on the
right hand with numerological completeness and transcendence, and on the
left hand with the middle, the between, and the diagonal line. Prevalent
triplicities include (amongst many others) the three dimensions of manifest
time (past, present, future) and space (height, length, depth), the triad
game (paper, scissors, stone), the Atlantean Tridentity (Nunnil-Ixor, Domu-Loguhn,
Hummpa-Taddum), the Hindu trimurti (Brahma, Vishnu, Shiva) and gunas (rajas,
tamas, and sattva), the alchemical elements (salt, sulphur, and mercury),
the Christian trinity (Father, Son, Holy ghost), the stages of formalized
dialectic (thesis, antithesis, synthesis), the oedipal triangle (daddy,
mummy, me), and the three virtuous monkeys (blind, deaf, and mute to evil).
History exhibits strong tendencies towards a triadic order of the world,
both in the realm of mythology (heaven, hell, limbo), and in that of geopolitics
(first-, second-, and third-world). The extraordinary numinousness of the
number three is also indicated by ethnomes such as tribalism, tributaries,
trickery, and trials, the three body problem, three wishes, three fates,
three graces, the third-eye, and the arch-magician (Hermes) Trismegistus.
Atlantean sources relate the cultural dominance of the number three to
the fact that Alpha Centauri is a triple system. Earth (or Sol-3) has a
mean orbital distance from the sun of approx. 149 600 000 km, defining
the standard Astronomical Unit (AU). Its orbital period (of approx. 365.2422
Earth days) and rotational period (approx. 24 hours) are used as the basis
of terrestrial calendrics (along with the period of its satellitic - lunar
- orbit), and traditionally for time-keeping (now supplanted by atomic
clocks). The Earth has one moon - Luna - of abnormal size relative to that
of the planet, and exercising considerable influence, principally through
tidal forces. Lunar influences - such as that evident in the human ovulatory
cycle - have consolidated deep cultural associations between the moon,
oceans, women, blood, sorcery, and madness (lunacy). The Earth is the densest
major body in the Solar-system. It is polarized by a moderate magnetic
field which reverses intermittently (once or twice every million years).
By the end of the second millennium of the Common Era the Earth was still
the only known source of life in the Universe. Prior to the Copernican
revolution (in the C16th) the Earth was considered to be the centre of
the Solar-system - and even of the universe - by the dominant cultures
of mankind (an orthodoxy ruthlessly defended by the Christian Church among
others). Alone amongst the Planets, the Earth is not named after a Greek
or Roman deity. The name ‘Earth' is of Anglogermanic origin. (The Greek
goddess Gaia is increasingly evoked as the name for Earth conceived as
a living macro-entity, provoked in part by systemic - or ‘ecospheric' -
changes in climate, atmosphere, and biodynamics). In Lemurian Planetwork
the Earth is astrozygonomously paired with Saturn. Zone-3 is the first
of the two Warp-region Zones of the Numogram, and Tractor-Zone of the 6-3
(or 'Warp') Current. Its Syzygetic-twin is Zone-6. This 6+3 Syzygy is carried
by the demon Djynxx (see Zone-6). Zone-3 provides the terminus for two
channels, one each from the Torque (the 2nd), and the Warp (the 6th). Zone-3
both initiates and envelops the Third-Phase of Pandemonium (including 8
impulse-entities). In the first of these aspects it functions as the Third
Door, which opens onto the Swirl, and is attributed by Muvian sorcery to
the chaotic xenodemon (and imp of the first degree) Ixix (3::0). The Third
Gate (Gt-6) twists Zone-3 through Zone-6, with its corresponding channel
vortically complementing that of the Sixth Gate (Gt-21), and also the Warp-Current
itself, thus adding an increment of spin to the entire region. Horovitz
invests Zone-3 with a particular potency of intrinsic coincidence, since
its second cumular power (6) is also the number of its Syzygetic double
(through which he accounts for the compact tension of the Warp system).
Mu Tantrism plots Warp-region intensities onto the plane of the third-eye.
Zone-3 is allotted the Sarkonian Mesh-Tag 0007. [M#-03] Phase-3::0 Ixix
[M#-04] Phase-3::1 Ixigool [M#-05] Phase-3::2 Ixidod There are two basic
versions of the figure four (4), one ‘open' and the other closed into a triangle.
The former design is echoed in the symbol for the planet Jupiter. It is
the latter (instantiated here) that figurally relates four to the sign
for delta (fourth letter of the Greek alphabet), and accounts for the fact
that in certain hacker numerolects it is substituted for the (upper-case)
letter ‘A.' The figure ‘4' designates the number four, the second even
number, and first non-prime (or complex) natural number, with prime factors
of 2 x 2. (The fourth prime = 7). The triangular summation - or digital
cumulation - of four equals ten (numerologically identified with a superior
power of unity, classically conceived as the pythagorean Tetrakys). The
pre-eminences of four - as ‘first' non-prime and ‘first' square - are formally
or germinally anticipated by unity. Four digitally cumulates to ten (see
above). Numeric Keypad direction: West. Due to the internal redundancy
of its dual symmetry (2 x 2 = 2 + 2 = 4), four is commonly conceived as
the model outcome of calculation - as indicated by the phrase ‘putting
two and two together.' The dominant associations of the number four are
balance and stability, exemplified by the ‘four-square' - or solidary -
structure of four walls, wheels, or quadrupedal support, as well as by
the ‘four-four beats' of rigidly metric dance-music. It is this sense of
quadrature that predominates in the four elements (earth, air, water, fire),
the four cardinal directions (north, south, east, and west), and the four
DNA bases (adenine, cytosine, guanine, and thymine). A similar fourfold
typology is expressed by the four suits of the playing-card pack (clubs,
diamonds, hearts, spades). Four is also associated with temporal stability
- or cyclic regeneration - , as evidenced by the four seasons (Spring,
Summer, Autumn, Winter), four classical ages (those of gold, silver, bronze,
and lead), and in Hindu culture, far more intricately, by the four Yugas
(those of Krita, Treta, Dvapara, and Kali). The system of the Yugas is
a fully elaborated quadro-decimal system (highly suggestive in relation
to the Tetrakys). Within the Judaeo-Christian tradition the number four
is invested with extraordinary significance, from the four letters of the
Tetragrammaton, through the four gospels, to the four great ‘Zoas' and
four horsemen of apocalypse. The biblical time - of both old and new testaments
- places particular importance on the period of forty days (e.g. the duration
of the flood, and of Jesus' temptation in the desert). This privileging
of quadrate order - as the ground-plan of the temple - is also instantiated
by the masonic ‘square.' The number four is also of special importance
to Buddhism, as exemplified by the ‘four noble truths' of its basic doctrine,
and by the typical (quadrate) design of the mandala. On the flip-side the
number four is connected with excess (the fourth dimension), anomaly (the
four-leafed clover), and vulgarity (four-letter words). Mars (or Sol-4)
has a mean orbital distance from the sun of approx. 228 000 000 km. The
Martian year is roughly twice as long as that of Earth, and its day about
30 minutes longer. Mars has two moons, Phobos and Deimos. The surface of
Mars is swept by vast dust-storms that occasionally envelop the whole planet
for months. In popular legend Mars has long been envisaged as the home
of intelligent alien life. Recent examples include the ‘canals' discovered
by Percival Lowell, the fictions of H.G. Wells and Edgar Rice Burroughs,
and the Cydonia ‘face' (based on images from the 1976 Viking missions).
Mars is widely seen as a plausible candidate for human colonization. It
has also become notorious for cursed space-missions. In August 1996 scientists
announced the discovery of Martian nanoworms in an ancient meteorite (cat.
ALH84001). In Roman mythology Mars (a latinization of the Greek Ares) is
the god of war, and father of Romulus (legendary founder of Rome). Mars
is commemorated by the month of March. In Lemurian Planetwork Mars is astrozygonomously
paired with Jupiter. Zone-4 is the third of the six Torque-region Zones of the
Numogram. Its Syzygetic-twin is Zone-5. The 5+4 Syzygy is carried by the
demon Katak (see Zone-5). Zone-4 both initiates and envelops the Fourth-Phase
of Pandemonium (including 16 impulse-entities). This equation of phase-population
with the square of the zone-number establishes an exceptional solidarity
between the two, although this rigidity has as its flip-side a tendency
to cataclysmic instability. In its initiatory aspect Zone-4 functions as
the Fourth Door (or 'Time-Delta,' familiar from variations of the Kurtz-mythos
as 'the worst place in the world'). Muvian sorcery attributes this door
to the amphidemon (and imp of the first degree) Krako (4::0). The Fourth
Gate (Gt-10) feeds Zone-4 forward to Zone-1. Its ancient (proto-Atlantean)
name the 'Gate of Submergence' hints at its interlocking associations with
completion, catastrophe, subsidence, and decadence. The Channel corresponding
to the Fourth Gate is one of three concluding in Zone-1, and the only pro-cyclic
channel within the Torque. Its course reinforces the 5-4 (or 'Sink') Current
in its rush towards termination, and augments the weight of destiny (it
was under the influence of this line that Cecil Curtis departed upon his
fatal journey into the land of the Tak Nma). Zone-4 is allotted the Sarkonian
Mesh-Tag 0015. [M#-06] Phase-4::0 Krako [M#-07] Phase-4::1 Sukugool [M#-08]
Phase-4::2 Skoodu [M#-09] Phase-4::3 Skarkix WRITING MACHINES by Mark Fisher
"Cyberpunk torches fiction in intensity, patched-up out of cash-flux mangled
heteroglossic jargons, and set in a future so close it connects: jungled
by hypertrophic commercialization, socio-political heat-death, cultural
hybridity, feminization, programmable information systems, hypercrime,
neural interfacing, artificial space and intelligence, memory trading,
personality transplants, body-modifications, soft- and wetware viruses,
nonlinear dynamic processes, molecular engineering, drugs, guns, schizophrenia."
No-one is quite sure what they are: Nick Land, Stephen Metcalf, Sadie Plant.
Part theory, part fiction, nothing human, constructs so smoothly assembled
you can't see the joins. They don't write text; they cook up intensities.
They don't theorise; they secrete, datableed. What we used to call cyberpunk
is a convergence: a crossover point not only for fiction and theory, but
for everything that either doesn't know its place or is in the process
of escaping it. Whatever is emerging where authority is getting lost and
middle men are being made redundant. Anything interesting was always like
that. Metalhead Michel Foucault was never easy to place. They asked him
if he had ever wanted to write fiction. He said he'd never done anything
else. So more than a fusion of fiction and theory, it's all about cross
fertilizing the most intense elements of both in monstrous nuptials against
nature. Synthetix. "The present writing would not be a book; for there
is no book that is not the ideal of the immobilised organic body. These
would be only diverse pieces, each piece of variable format and belonging
to its own time with which it begins and ends ... Not a book, only libidinal
instalments." 1974: delirial Jean Francois-Lyotard melts the still glowing-hot
shards of post 68 street revolutionary intensity together with Bataille,
cybernetics and anti-socialised Marx to produce the pre-punk, non-organic,
inhuman assemblage he calls Libidinal Economy. With Deleuze and Guattari's
Anti-Oedipus and Luce Irigaray's Speculum: Of the Other Woman it's part
of an irruption of rogue materialism into the French academy that is as
far from the dreary, idealist textocracy of Parisian post-structuralism
as it is from the dry-as-chalkdust dreariness of Oxbridge common sense.
What is refused, in the name of incandescence, is the neutralizing, disintensifying,
distanced tone de rigueur in academic prose. The aim, as Deleuze and Guattari
put it in Anti-Oedipus, is to accelerate the process. All of this consummated
in the migration of intelligence out of the university (if indeed intelligence
ever was in the university), something that, two decades on, the technical
machines will help to facilitate. "The academy loses its control over intelligence
once it is possible to even imagine a situation in which information can
be accessed from nets which care for neither old boy status nor exam results."
The university in flames. "Dozens of different argots are now in common
currency; most people speak at least three or four separate languages,
and a verbal relativity exists as important as any of time and space. To
use the stylistic conventions of the traditional oral novel - the sequential
narrative, characters 'in the round', consecutive events, balloons of dialogue
attached to 'he said' and 'she said' - is to perpetuate a set of conventions
ideally suited to the great tales of adventure in the Conradian mode, or
an overformalized Jamesian society, but now valuable for little more than
the bedtime story and the fable. To use these conventions to describe events
in the present decade is to write a kind of historical novel in reverse..." 1964.
Writing in the pages of the SF magazine New Worlds, J. G. Ballard celebrates
the multiplicitous, impure junk languages of William Burroughs. Ballard
wheels away the decorous scenery of the literary novel to reveal the atrocity
exhibition of the late twentieth century as it emerges in Burroughs' carnivalesque
prose: "swamps and garbage heaps, alligators crawling around in broken
bottles and tin cans, neon arabesques of motels..." Burroughs has already
intravenously pumped pulp fictional vernacular into the hi-cultural zone
of Joyce-Eliot experimentalism, fatally contaminating it. Ballard's own
condensed novels are in preparation. Cyberpunk fiction lies in wait; assembling
itself out of machinic convergence, it is a direct but unanticipated consequence
of the intersection of the PC, TV and the telephone. Invading clean white
Kalifornia dreams with nightmares from the machinic unconscious, William
Gibson and Pat Cadigan populate cyberspace with nonorganic gothic avatars
and voodoo entities. The bourgeois novel in flames. The near future. (But
it's already happening) "Twisted trading systems have turned the net into
a jungle, pulsing with digital diseases, malfunctioning defence packages,
commercial predators, headhunters, loa and escaped AIs hiding from Asimov
security."Dead hippies create cyberspace, but what comes together is the
jungle: Cubase materialism smearing white economies with black marketization.
Illicit distribution networks, rogue retail, faceless bacterial commerce.
Silicon valley in flames. And it's not over yet. In the intense heat of
the cyberjungle, where distribution is too quick and imperceptible for
copyright lawyers to keep up, the authorised text is decomposing; a process
accelerated by the technical machines. Hypertext is in part an answer to
Deleuze and Guattari's inquiry in A Thousand Plateaus: "A book composed
of chapters has culmination and termination points. What takes place in
a book composed instead of plateaus that compose with one another across
microfissures, as in a brain?" Marshall McLuhan had already seen this happening
in 1964, when, in Understanding Media, he announced the end of print culture
and its associated linear thought patterns. The Gutenberg Galaxy in flames.
The death of the author is an entirely technical matter, not at all a metaphor.
The cool, efficient decommissioning of the author-function in music shows
the way. Remixes displace (fixed, finalised) texts; DJs, producers and
engineers replace authors. What succeeds all this is the version, in the
sense Jamaican reggae culture gave to the term. Unofficial, potentially
infinite, illegitimate: there's no such thing as an authorised version.
Authority is already dead, persisting in undead form only in those places
(the academy for instance) so atrophied and decayed they can't tell the
difference between what's alive and what isn't. Everything you'd want
to be involved in is happening without permission. No law in the jungle.
"The state's pre-arrangement of overlaid bridges, junctions, pathways and
trade routes trajectorize the scorching advance as it impacts upon the
hapless head of the social. Detonation of nuclear arsenals of the world
merely pushes the nomads underground: shedding their skins in reptilian
camouflage, vanishing without a forensic trace in ambient recession into
the underground..." Things sometimes converge in the most unpropitious locations.
Coventry, for example. The Cybernetic Culture Research Unit processes cybernetics
and culture together, apprehending culture cybernetically and cybernetics
culturally. The impetus is not so much inter- as anti-disciplinary, the
concrete problem being the freeing up of thought as synaptic-connectivity
from its prison as subject-bound logos. Following flows where they want
to go leads not into random noise but out onto what Deleuze and Guattari
call the plane of consistency. "If we consider the plane of consistency,
we notice that the most disparate things and signs move upon it: a semiotic
fragment rubs shoulders with a chemical interaction, an electron crashes
into a language, a black hole captures a genetic message... There is no
'like' here, we are not saying 'like an electron,' 'like an interaction',
etc. The plane of consistency is the abolition of metaphor; all that consists
is Real." The CCRU is part-populated by names you don't know yet, but are
bound to soon - moving as a massive, with our street-gun samplers, never
alone - a k-class swarmachine infecting White Man Face with afro-futurist
and cyber-feminist cultural viruses. "Writing becomes a process of software
engineering, making connections, and connecting with the other connectionist
systems and their connections too; 'does not totalize', but 'is an instrument
for multiplication and it also multiplies itself.'" What Pat Cadigan calls
synning: synthesizing. No more cerebral core-texts, no more closed books.
Looking instead to games or the dancefloor for inspiration. Attempting
to produce something that will match the ambitions of Lyotard 1974: "To
understand, to be intelligent, is not our overriding passion. We hope rather
to be set in motion. Consequently, our passion would sooner be the dance,
as Nietzsche wanted ... A dance ... not composed and notated but, on the
contrary, one in which the body's gesture would be, with the music, its
timbre, its pitch, intensity and duration, and with the words (dancers
are also singers), at each point in a unique relation, becoming at every
moment an emotional event..." (LE 51) Intensity conductors operating at
non-human machine speed, writing machines, machinic writing, text at sample
velocity. Text samples from:
J. G. Ballard, "Mythmaker of the Twentieth Century", reprinted in RE/search: J. G. Ballard
Gilles Deleuze and Felix Guattari, Anti-Oedipus and A Thousand Plateaus (both Athlone Press)
Luce Irigaray, Speculum: Of the Other Woman (Cornell University Press)
Nick Land, "Meltdown", unpublished
Stephen Metcalf, "Black Capital" in Collapse 2 and IOD 1
Jean-François Lyotard, Libidinal Economy (Athlone Press)
Sadie Plant, "The Virtual Complexity of Culture" in Future Natural (Routledge)
Also essential:
Nick Land, "Machinic Desire" in Textual Practice 7:3
Nick Land, "The Meat (or How to Kill Oedipus in Cyberspace)" and Sadie Plant, "The Future Looms" in Cyberspace, Cyberbodies, Cyberpunk (Sage)
Sadie Plant, "Cyberfeminist Simulations" in Cultures of the Internet
Nick Land and Sadie Plant, "Cyberpositive" and Stephen Metcalf, "The Third Terminal" in Unnatural
Stephen Metcalf, introduction to Hammer of the Gods (Creation Press)
***Collapse cyberzine
Forthcoming:
Sadie Plant, Zeroes and Ones (Fourth Estate)
Kodwo Eshun, Origin Unknown
Nick Land, Schizotechnics
CCRU director: Sadie Plant pysap@csv.warwick.ac.uk
CCRU associate members: Kodwo Eshun Beth Stryker Manuel De Landa O[rphan] D[rift] Matthew Fuller Linda Dement
Other names to watch (CCRU, Switch, Collapse): Suzanne Livingston Rob Heath Tom Epps Julie Nugent Anna Greenspan Rohit Lekhi Angus Carlyle Steve Goodman Robin Mackay
"Therefore, no bad conscience, nor the feeling of crushing responsibility,
two relations to the text that circumscribe and define the relation proper
to the White Man of the left. We deliver no message, we bear no truth,
and we do not speak for those who remain silent." (259) "What you demand
of us, theoreticians, is that we constitute ourselves as identities, and
responsible ones at that! But if we are sure of anything it is that this
operation (of exclusion) is a sham, that no-one produces incandescences
and that they belong to no-one, that they have effects but not causes." (LE
258) ========== Communique Two: Message to Maxence Grugier: 2001 1. What
are "pulp theory/fiction hybrids?" In France - the old old continent -
we don't have any kind of cultural studies and "cyber-culture" means
nothing. Can you explain your theories in newbie words... Many members
of the Ccru had fled cultural studies, disgusted by its authoritarian prejudices,
its love of ideology, and pompous desire to 'represent the other' or speak
on behalf of the oppressed. To us, it never seemed that the real articulacy
of the left academic elites was in any way superior to the modes of popular
cultural expression which were either ignored or treated as raw material
to be probed for a 'true' (ie ideological) meaning by white middle-class
intellectuals. Ccru has tried to connect and cross-intensify with peripheral
cultural processes (dark-side digital audio, cyberpunk, neolemurian sorcery,
numbo-jumbo, afro-futurism, indo-futurism, sino-futurism …). It seeks to
think, theorize, and produce with rather than 'about' (or -even worse -
'for') them. We think everything interesting happens on the periphery,
outside the standard modes of 'developed' existence. Ccru engages with
peripheral cultures not because they are 'down-trodden' or oppressed, but
because they include the most intense tendencies to social flatness, swarming,
populating the future, and contagious positive innovation, hatching the
decisive stimuli for the systematic mutation of global cybernetic culture.
Cyber-culture has come to be synonymous with Internet-studies. Ccru has
a more 'fundamentalist' commitment to cybernetics, whose abstract principles
of feedback dynamics, nonlinear causality, and machinic involvement are
linked to numerous issues concerning digital technology and telecommunications,
but in no way restricted to these. Ccru has consistently endorsed Deleuze
and Guattari's insistence that machines are irreducible to technology.
We consider cybernetics to be the practical science of excitement (amplification
/ inhibition of communication, mutation, and innovation). A Ccru list of
important influences would include Deleuze and Guattari's two Capitalism
and Schizophrenia volumes, with their 'virtual materialism,' assault upon
the privilege of representation, anti-evolutionism, and implacable hostility
to the State. Fernand Braudel's rigorous differentiation (and even opposition)
between capitalism and the market economy, with 'pro-market anti-capitalism'
functioning as a guiding slogan. William Gibson's Cyberspace trilogy, which
spreads voodoo into the digital economy, demonstrating (with the Cyberspace
Matrix) how a fictional concept makes itself real. Octavia Butler's Xenogenesis
novels, for their tentacled aliens, gene-traffic, and decoded sex. Lynn
Margulis' bacterial microbiology for outlining the world of destratified
life. H P Lovecraft's gothic obsessions with time-anomaly, sacred horror
of teeming, bubbling, foaming multiplicities … We are currently enthralled
by the work of Jacques Vallee and its extraordinarily sophisticated path
to hyperstition through the UFO-phenomenon. Ccru is working on a cybergothic
'unnon-fiction' (to steal a term from Steve Beard) which interconnects
the history of computing and AI research with UFO-phenomena (alien abduction,
false-memory, and cover-ups), secret societies, and esoteric religion,
amongst other things. Ccru is an ongoing experiment in collectivity, collective
production, anonymity, and masks, dedicated to practically dismantling
standard models of social existence, by pursuing ethics in the spinozistic
sense (experimental production of collective bodies). Ccru feeds its own
researches back into its own microcultural production. Its basic tool in
this respect is 'pulp-theory/fiction hybridity' or Hyperstition (see below).
2. What were the goals of Virtual Futures, Afro-Futures and Virotechnics?
These events sought to reinforce and energize the interrelations between
elements of theoretical research and popular culture. It was important
to us that they were characterized by a minimum of academic stuffiness,
and that contemporary sonic culture (techno and jungle) was as thoroughly
mixed into proceedings as possible. Ccru particularly encouraged polymedia
presentations, involving spoken text, audio, and video or other visuals.
Our assumption throughout was that philosophy/social theory could be exciting
and that the deadening of all visceral response to intellectual exchange
was a semi-deliberate strategy serving oppressive social interests. The
three Virtual Futures conferences were large international events, and
thus only diffusely focused. Over the years guests included Manuel Delanda,
Pat Cadigan, Stelarc, Scanner, and many others. Afro-Futures was a smaller
scale event in which members of the Ccru along with key ally Kodwo Eshun
explored the interlinkages between peripheral theory, rhythmic systems,
and Jungle/Drum&Bass audio. Virotechnics was organized outside the
academy, and was dedicated to the theme of cross-propagation between cultural
viruses and digital technologies. 3. What is the concept of the Syzygy
hyperstition matrix? Syzygy was the title of a five week 'art' show co-produced
by Ccru and Orphan Drift. The name means 'twinning' or 'twin-system,' and
this theme operated as a multilevelled guiding thread. It was during the
production of this event that Ccru made contact with the virtual Continentity
of Lemuria, which taught us many secrets that we have since attempted to
formulate as 'Digital Hyperstition.' Digital hyperstition is already widespread,
hiding within popular numerical cultures (calendars, currency systems,
sorcerous numbo-jumbo, etc.). It uses number-systems for transcultural
communication and cosmic exploration, exploiting their intrinsic tendency
to explode centralized, unified, and logically overcoded 'master narratives'
and reality models, to generate sorcerous coincidences, and to draw cosmic
maps. The Lemurian biomechanical hyperculture propagates itself through
decimal notation, whose latent interconnections are demonstrated in the
Numogram (see web-site): an occult diagram of time and practical guide
to the ethics of unbelief. An initial attempt to clarify this topic has
been made in the most recent issue of our journal Abstract Culture. According
to the tenets of Hyperstition, there is no difference in principle between
a universe, a religion, and a hoax. All involve an engineering of manifestation,
or practical fiction, that is ultimately unworthy of belief. Nothing is
true, because everything is under production. Because the future is a fiction
it has a more intense reality than either the present or the past. Ccru
uses and is used by hyperstition to colonize the future, traffic with the
virtual, and continually re-invent itself. ------------- http://www.altx.com/wordbombs/fuller.html
People Would go Crazy: hardcore methodology by Matthew Fuller In the 17th
century the word technology was used almost exclusively to talk about grammar.
At the end of the twentieth century, the machines of language and those
of electrons are so irresolvably cross-contaminated that, whilst we are
at once facing a world in which you can only tell whether you're a child
or an adult by making the transition from being infantilised by the hardwired
ontology of Disney to being infantilised by that of Microsoft, we are also
being taken by the migration of language into new, transmogrifying contexts
that shred such stable relations on contact. Flesh is becoming increasingly
protean. Those who have historically been considered morphologically dubious
share the doubled situation of both facing immense opportunity and becoming
increasingly subject to alteration and 'improvement'. Current stories about
the human being do not fit what is actually occurring. A hybrid methodology
of existence and of fiction is required to encompass a new, complex, and
contradictory lived experience. Machinic fictions are ideally placed to
deal with this mess of situations. As JG Ballard has pointed out, Science
Fiction is the literature of the Twentieth Century. This having been realised
as a commonplace, we have seen a frenzy of stitching healthy plastic organs
into a dead patient: SF writers such as Gibson and Stephenson have been
decontextualised to revivify mainstream fiction. Never at one with itself
though, Science Fiction has always been a place for border creatures of
every kind. A speculative fiction that throws up more instruments of speculation
than a looted hospital. As a motor of aberrant reflection it throws the
truths that we are told about the world by endlessly narcissistic father
/ critic / author / authority into the blender. The multivalent paranoia
of Science Fiction writers like Dick or Ballard or Pat Cadigan for instance
is also echoed in the feminist writings of Luce Irigaray where: "The logos
would no longer simply be, for itself, the means of translating his will
alone; of establishing, defining, and collecting his properties into one
Whole. Truth would lose its univocal and universal character. It could
be doubled, for example. At the very least it would have a reverse and
an inverse to shore up its constitution as such. In any case, another,
still hidden, aspect. Or another focus? Another mirror? There would be
no way of knowing which way to look any more, which way to direct the eyes
(of the soul) in order to see properly. People would go crazy." An addiction
to driving themselves or other people crazy throws writers into a directly
political conflict. News Corporation's Asia Star satellite broadcasting
is programmed in Mandarin, the region's numerically and culturally dominant
language. Steamrollering out linguistic variation from space is seen by
Rupert Murdoch as being extremely useful in the furtherance of the global
mind become global clapometer. As he says: "it will be not only prosperity
that we catch in our networks, but also order - and ultimately peace."
This plane of consistency enforces nothing but a peace smoothed by conformism:
a smothering of tongues. Murdoch sends his image into orbit, but there
is an intense fear at the source, the single satellite commanding millions
of reflections of its signal. This is the fear of an alteration in some
mirage that is always on the verge of being deformed, or transformed, but
of which it still claims to be the source. Hypertext One development that
has been thrown up in a literary context as a way of maximising the circumvention
of the annexation of speech by a single unitary source is hypertext. The
institutionally solidified nexus of hypertext studies around George Landow,
Jay David Boulter and Michael Joyce has largely failed to produce anything
much more than smug diagrammatical work-outs of a neutered poststructuralism.
The great thing about hypertext after all is that there can be no masters,
no final word. You can go on. and on. and on. Its interminability may just
have something to do with why it is so favoured by many academics with
an eye for the long chance. Thankfully though, the proponents of the tastefully
interminable find themselves terminated, or at least locked into a loop
by a kind of sorites paradox: how much literariness can you remove or do
without before language ceases to be literature and your status by association
evaporates into just one more breath? Both hypertext and print have the
doubled aspect of striation and smoothness. In a solely hypertexted universe,
someone would have to invent the book. Once hypertext leaks out of the
hands of its apologist priesthood, however, the virulence of its dynamic
becomes apparent. One thing that is particularly important, and that we
relish for instance in the production of I/O/D, is that, because of the strictly
transitory nature of the format, the thing has got to be done for the moment.
As Bruce Sterling notes: "It doesn't matter how brilliant your program
for the Atari 400 is, it's dead. Some huge section of the American populace
sweated blood over software for the Apple IIe and pretty soon it will all
be gone. It's just dreadfully impermanent; it's more like performance art
than literature" Already well beyond the power of its would-be intellectual
protectorate, the dynamic of hypertext - which has to be treated as a conceptual,
or virtual dynamic, rather than one which is 'realised' once and for all
by any particular existing system - is emerging and taking shape as the
result of a massive amount of distributed action. As Sadie Plant states,
embodied as the Nets, this dynamic is starting to 'creep through the screens
of representation which have previously filtered it out.' This is not of
course to say that these screens themselves are not subject to constant
reinvention. Again calling up Irigaray, we can note that, attempting to
reinvigorate the Source, John Perry Barlow, certainly a poet warrior in
the truly classic sense, can glibly assert: "The Internet is female because
it is horizontal." Whilst being on your back ain't of course so bad, I
can think of better ways of getting formulated over than by a barely renovated
third wave control schematic gaining iteration via a superannuated technoshaman.
Forced into maintaining its functionality at the edge of dissolution into
the hall of mirrors, control has had to become fractal, operating in a
recursive manner at many different scales. The insect panic of not being
the Source - the priest of the future - the same fear that composes Murdoch,
is largely what passes for a culture of legitimation of the networks. One
can sympathise of course with the motivation: "Next to domination, ownership
is no doubt somewhat trivial." Blocking up and dismantling the future filtration
systems of the net has been taken up by political activists, notably the
Net Strikes initiated by the Italian group Strano Network. Complementarily,
finding ways to maximise their inevitable leakage has, on the Alt X web
site hosted by Mark Amerika, provided a crucial example of how, in a networked
environment, a writing project differentiates and, more necessarily, conjoins
at times with a publishing project. One effect of the nets, and of this
site in particular, is to encourage a mutant fictionality which currently
could hardly exist in any other context. Stupid At present, technoculture
in the UK tends to find itself most intensely realised on the dancefloor
rather than on the telephone, and techno music provides a vector through
which much of the fiction under discussion in Word Bombs can be explored.
The materialist and instrumental understanding of language meshes with
the compositional imperatives of techno: fast, distorted and brain-damaging
- according to Praxis Records, true hardcore is "…anything that's not
laid back, mind numbing or otherwise reflecting, celebrating, (or) complementing
the status quo". Speaking as a man whose publicists dare not shy from
calling him one of the century's greatest thinkers, George Steiner states
that one of the key paradoxes of the 20th Century is how a person could
possibly spend all day sending people to their deaths in concentration
camps and then spend the evening listening to Schubert. He might also ask
how one could possibly spend the day part-time teaching at a university
and then spend the evening listening to pre-digested sonic baby-food. What
he should ask instead is: how could one not? It is always those who imagine
themselves the most refined that are the most pathetic and the most absently
vicious. "Good taste is the framework of values which the dominant group
uses to keep itself on top. Rather than just an exploration or celebration
of superiority, it is an active agent of repression". For this position
to be kept in perpetuity demands up-to-the-minute camouflage. Enter the
Hampstead Junglist. 'Intelligent' jungle / techno / drum and bass is an
aural equivalent of 'painterly' painting. The display of a tastefully broken-in
repertoire of twitches, blotches and dribbles correlates to a music where
'interestingness' becomes a chore. It certainly never threatens to overthrow
itself into anything approaching danceability. Or stupidity. The vat-bred
fictional equivalents of this street-tweaked mush are the stock-in-trade
of literary publishing. More than mere novels - they are by-products of
a well-disciplined self: intuitive, spontaneous and replete with a strictly
inculcated ability to dither on just the right side of any boundary. Not
surprisingly then, the yawnsomely canonical intertextuality of much postmodern
fiction gets the stinky finger in favour of texts more along the lines
of that described by DJ Deadly Buda in his Morphing Culture manifesto,
hyping a turntable scenario that is seriously threatening to the continued
stable identity of anyone expecting what they're expected to expect: "Jumble
break to the best part of every song - that jet engine take off, that good
ole football crowd noise, the explosion at the beginning of every KISS
live album, that nutty pre-acid house Balearic movement… …how often do
you wish the musicians would just give it up and make a whole song out
of all those kool sounds?" As TechNET have shown, and like Jazz before
it, electronic dance music seems set to provide a hardcore methodology
that will find in fiction a perfect medium for its mutation and transmission.
Killa tactics of synthetic intelligence leaking down the wires, bass-bins
thickening up the air into a gel, hardcore methodology routing itself via
the keyboard, spread as thick black ink on the page: an unexpected smile
in a dark room. Moses, down from the turntables and still sweating, hands
you a flyer. In the savage flicker of a strobe, it seems to make some kind
of sense:
1 pace "Speed is not unimportant. Never read anything over one hundred pages in length".
2 "What? You search? You would multiply yourself by ten, by a hundred? You seek followers? Seek zeros!"
3 what is hardcore? The Cheetah Chrome Motherfuckers are a hardcore band.
Copacetic huh? Feedback
from media previously firewalled apart routing itself through a whole seething
mass of others. Synthesising machines routing themselves through anything
a nervous system can carry, forcing the frenzied virulence of hypertext
into a focused hit of words-in-a-row. Amputation Cutting the legs from
underneath a purely stylistic interpretation of what I am suggesting, Hardcore
Methodology - being nothing itself but a stupid joke - just laughs at its
inevitable clip-art reiteration. Already of course we are seeing cases
of weakly flava'd repetitive strain injury afflicted narrative arriving
by the full-colour coverload onto the shelves of bookshops. Books that
are undifferentiable from their neighbours except for the meagre evidence
of being run through a club culture filter in a UK echo of the inability
of the dumber proponents of Avant Pop to stop themselves dropping MTV and
NWA into every essay. Not because they realise the fact that three-letter
acronyms are intrinsically cool, but because in the case of the long gone
NWA this is the only rap group they've ever been told they should like.
Cut it off. Hook fictionality up to a cultural dynamic that is pro-experience
and therefore mistrustful of 'explanations'; that is hooked on that experience
and its intensification and not afraid to push things and mutate to be
so, (despite having been heavily criminalised for the benefit of the churchyard
faced upholders of 'public decency' and their everluvin twin - those whose
salivary glands go into a spasm at the possibility of herding the carnival
into their expensive, star-focused, tightly patrolled enclosures); that
is intent on finding ways of being social, and is not afraid not to compromise
in order to keep things going. As the hardcore methodology of dance plugs
into the hardcore methodology of fiction it might learn a few things too.
Not the least of which might be getting an (unreverential) rush off of
those on the wrong side of its age barriers. Here comes the Old Skool -
and no shell toes? Whilst most writers could do worse than produce an encore
to the typing of their precious manuscript with an equal number of strokes
of the delete key, we are approaching the release of a fiction that swallows
up the processing of words into the mutation engine of the sound studio.
All the adaptivity and dysfunctionality of language is reinvigorated, thrown
into communication. Hardcore methodology, mashed up in the machines, induces
a depth and power of revelation. Infesting fiction with the disturbing
noise at the depths of language; and always yet another, still hidden,
aspect. Another focus. Another mirror. Another rhythm. There is no way
of knowing which way to look any more, which way to move your body in order
to hear properly. People are going crazy. ================ WARNING: This
Computer Has Multiple Personality Disorder by Simon Pope and Matthew Fuller
- IntrODuction This paper comes largely out of our experience in the production
of the hyperactive electronic zine I/O/D. So firstly then, we should explain
what I/O/D might be. Technically it is a Macromedia Director Projector
with associated files that is small enough to be compressed onto one High
Density disk. That we choose the size to be restricted by the limitations
of the most mundane and cheapest storage device is important, because it
means that I/O/D is very easy for people to copy for their friends - or
surreptitiously leave on the computers of their enemies. It also means
that because of its relatively small size it is quite feasible for it to
be made available over computer networks such as the internet and on Bulletin
Board Services. Distribution over the networks is in fact the major way
in which I/O/D gets moved around. It is also worth noting that within the
internet, where degrees of access are stratified, we make I/O/D available
via a variety of protocols: ftp, gopher, and world wide web, in order to
ensure that as many people as possible have the option of downloading it.
Alongside the sites that we maintain a direct connection to, we are encouraged
to find that I/O/D is also being independently distributed by people we
have had no contact with. Additionally, we should state that I/O/D is not
on the nets in order to advertise anything but itself. It is specifically
an anti-elitist contribution to the development of the nets as a 'gift
economy'. Consequently, it is also a way of producing some effects whilst
avoiding getting too enmeshed with the humourless circus of reputation
and career making that the techno-theory genre is fast becoming. I/O/D
is put together by a production team of three, based in Cardiff and London
and also aided and abetted on the nets by Calum Selkirk, based in Chicago
- a relationship obviously made possible through computer mediated communications.
Individuals or groups whose work we feel an affinity with either contribute
independently or are asked to submit some work. In most cases these will
be people who do not have specific knowledge of multi-media design but
whose activity as text, graphics or sound-makers correlates with some of
the dynamics we are playing with in the construction of I/O/D. Before we
return to a more detailed discussion of I/O/D though, we need to situate
it within an episodic context of some ongoing antagonisms around the nature
of a technologised physicality. - Inhuman Potential The mind as an interface
is no longer viable. McLuhan's assumption that the media-net would become
merely an extension of the human nervous system with the humanoid core
remaining its 'same old self' has provided a touchstone for both the liberation
rhetoric of writers such as Howard Rheingold and for tele-vangelists seeking
the redemption of the free market through the virtual corporation: a model
of business as the management of flows that is at once homely and sublime
- yet not of course lacking in sadistic perks. Envision, if you can stomach
it, Nicholas Negroponte, graced by smart cufflinks "communicating with
each other via low orbiting satellites," each with "more computer power
than your present PC". The human-in-control becomes a neurological disaster
area. Can any amount of attention from ambient computers dispensing technological
anaesthetics stifle the screaming pain of communication: Rwanda on line
three. - Surgical Strike Cartesianism Hans Moravec gathers together all
the kilobytes of his consciousness and downloads into your smart cufflinks,
announcing: "Body-identity assumes that a person is defined by the stuff
of which a human body is made. Only by maintaining continuity of body stuff
can we preserve an individual person. Pattern-identity, conversely, defines
the essence of a person, say myself, as the pattern and the process going
on in my head and body, not the machinery supporting that process. If the
process is preserved, I am preserved. The rest is jelly." Imagine how good
this would sound to Walt Disney stuck in a freezer somewhere in California.
Disembodied intelligence of this kind is always a con. If these glowing
elite minds migrated into data-space we can be sure that at some point
they would have to recognise a co-dependency with the material world, one
composed primarily of minerals, electromagnetic sensation perhaps, and
a new kind of physicality would emerge - possibly something akin to what
Ballard imagines in his repeated metaphor of the supersession of 'civilisation'
by the crystalline. The mind always emerges from the matter. The entropic,
dirty, troublesome flesh that is sloughed off in these fantasies of strongly
masculine essentialism is implicitly interwoven with the dynamics of self-processing
cognition and intentionality that are relegated to a substance called "mind"
- as Kevin Kelly points out in 'Out of Control': "We know that our eyes
are more brain than camera. An eyeball has as much processing power as
a supercomputer. Much of our visual perception happens in the thin retina
where light first strikes us, long before the central brain gets to consider
the scene. Our spinal cord is not merely a trunk line transmitting phone
calls from the brain. It too thinks. We are a lot closer to the truth when
we point to our heart and not to our head as the centre of behaviours.
Our emotions swim in a soup of hormones and peptides that percolate through
our whole body." Moravec's idea of the self as pattern repetition is echoed
rather differently by another cybernetician, Norbert Wiener, "We are but
whirlpools in a river of ever flowing water. We are not stuff that abides,
but patterns that perpetuate themselves". And out of this river, perpetually
muddied with peptides, hormones, immune response systems, viruses, pesticides,
sugars, and illicit substances emerges the cognitive body. However, lest
this should materialise as a 'holistic essentialism' that swaps meat-fearing
disembodiment for a dread of the machinic body, we should move on to acknowledge
that homo sapiens evolved as a result of a deep, co-evolutionary intimacy
with the 'inhuman', with tools, with the machinic. At the very core of
our development as a species is the gradual bootstrapping of the brain,
the supposed Slot In Memory Module, which according to neodarwinian evolutionary
theory is itself possibly the result of a possibility-space opened up through
the development of the opposable thumb. A mutation in one part of the
body, with far-reaching side effects on all others, that opens it up to
a combinatorially explosive array of relations with other forms of matter.
Thus, we are always already deeply post-human. That information processing
technology is being touted as the 'next opposable thumb', generating the
possibility-spaces that we are currently living through, does not of course
lead us in an automatic loop back to a glorious disembodied life on the
outer reaches of some computer's sub-directory. A survey of most contemporary
multimedia work, however, might convince us otherwise. From Automated Telling
Machines, through the freebie CDs on the covers of computer magazines;
corporate presentation material and 'high-end' games such as Myst, contemporary
multimedia constitutes presence in relationship to this post-human body
as a process of exclusion. What we mean by this is not that your much-prized
beigeware friend is some kind of digital Schengen Area that cruelly excludes
your disruptive meat, or that we need to start picketing the offices of
Apple for myoelectric implants to be packaged with every CPU in place of
a mouse, keyboard and monitor, but that the models of presence that do
come bundled (but tellingly unseen) with most current multimedia incorporate
highly stratified and tightly channelled notions as to what this relationship
might be. Computers are embodied culture, hardwired epistemology, and in
the area we are focusing on two parallel sequences are occurring. They
are implicitly related but whilst twisting in and out of each other, operate
in different ways. The bureaucratisation of the body into organs and the
privileging of the eye in multimedia is one. The renewal of encyclopedism
is the other. - The bureaucratisation of the body into organs Much has
been made of the notion of the eye as primary organ (and primary also
in the genitive sense) around which bodies (literally) organise. From
Dziga Vertov's cruise missile approach to Berlin, to anti-sex feminism's
abhorrence of pornography, the eye is seen as a unifying and explanatory
medium in its own right. Perhaps a certain apotheosis of the privileging
of the eye is reached in the writings of Guy Debord where he assigns
an immense life-expropriating power to 'The Order of Appearances'
whilst simultaneously positing a different type of image, the printed word,
(of his writing of course) as the catalyst for the destruction of this
world of relations mediated by images. Sight is the most theorised, most
contested over, yet in some ways least contested of the bureaucratised
senses. Within multimedia, the desire to transfer information without transforming
its integrity has remained strong, and the senses have been prioritised
and valorised in order that this system should work efficiently. With the
eye situated as the locus of authority, assurance is passed to the other
senses, which are called upon to further validate the evidence presented
before them. Following the sales mantra "image, text, sound, video", graphical
interfaces reinforce this rigorous separation of the senses into a hierarchy
of feedback devices. In other words, as you will see when using anything
from Grolier's Multimedia Encyclopaedia to the Arts Council's Anti ROM
interaction is fed first and foremost through the circuits of sight. Within
the sight-machine of contemporary multimedia then, the mind has to be re-thought
or re-programmed as a simple processor of Information Graphics. Once recognised
and regulated, sense can be made and order imposed on data; it can be subjected
to the strictures of simple structuralisms where sign = signifier and all's
well with the world. Under the heading comes the sub-heading, under which
comes the sub-sub-heading, until all complexity can be understood at a
glance from somewhere outside the filing cabinet... Through this representation
stacking, it is hoped that a mind-melding transparency can be achieved:
interfacing the disembodied mind and disinterested data. The mind is immersed
into the encyclopaedic data-space, as charmingly freed from visceral distractions
as a bottle of deodorant. That the eye sloughing off the cankerous meat
in an attempt to fuse mind and data, one electronic pulse with another,
chooses to confirm its conferred status shouldn't be a surprise. The eye,
released from constraint, with a mind of its own, 'can take any position
it wishes'. What is remarkable is that this pursuit of the monadic eye
realises itself in most contemporary multimedia as nothing much more than
a subset of behaviourism: with users barking, whining, and slathering at
the interminable (once in a lifetime) chance to point and click their
path to dog-food heaven. - The Renewal of Encyclopaedism: Pavlov's Menu
At the centre lies the desire for the enforcement of meaning. The encyclopaedic
organisation of data preserves a point of privilege from where the eye
can frame the objects of its desire. There are no obstacles in this space,
only straight paths cleared to open unimpeded vistas. Within this space,
intention steps toward the user, to be understood without the hindrance
of cumbersome literary convention. All can be conveyed from within the
universal iconic language, a visual, pre-linguistic key, clearly carrying
reference to the ciphered world. This virtual architectural space has been
constructed by an unseen author, whose intention is usually to impose a
closure to a narrative, to provide the goal to be reached by means of one
of many approaches. The reader/user/participant/player (choose according
to theoretical preference) can wander, but must not stray from the intended
thoroughfares. From any point it is possible to look back along your path,
holding on to Ariadne's thread, taking solace in the fact that all you
have seen is recorded, marked, referenced and ultimately retraceable. As
an aside, the theoretically critical academy has in parts also been enthused
by the possibility of hypertext under the rubric of the Renewal of Encyclopaedism.
Through the would-be Grandpappy of Hypertext Studies, George Landow, we
are already seeing a drive to standardise linking protocols and the types
of connection that can be made from text to text; the centre is already
attempting to ossify meaning production into a regulated and standardised
practice. Don't worry, be happy - everything is under Control. Rather, then,
than urge multimedia as potential grounds for the renewal of spectatorship,
representation and simulation, terms borrowed most closely from cinema
and devolving power to the primal eye, or to engage in the Renewal of Encyclopaedism's
drive to suburbanise multimedia, we are perhaps more interested in developing
something that is synthetic. Specifically: a process of playing with process.
- material processing "Leroi Jones (aka Amiri Baraka) once made the comment
that what black people needed was a typewriter that responded not only
to the hand but to gestures. That way, said Jones, Black people's full
involvement in their lived space could be shown and not the pale white
version which he claimed writing alone gave." We would like to suggest
that this comment has resonance beyond the important and suggestive point
that Baraka makes here. Configurations of flesh that have been disarticulated,
that are The Unspeakable, are particularly attractive to us. With I/O/D
we are in part attempting to articulate some of those configurations that
have been erased from the multimedia vocabulary. However, with a nod to
Anti-Oedipus, and as a concession to anyone who has had the fucked up experience
of using I/O/D, we must give a body-swerve to some of the essentialism
that Baraka's statement avers and note that "Desiring-machines only work
when they break down." In disrupting notions of a 'transparent' interface,
and in investigating the possibilities of physicality in multimedia, we
are not therefore proposing to formulate any new paradigm of multimedial
correctness. Nor do we find, as with any amount of 'artists', that merely
scattering computers, camcorders, movement sensors and monitors around
a gallery in a vague utopian gesture towards interactivity deserves any
response but stolen equipment. We propose neither a new disciplinary regime
nor an abstract vacant 'creativity'. If meaning-construction always takes
place at the margins of a system - and meaning-enforcement at the centre
- then computer networks, where margin is folded in upon margin, in an
economy of fecund, yet excremental exchange are currently a useful place
to find oneself. In part it is this sense of margin rippling into margin
that I/O/D as a border zone process attempts to play with. What has been
marginalised as incidental in behaviourist multimedia: the flitting of
a user's hands over the keyboard, the twitching of the mouse, repetitive
or arrhythmic events, noise, confusion... accretes into a secret history
of makeshift, unimagined conjunctions. I/O/D then is an intensely haptic
space. In issue two for instance, the arrow-head cursor is largely abandoned
and replaced both by position indicating sound and by the springing into
life of the sprite that it would previously have been needed to animate.
Within the boundaries dictated by the hardware of an average Macintosh
computer we are coaxing out what has been disarticulated: different types
of mouse movement; exaggerated clicking routines; the slashing and burning
of Macintosh Operating System norms; larger than screen interfaces; repetitive
strain injury; sloppy directories; a physicality of multimedia that correlates
with what Ronald Sukenick has termed "fertile nonsense"; the feeding back
of an action in one sense into another to produce a cross-wiring synaesthesia...
And it is perhaps as synaesthetics, the neurological disordering of smart-cufflinked
control, that within the abstract machine what we have here reviled - text
and image 'as truth', the renewals of spectatorship and encyclopaedism,
the privileging of the eye - will lose themselves as the prime loci of
authority to be superseded by pattern finding and dynamic engagements with
material processing. A dynamic that at once both infests bodies and that
actually opens itself up to positively engaging with a bodily contamination
that has always been operative, but bubbling away in the background. I/O/D
is available to download from the internet at: http://www.pHreak.co.uk/i_o_d/