ahistoricality in software and architecture (fwd)
Mon, 9 Aug 1999 15:06:00 -0400 (EDT)
It's still a rough draft, as are most things I send to kragen-tol.
<email@example.com> Kragen Sitaker <http://www.pobox.com/~kragen/>
Mon Aug 09 1999
91 days until the Internet stock bubble bursts on Monday, 1999-11-08.
---------- Forwarded message ----------
Date: Mon, 9 Aug 1999 15:04:22 -0400 (EDT)
From: Kragen <firstname.lastname@example.org>
Subject: ahistoricality in software and architecture
Here's a rather rough draft describing some interesting connections
between architecture and computer science.  I thought you might be
interested.

Phil Agre is fond of pointing out computer science's ahistorical
perspective. In an RRE article entitled 'notes and recommendations'
from 1999-06-16, currently available at
http://www.egroups.com/group/rre/1110.html, he writes:
    Computer scientists are much more tuned to usability-in-depth,
    at least a particular kind of depth, one that is suited to the
    design of extremely complicated artifacts.  But they have an
    ahistorical, acontextual understanding of their "user".  They
    believe (if only tacitly) that "usability" is a property of the
    artifact itself, and not a relation between the artifact, the
    user, the context, and the culture.  As a result, they are
    prone to getting caught in historical traps: because everyone
    is accustomed to one paradigm of interaction, that one paradigm
    is regarded as inherently more "usable" than the ones that
    nobody is accustomed to.

Later, in another 'notes and recommendations' on 1999-07-04, currently
available at http://www.egroups.com/group/rre/1121.html, he writes:
    So the point isn't that individual computer scientists are
    consciously trying to portray users in ahistorical terms.
    Indeed in many cases they are trying *not* to.  But to the
    extent that their system of institutionalized language and
    practices presupposes an ahistorical approach, the
    practitioners will nonetheless, for most purposes, produce that
    outcome in the end.

I think this tendency comes from mathematics, which, as Agre writes,
also has a strong tendency to view its objects of study ahistorically.
The parts of computer programming that constituted "computer science"
in its earliest days were really branches of mathematics.  In a really
superb paper dated 1998-11-09, entitled "The Practical Logic of
Computer Work", presently available at
http://www.egroups.com/group/rre/956.html, Agre writes:
    In his reading of John Dee's sixteenth-century commentary on
    Euclid, Knoespel (1987) describes what he calls the "narrative
    matter of mathematics": the discursive work by which an
    artifact such as a drawing on paper is exhibited as embodying a
    mathematical structure such as a Euclidean triangle.  This
    process, which is familiar enough to modern technologists as
    the work of mathematical formalization, consists of a
    transformation of language into something quite different from
    language, namely mathematics, and its challenge is to
    circumvent or erase those aspects of language which are
    incompatible with the claim of mathematics to represent the
    world in a transparent, unmediated, ahistorical way.

    . . .

    Brian Smith (1996) makes a similar point when he speaks of
    "inscription errors": inscribing one's discourse into an
    artifact and then turning around and "discovering" it there.

    . . . the social reality of the square resides in a reflexive
    relationship between designer and user, and the success of that
    relationship lies precisely in the achieved unremarkability of
    that relationship and the possibility of routinely locating the
    "squareness" in the artifact itself.

    . . . A computer is, in an important sense, a machine that we build
    so that we can talk about it in a certain way; it is successful
    if its operation can be narrated using a certain vocabulary.
    As the machine is set to running, we too easily overlook the
    complicated reflexive relationship between the designer and
    user upon which the narratability of the machine's operation
    depends. . . .

    . . . the practitioners of AI with whom I worked for several
    years, and among whom I was trained, regarded formalization
    very differently, as a means precisely of freeing their work
    from what they regarded as the unruliness and imprecision of
    vernacular language.  As a result, I want to argue, the field
    has been left with no effective way of recognizing the systemic
    difficulties that have arisen as the unfinished business of
    history has irrupted in the middle of an intendedly ahistorical
    . . .

Agre goes on to discuss the particular ways that cultural and
contingent things, such as a tendency to conflate the realms of
Platonic ideals and concrete activity, have found their way into the
standard AI theories. (It's a very interesting paper; I'm afraid I
don't understand it well enough to summarize it.)
Agre came from the late-1970s/early-1980s Lisp-hacking AI community at
MIT, which was apparently well-described by the beliefs in
ahistoricality and acontextuality he mentions above.
Richard P. Gabriel, one of the founders of Lucid, came from the same
background. In his widely-read 1991 paper, "Lisp: Good News, Bad News,
How to Win Big", he has a section entitled "The Rise of `Worse is
Better'", which can be read at
http://xent.ics.uci.edu/FoRK-archive/summer96/0591.html, among other
places. He contrasts two schools of software design, which he calls
the "MIT/Stanford" and "New Jersey" schools.
From his description, the "MIT/Stanford" style of design is
characterized extremely well by Phil Agre's description. It is focused
on building ahistorical, mathematical artifacts. In contrast, the "New
Jersey" style is focused on building immediately-useful tools.
Gabriel says he thinks the "New Jersey" style -- which he labels "worse
is better" -- is likely to produce software that more people use, simply
because it is easier to port, performs better on slow and small
machines, and therefore becomes ubiquitous.
So Gabriel, at least, thinks this pseudo-mathematical method of
designing and building software tends to produce software that isn't
widely used, because a more pragmatic approach tends to produce
software that is more useful to more people, even if it is technically
(ahistorically, mathematically) inferior.
The current hot topic in the trenches of programming is something
called "Design Patterns".  Gabriel was one of the original promoters of
the "patterns" idea in the software business.  The idea, basically, is
to document the ways people have successfully resolved particular
combinations of requirements (or "forces") in the past, so that when
you recognize one of those combinations in your own work, you can
apply a solution (or "pattern") that is already known to work well.
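As a concrete illustration, here is a minimal sketch of one of the
best-known software patterns, Observer, in Python.  The class and
method names here are my own invention for the example, not from any
particular patterns catalog:

```python
# Observer pattern: a recurring solution to the "forces" of needing
# several parts of a program to react when one object's state changes,
# without the changing object having to know who its dependents are.

class Subject:
    """Keeps a list of observers and tells them when something happens."""
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)

class Thermometer(Subject):
    def set_temperature(self, degrees):
        self.degrees = degrees
        self.notify(degrees)  # dependents react; we don't know who they are

log = []
t = Thermometer()
t.attach(log.append)  # any callable can observe
t.attach(lambda d: log.append("alarm!") if d > 100 else None)
t.set_temperature(120)
print(log)  # -> [120, 'alarm!']
```

The point of writing this *up as a pattern*, rather than just writing
the code, is to record the circumstances -- the combination of forces --
under which the solution is appropriate.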
The originator of these ideas was Christopher Alexander, a building
architect.  He laid them out in a series of books in the 1960s and
1970s, and they go beyond what is described above: they include a
process of collective design by the users of a building together with
its architects, a process of gradual adaptation of buildings to
people's needs as those needs are discovered, and a process of
involving architects in actual construction to teach them about the
materials.  In other words, Alexander explicitly recognizes the
contingent, socially-constructed nature of architecture and elevates
it to a position of prime importance in his method.
Alexander's work was, to some degree, a reaction to a pathological
obsession with mathematical perfection and stark simplicity, to the
exclusion of human needs, that has afflicted the architecture of this
century -- the movement its proponents call Modernism.  This disease of
Modernism is curiously like the obsession with mathematical perfection,
to the exclusion of human needs, that has afflicted computer science.
Yet Alexander's goal, with his "Pattern Languages", is to achieve a
"timeless quality" in his buildings.  His first book on the subject
is entitled _The Timeless Way of Building_.
So the reaction against ahistorical, inhuman design aspires, in the
end, to timelessness.  Isn't that funny?
(Larry Wall's speeches on Perl and postmodernism are relevant here,
too.)