Postphonetic Writing and New Media

Lydia H. Liu

Postphonetic writing has inaugurated the future of new media. As the technology continues to evolve and morph into something we may not yet know how to characterize, one of the first things we should interrogate is the idea of the phonetic alphabet. Inasmuch as the alphabet lies at the foundation of our literacy, literary theory, linguistics, and information theory, the theoretical implications of this construct need to be rethought in light of the advent of postphonetic writing and new media. Is the alphabet necessarily phonetic? This somewhat facetious question leads us to that other enduring, but contentious, issue which had troubled the philosopher Jacques Derrida: What is writing?

Derrida’s insistence on the primacy of writing is well known but somewhat curious from this perspective because it coincides with the development of biocybernetics and the discovery of the genetic code. On closer inspection, what seems like a coincidence is actually the philosopher’s reaction to the news of biocybernetics. Derrida evoked the ‘information within the living cell’ and ‘the cybernetic program’ to elaborate the notion of the grammè or grapheme in his essay ‘The End of the Book and the Beginning of Writing’.1 More interestingly, he treated the biocybernetic developments of his time as contemporary instances of a generalized ‘writing’ that would seem to suggest radical possibilities for the project of critiquing Western metaphysics. This attempt to fold biocybernetics into grammatology raises the issue of whether the so-called ‘information within the living cell’ can supply the kind of evidence Derrida was looking for or whether it exemplifies the same rhetorical loop he was unraveling elsewhere, in particular with respect to the European metaphysical tradition. No doubt the decades-long deconstruction of logocentrism has proven extremely fruitful in clearing the way for innovative views of writing, but it is time, I believe, to reassess the critical project of grammatology and its relevance for writing technologies.

It is often said that the technology of writing has been instrumental in the making of cities, empires, civilizations, and long-distance trade and communication over the past millennia, and that it has brought about electronic global capitalism and increasingly networked societies in our own time. Nietzsche made his prescient remark in 1878 that ‘The press, the machine, the railway, the telegraph are premises whose thousand-year conclusion no one has yet dared to draw’.2 In this Nietzschean picture of future technologies, writing clearly dominates. The sheer amount of written and printed records, and of electronic information stored in data banks, libraries, museums, archival centers, and global communication networks, indicates the profound degree to which writing has transformed our lives and consciousness. But apart from a general consensus concerning the power of writing as technology, everything else seems up for grabs. Contemporary theorists who continue to work under the shadow of Marshall McLuhan exhibit a tendency to take alphabetical writing for granted even as they analyze its relationship with print technology on the one hand and with electronic media on the other. The slowness in recognizing the metamorphosis of alphabetical writing across the disciplinary divide has prevented us from knowing exactly how a given idea of writing migrates from discipline to discipline. For instance, did Claude Shannon and Roman Jakobson share the same view of the alphabet? How was postphonetic writing invented? Why was this writing deemed necessary by engineers of communication systems? Where does it stand in the making of biocybernetic systems?

If informatics and linguistics each depart from different assumptions about writing, they must arrive at rather different results in view of the ambiguous identity of alphabetical letters with respect to phonetics, visuality, and spatiality. Whereas modern linguistic theory has tended to perpetuate the phonocentrism of European comparative philology, algorithmic thinking has always revolved around the ideographic potentials of alphabetical writing thanks to the non-phonetic character of mathematical symbolism. In other words, writing persists in algorithmic thinking in spite of the linguistic sign.

In a recent study that I devoted to exploring the interrelations of James Joyce, Claude Shannon, and Derrida, I tried to draw attention to one of Shannon’s theoretical constructs called ‘Printed English’. Shannon conceived of Printed English as an ideographic alphabet with a definable statistical structure, composed of twenty-seven symbols: the letters A to Z plus a ‘space’ sign. Printed English entails a symbolic correspondence between the twenty-seven letters and their numeral counterparts and has nothing to do with phonemic units in the spoken language. As a postphonetic system, this statistical English functions as a conceptual interface between natural language and machine language. As one of the most significant inventions since World War II, Printed English is a direct offspring of telegraphy because it is based on a close analysis of Morse code conducted by Shannon himself. The novelty of his Printed English lies not only in its mathematical elegance for encoding messages and designing information systems beyond Morse code but also in its reinvention of the very idea of communication and of the relationship between writing and speech.3 Printed English functions as postphonetic writing precisely in this alphanumerical sense, with profound implications for what Walter Ong has called ‘secondary orality’,4 because it refigures the biomechanics of human speech in such a way that sound and speech can both be produced, rather than reproduced, as artifacts of AI engineering, the example being TTS (text-to-speech) synthesis.5
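The alphanumerical character of Printed English can be made concrete in a minimal sketch, assuming nothing beyond Python’s standard library: a twenty-seven-symbol alphabet mapped onto numerals, with a first-order entropy estimate of the kind Shannon calculated from letter frequencies. The sample text, function names, and bare frequency count are illustrative conveniences, not Shannon’s own notation.

```python
import math
from collections import Counter

# The 27-symbol alphabet of Printed English: A-Z plus the 'space' sign.
ALPHABET = [chr(c) for c in range(ord('A'), ord('Z') + 1)] + [' ']
CODE = {symbol: index for index, symbol in enumerate(ALPHABET)}  # A->0 ... space->26

def encode(text: str) -> list[int]:
    """Map a text onto the 27-symbol alphabet, ignoring any other character."""
    return [CODE[ch] for ch in text.upper() if ch in CODE]

def entropy_per_symbol(text: str) -> float:
    """First-order Shannon entropy (bits per symbol) of the symbol frequencies."""
    symbols = [ch for ch in text.upper() if ch in CODE]
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = 'PRINTED ENGLISH TREATS THE SPACE AS A SYMBOL LIKE ANY OTHER'
print(encode(sample)[:10])                    # [15, 17, 8, 13, 19, 4, 3, 26, 4, 13]
print(round(entropy_per_symbol(sample), 2))   # lower than the maximum log2(27), about 4.75
```

The point of the sketch is that the mapping operates on discrete printed symbols, the ‘space’ included, without ever consulting sound.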

It is worth pointing out that the ‘space’ symbol in Printed English is a conceptual figure, not a visible word divider as is commonly observed in some writing systems. The centrality of the printed symbol for technology has been well captured by Friedrich A. Kittler as follows: ‘in contrast to the flow of handwriting, we now have discrete elements separated by spaces’.6 The letter ‘space’ owes its existence to the statistical, rather than visual or phonemic, parameters of symbols. It has no linguistic meaning insofar as conventional semantics is concerned, but it functions as a meaningful ideographical notion. This point is difficult to grasp, however, until we tackle the long-standing attribution of difference among non-alphabetical writing systems along the spectrum of pictography, ideography, and phonetic writing.

Ideographic writing has long been opposed to the phonetic alphabet as its non-phonetic other. This binary thinking exemplifies a metaphysical turn of mind that Derrida tried to dismantle, although the exact relationship between the two appeared to elude his grasp for reasons I do not have the space to elaborate here. For a preliminary understanding of the subject, the first thing to do is NOT to associate ideographic inscription too quickly with the Chinese script.7 Despite the various claims to the contrary, the written Chinese character can no more be equated with ideography, much less pictography, than alphabetical writing can be reduced to phonocentrism. We must remember that ideographic inscription has been a European idea, like that of the hieroglyph, which would be foreign to the Chinese scholars who have written voluminously on the subject of the zi (individual character) or the wen (text/writing) over a period of two thousand years.8 The equating of the Chinese script with an ideographic system has been the unfortunate result of misunderstandings and motivated translations by early Christian missionaries and linguists who were poor intermediaries when it came to reporting on the state of Chinese writing to their home audiences and to unsuspecting philosophers. The situation has not improved much since the time of Leibniz.

But there is no reason why one should dismiss ideographical writing as a false idea. Even if this notion fails to inform us about the Chinese script, it has enjoyed a productive career in the West with a penchant for prolepsis, that is, a dream that someday alphabetical writing would be able to shed its local phonetic trappings to become a universal script. It is this Leibnizian dream of transcendence that has given ideography its aura of alterity in Western thought, so one can continue to fantasize about direct graphic inscriptions of abstract thought the way mathematical symbols or deaf reading and mute writing transcribe conceptual objects, namely, without the mediation of speech or sounds.

That aura appears to have persisted with or without the help of the Chinese script. More recently, a new course of events began to speed things up and brought the centuries-long pursuit of the universal script to a halt, if not to a sense of closure. I am referring to the cracking of the genetic code by molecular biologists in the latter half of the twentieth century. This monumental event and the subsequent mapping of the human genome have marked a turning point in how some of the basic questions about life, humanity, reproduction, social control, language, communication, and health are to be posed or debated in the public arena. These events are happening at a time when conversations between scientists and humanists are made ever more difficult by the nearly insurmountable disciplinary barriers and institutional forces that are there to shield the scientist from the critical eye while keeping the humanist away from the production of objective knowledge.

Despite the difficulty, the news of the genetic code has given rise to a number of major critical studies by humanistic scholars who took upon themselves the task of scrutinizing the discourse of coded writing as a master trope in molecular biology. Inasmuch as the discipline of molecular biology did not come into its own until the midst of the Cold War, many of these studies are devoted to examining how the vast resources of the military-industrial-academic complex of the United States were put in the service of a new vision of weapons technology and a new ontology of the enemy in the form of information theory and cybernetics. These studies demonstrate how the path-breaking discoveries made by Norbert Wiener, Shannon, John von Neumann, George Gamow, and others in cybernetic warfare and cryptography inspired the first generation of molecular biologists to transcribe and translate the biochemical processes of living cells and organisms as coded messages, information transfers, communication flows, and so on. Whereas the mathematician relied on the logic of cryptological decoding to unlock the enemy’s secret alphabet, the molecular biologist searched for the letters, codons (words), and punctuation marks of the nucleic acids to decode the speechless language of DNA in the Book of Life.9
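The metaphor of letters, words, and punctuation marks can likewise be made concrete in a minimal sketch of reading DNA three letters at a time. The toy codon table below contains only a handful of entries from the standard genetic code, chosen for illustration; the function name and sample string are mine.

```python
# A minimal sketch of the codon-as-word metaphor: DNA read three letters at a
# time, each triplet 'translated' into an amino acid. Only a few entries from
# the standard genetic code are included here, for illustration.
CODON_TABLE = {
    'ATG': 'Met',  # methionine, the usual 'capital letter' that opens a gene
    'TTT': 'Phe', 'GGC': 'Gly', 'GAA': 'Glu', 'TGG': 'Trp',
    'TAA': 'STOP', 'TAG': 'STOP', 'TGA': 'STOP',  # the 'full stops'
}

def translate(dna: str) -> list[str]:
    """Read a DNA string as three-letter 'words', halting at a stop codon."""
    words = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE.get(dna[i:i + 3], '?')  # '?' marks codons outside the toy table
        if amino == 'STOP':
            break
        words.append(amino)
    return words

print(translate('ATGTTTGGCTAA'))  # ['Met', 'Phe', 'Gly'] -- then the 'period'
```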

As the digital revolution dissolves older conceptual boundaries and introduces new ones, the spatial/temporal coordinates of a future cognitive world will emerge from the ever intensified interdependence of human and machine, or similar kinds of prosthetic conditions enabled by digital media. Of course, the numerical function of the alphabet has always been there since its invention, but we are so addicted to thinking of alphabetical writing as a phonetic system of transcription that Shannon’s treatment of the English alphabet as a total ideographic system may still come as a shock. Alphabetical writing is one of the oldest technologies in world civilization and has become more thoroughly and universally digital and ideographic than it ever was. But what is happening to non-alphabetical writing systems in the meantime? An incontrovertible fact has been thrust upon our attention; namely, digital technology is turning non-alphabetical writing systems such as Chinese into sub-codes of global English via Unicode. It is as if a new metaphysics of communication had emerged on the horizon of universal communicability through Printed English.
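A minimal sketch can make the point about sub-codes concrete: in Unicode every character, Latin or Chinese, is assigned a numeric code point, but in the dominant UTF-8 serialization the ASCII letters keep their legacy one-byte values while a Chinese character is spelled out as a multi-byte sequence. The snippet uses only Python’s built-in functions; the choice of the character 字 (zi) is mine.

```python
# How Unicode carries a Chinese character alongside the Latin alphabet: each
# character receives a numeric code point, then a byte serialization (UTF-8)
# in which ASCII letters keep their one-byte values while the Chinese
# character is spelled out as a multi-byte sequence.
for ch in ['A', 'z', '字']:   # '字' (zi) is the character for 'character' itself
    print(ch, hex(ord(ch)), ch.encode('utf-8'))
# A  0x41    b'A'              -- one byte, identical to its ASCII numeral
# z  0x7a    b'z'
# 字 0x5b57  b'\xe5\xad\x97'   -- three bytes built around the same code space
```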

Commenting on the state of metaphysics, Martin Heidegger offered a number of fascinating reflections on the implications of cybernetics for philosophy in general. In his essay ‘The End of Philosophy and the Task of Thinking’ (1969), Heidegger pointed out:

No prophecy is necessary to recognize that the sciences now establishing themselves will soon be determined and steered by the new fundamental science which is called cybernetics. This science corresponds to the determination of man as an acting social being. For it is the theory of the steering of the possible planning and arrangement of human labor. Cybernetics transforms language into an exchange of news. The arts become regulated-regulating instruments of information.10

If cybernetics is capable of turning language into an exchange of news, as it seems to be doing in our time, we must also register the fact that language and writing enabled the invention of the cybernetic idea in the first place, as Printed English well attests. It seems that the drive toward universal communicability (visual, verbal, and tactile) will continue to raise fundamental issues to challenge an intellectual endeavour such as Writing Technologies. I am hopeful that this new journal will enlighten us on many aspects of the ethical, political, and psychic life of technology and push us toward a better understanding of the prosthetic coexistence of humans and other lives on this very fragile planet.

1. Jacques Derrida, Of Grammatology, trans. Gayatri Chakravorty Spivak (Baltimore: Johns Hopkins University Press, 1976), p. 9.

2. Friedrich Nietzsche, Human, All Too Human: A Book for Free Spirits, trans. R.J. Hollingdale (Cambridge: Cambridge University Press, 1986), p. 378.

3. Lydia H. Liu, 'iSpace: Printed English after Joyce, Shannon, and Derrida', Critical Inquiry 32 (Spring 2006): 516-550.

4. See Walter Ong, Orality and Literacy (New York: Routledge, 1982), pp. 133-34.

5. 'Text to speech' conversion denotes a branch of artificial intelligence that deals with the computational problem of converting written text into a linguistic representation. This is one of the areas where the relationship between writing and speech can be fruitfully investigated for both engineering and theoretical purposes. See Richard Sproat, A Computational Theory of Writing Systems (Cambridge: Cambridge University Press, 2000).

6. Friedrich A. Kittler, Gramophone, Film, Typewriter, trans. Geoffrey Winthrop-Young & Michael Wutz (Stanford: Stanford University Press, 1999), p. 16.

7. A report not very long ago in the New York Times suggests that the world outside China is still very much in the dark about Chinese writing. See Emily Eakin, 'Writing as a Block for Asians', New York Times, May 3, 2003.

8. For a discussion of the strained translation of the zi by the concept of the 'word' and the troubled beginnings of modern Chinese grammar, see 'The Sovereign Subject of Grammar' in my book The Clash of Empires: The Invention of China in Modern World Making (Cambridge, Mass.: Harvard University Press, 2004).

9. I have in mind, for example, the pioneering work of Katherine Hayles, Mark Taylor, W.J.T. Mitchell, Paul N. Edwards, Mark Hansen, and Lily E. Kay.

10. Martin Heidegger, 'The End of Philosophy and the Task of Thinking', in Heidegger, On Time and Being, trans. Joan Stambaugh (New York: Harper & Row, 1972; Chicago: University of Chicago Press, 2002), pp. 55-73.