The development of TEI mark-up has standardised the encoding practices surrounding the creation of machine-readable texts, with the advantage of rendering those texts in a form that is more easily indexed and searched. Although this non-proprietary, agreed code of practice facilitates more dynamic interaction than would otherwise be possible, there are critics who argue that the hierarchical arrangement of elements in a text is antithetical to the practice of the humanities: that, through overt standardisation, it forecloses openness to differing interpretations and understandings of a text.
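For readers unfamiliar with the format, the 'hierarchical arrangement of elements' at issue is the strict nesting of an XML document. A minimal, hypothetical TEI-style fragment (the element names and namespace are standard TEI P5, though the sample content is invented, and the required `teiHeader` is omitted for brevity):

```xml
<!-- Sketch of TEI's nested ("ordered hierarchy") structure: every
     element sits wholly inside its parent, so a text is modelled as
     one fixed tree. Overlapping structures (e.g. a sentence that
     crosses a page break) do not fit this model without workarounds. -->
<TEI xmlns="http://www.tei-c.org/ns/1.0">
  <text>
    <body>
      <div type="chapter" n="1">
        <p>Each paragraph nests inside a division, each division
           inside the body: one tree, one hierarchy, per text.</p>
      </div>
    </body>
  </text>
</TEI>
```

It is this single-tree model, more than any individual tag, that critics such as McGann identify as a constraint on interpretation.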
Jerome McGann is probably the best-known of these critics of TEI encoding, owing to his association with early digital humanities projects such as the Rossetti Archive and to his writings on the theoretical implications of using an encoding language for literary data. The hierarchical nature of the language is a particular bugbear for McGann. As he writes in Radiant Textuality: Literature after the World Wide Web (2001), “its [XML’s] hierarchical principles and other design characteristics set permanent and unacceptable limits on its usefulness with arts and humanities materials.” This reluctance to engage in an exercise that essentially amounts to the standardisation of literary texts can be explained by a brief and simplified survey of the contemporary critical environment within the humanities, primarily as regards literary criticism.
The trend towards systematised critical approaches such as structuralism and formalism in the early twentieth century has been largely reversed by the development of post-structuralism, deconstruction, post-colonialism, feminism and related schools, which established themselves in response to, and in conjunction with, these more schematic approaches in order to reveal what their practitioners had neglected or ignored. This wave of critics in the sixties and seventies was influential in emphasising the need to resist over-arching grand narratives and neat answers to textual questions. Humanities practitioners now work in an environment where the theories of Gayatri Chakravorty Spivak, Michel Foucault and Jacques Derrida have entered the mainstream and are no longer as revolutionary or ‘against the grain’ as they once were.
What we have inherited from the theorists cited above is a post-Derridean notion of what a text is: infinite, endlessly referential and fundamentally indeterminate, embroiled as it is in an endless play of deferred meaning. This is one of the cornerstones of the contemporary humanities landscape, and one of the first things I learnt as an undergraduate. One can see, then, why a vocal cohort of critics speaks out against the application of a rigid TEI code that leaves no room for ambiguity in approaching a text. However, it is the contention of this blog post that many of these critics, who inherit their sense of textuality from deconstructionists such as Derrida, are at least partially unaware of what exactly Derrida was positing when he developed his critical approach. This post will argue, as far as possible, that TEI is not as antithetical to deconstruction as these critics maintain.
Firstly, this post will provide a (very) brief summary of Derrida’s ideas as expressed in the first part of Of Grammatology (1967), ‘Writing Before the Letter.’ If Derrida can be said to have one chief argument in this section, it is the attack he mounts on the implicitly held beliefs of Western philosophy, which upholds the existence of a particular kind of ‘presence’ embodied in the spoken word. This has the consequence of downgrading the value of writing, which is, in the discourse of Western philosophy, a mere representation or image of full speech. This distrust of an alleged representation and preference for an ideal is a tenet of Western philosophy that Derrida identifies as existing as far back as Plato’s Phaedrus (c. 370 B.C.E.). Derrida dismisses this notion of speech as ‘theological,’ a kind of scholastic purism that has no place in a post-Enlightenment philosophical framework. This binary opposition and privileging of speech is difficult for contemporary thinkers to elide, however: if it can be located in the originary Western philosophical texts, the trace of the idea remains in the philosophical schemata devised since. It is therefore necessary for Derrida to make use of the discourse of Western philosophy as it exists now, for better or worse, without perpetuating its binary oppositions and thereby implicitly privileging one term over the other. An example of how to go about this is Martin Heidegger’s practice of writing ‘Being’ with a line through it, drawing attention to the fact that he is discussing Being as the reader would traditionally understand it, while simultaneously distancing the word from inherited notions of Being. This is one of the methodologies behind Derrida’s ‘science’ of grammatology.
This raises the question of what deconstruction means for an encoding scheme such as TEI (which, it should be noted, is a markup language rather than a programming language). Derrida, in deconstructing a text, is attempting to rectify this relegation of writing to the margins of philosophy. He posits, as a corrective, that we emphasise the endless ‘play’ of meaning and recognise the self-contradictory nature of all utterance. This would be problematic for TEI for a number of reasons: the marking up of tortured ambiguity is not a straightforward task for any encoder, and the questions posed when marking up a text are generally either/or in nature. Can this alleged inflexibility of TEI be used meaningfully to approach a text of any kind? The thoughts of Derrida’s translator Spivak on the act of translation and interpretation will be productive in this context:
“Any act of reading is besieged and delivered by the precariousness of intertextuality. And translation is, after all, one version of intertextuality. If there are no unique words, if, as soon as a privileged concept-word emerges, it must be given over to the chain of substitutions and to the ‘common’ language, why should that act of substitution that is translation be suspect? If the proper name or sovereign status of the author is as much a barrier as a right of way, why should the translator’s position be secondary?”
If we understand the act of marking up a text as a kind of translation, involving many of the same skills, procedures and means of interpretation, then according to Spivak there is no reason why the resulting product should be devalued. If deconstructionists oppose distinctions between low and high, original and copy, why is TEI so open to criticism? In my own (admittedly limited) reading, neither Derrida nor the critics who take up his understanding of text, as devoid of inherent meaning and more a trace effect in the mind of the reader, has yet produced a convincing reason why the most basic units of text should be exempt from classification. Simply put, a word, a sentence or a paragraph is not always simply an effect of the reader’s interpretation; each can quite easily be identified as a textual feature in itself.
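It is also worth noting that TEI is not quite as either/or as its critics suggest: the guidelines themselves provide mechanisms for recording divergent readings and hedged editorial judgement. A minimal sketch (the elements `choice`, `sic`, `corr`, `app` and `rdg`, and the `cert` attribute, are standard TEI P5; the sample text and witness identifiers are invented):

```xml
<!-- Hypothetical fragment: encoding ambiguity rather than resolving it.
     <choice> pairs a transcribed reading (<sic>) with an editorial
     correction (<corr>); cert="medium" records the editor's hedged
     confidence rather than a flat assertion. -->
<p>
  The manuscript reads
  <choice>
    <sic>tenant</sic>
    <corr cert="medium">tenet</corr>
  </choice>
  at this point, and where witnesses disagree a critical apparatus
  entry can hold both readings side by side without ranking them:
  <app>
    <rdg wit="#MS-A">presence</rdg>
    <rdg wit="#MS-B">present</rdg>
  </app>
</p>
```

Encodings like these do not abolish the hierarchy, but they do show that interpretation and uncertainty can be made explicit within it rather than suppressed by it.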
Therefore, it is hoped that this blog post has demonstrated that one can be a committed Derridean and still carry out the work of the TEI-C with a clear conscience. As Derrida himself writes, intellection about the instability of the signifier must be consciously forgotten occasionally in order for things to move forward. Derrida never advocates the complete destruction of the economy of signs as it exists today, regardless of some of the more violent implications of the word deconstruction.
“Up to a certain point, such repression is even necessary to the progress of positive investigation. Beside the fact that it would still be held within a philosophising logic, the ontophenomenological question of essence…could, by itself, only paralyze or sterilize the typological and historical research of facts.”
The question of where that certain point is located is, as always, up to the editor.
Rather than continuing to rail against the targets Derrida railed against, we should recognise a changed critical landscape which has, over time, developed its own orthodoxies and blasphemies. It is these that the critic and editor should be trying to overturn and re-conceptualise. This post concludes with the argument that it was not Derrida’s intention that we incorporate his ideas into our critical approaches only to arrive at a point of stasis. Instead, we should continue to apply and develop the kind of critical rigour we find in his writings and re-invigorate deconstruction as an ongoing process rather than a dead-end. This is not to argue for a return to value judgements or the worst excesses of ‘rational’ humanist thought that Derrida, Foucault and Jacques Lacan read against the grain. It is instead an invocation to recognise that deconstruction does not end: it is an ongoing process of undoing hierarchies in a productive manner.
McGann, Jerome, Radiant Textuality: Literature after the World Wide Web (Palgrave: 2001), p. 17
Derrida, Jacques, Of Grammatology (The Johns Hopkins University Press: 1997), p. lxxxvi
Ibid., p. 28
Derrida, Jacques, trans. Gayatri Chakravorty Spivak, Of Grammatology (The Johns Hopkins University Press: 1997)
McGann, Jerome, Radiant Textuality: Literature after the World Wide Web (Palgrave: 2001)
Schreibman, Susan, Siemens, Ray and Unsworth, John, A Companion to Digital Humanities (Blackwell: 2004)
Schreibman, Susan and Siemens, Ray, A Companion to Digital Literary Studies (Blackwell: 2008)
Text Encoding Initiative Consortium Website, http://www.tei-c.org/index.xm