Report by Anna Cappellotto, Università degli Studi di Verona
Prof. Elena Pierazzo opened her plenary lecture on Digital Textual Editing by admitting that it is not yet clear to her whether digital textual editing is a new medium or a new discipline. In her attempt to explain to what extent the digital humanities change the way we edit and read texts, Pierazzo took into consideration the double nature of a text, which is at the same time something immaterial (“where is the 9th Symphony?”, she asked) and material (embodied in physical objects). While the immaterial dimension of a text can easily be transmitted across media and time – which is why editing has mostly concentrated on it – in the last 100 years the “new philology” has been much more document-oriented, that is, focused on the material dimension of a text.
Generally speaking, editing in a digital age is also strictly connected to the long-standing issue of reading in a digital age: digital editions, for instance, do not seem to be suitable for long-form reading; besides, it is debated whether they are really meant to be read or whether they are tools to be explored (see T.S. Eliot’s Waste Land app or the Shakespeare’s Sonnets app to form an opinion). Moreover, despite the growing amount of digital reading, according to recent research readers still want printed books (see Porter’s 2013 survey of medievalists). The audience that digital scholarly editions address is another crucial point of Pierazzo’s presentation: these editions offer the possibility of widening the circulation of a book and making it accessible, potentially, to everyone.
However, the above-mentioned considerations entail not only a rethinking of reading in a digital age, but also a rethinking of what editing in a digital age should be. According to Pierazzo there are two main contrasting approaches to the process of digital editing: some people see the editor as an encoder; others see the computer as a “magic box”, a mechanical research assistant. If digital editors share the belief that encoding is interpretation – something which makes our understanding explicit and, ultimately, a way of representing research – Pierazzo warned that such an editor needs a great amount of digital knowledge: XML, TEI, XSLT, HTML, CSS, web design, databases… (beyond what every editor, digital or not, has to become acquainted with in order to edit a text: textual scholarship, codicology, paleography, historical linguistics, history, literature). Nevertheless, an “editor as encoder” approach certainly brings several advantages: digital editions can keep annotations and the documentation of the editorial work where they belong; they make it possible to run statistical queries; and they offer maximum flexibility in the output.
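The claim that encoding is interpretation can be illustrated with a minimal TEI fragment (the manuscript reading here is invented for the sake of the example): even expanding a single abbreviation forces the editor to record an explicit editorial judgement in the markup.

```xml
<!-- Hypothetical line of transcription: the editor records both the
     abbreviated form found in the source and her own expansion of it,
     marking the editorially supplied letters with <ex>. -->
<l>
  <choice>
    <abbr>Dñs</abbr>
    <expan>D<ex>omi</ex>n<ex>u</ex>s</expan>
  </choice> tecum</l>
```

Once such judgements are made explicit in the encoding, they can also be counted and queried, which is where the statistical advantages mentioned above come from.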
Pierazzo presented a wide range of digital editions that produce different outputs. One of the most “extreme” outcomes of digital editing is the so-called social edition, based on crowdsourcing (Pierazzo seemed quite skeptical about it, pointing out that a very specific know-how is needed in order to edit a text). The success of these kinds of editions relies upon two principles: the wisdom of the crowd and the public engagement which contributes to shaping the edition.
Another interesting point, connected to the double nature of a text, is that editing is about texts, but sometimes the process of making the document is at least as important as the text itself (as in the case of the documentary edition of Jane Austen’s manuscripts: document-based encoding, which can lead to an ultra-diplomatic edition, is a kind of support for messy manuscripts and helps with genetic editing). Furthermore, sometimes, as in the case of Beowulf, our understanding of the text depends strictly on its material support. Pierazzo also illustrated the example of Around a sequence and some notes of Notebook 46, which deals with the encoding issues raised by Proust’s drafts: this edition allows readers to see the text/document changing in time and space, and it also includes a sort of gamification.
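The contrast between text-based and document-based encoding can be sketched in TEI terms (a hypothetical fragment with invented readings): in the first, the logical text is the primary hierarchy and the physical layout survives only as empty milestone elements; in the second, the page surface and its topographical lines become the primary units, as in the Austen and Proust editions.

```xml
<!-- Text-based encoding: the sentence is the unit; the physical
     line break is reduced to an empty milestone element. -->
<p>A few lines written<lb/>and then crossed out</p>

<!-- Document-based encoding (TEI sourceDoc): the page surface and
     its lines are the units; a deletion is located on the page,
     supporting ultra-diplomatic and genetic editions. -->
<sourceDoc>
  <surface n="1r">
    <zone>
      <line>A few lines written</line>
      <line><del rend="strikethrough">and then crossed out</del></line>
    </zone>
  </surface>
</sourceDoc>
```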
Finally, the questions researchers should ask themselves when deciding to work on a digital edition instead of a printed one are the following: what can the computer do that a book cannot (a digital edition is not only the transposition of a printed edition onto the screen!)? Does working on a computer change the way we edit texts? To return to the very first problem – whether digital editing is a new medium, a new method or a new discipline – Pierazzo concluded that it is not yet clear to her, even though the impact of computing is hard to overestimate, because:
– It changes the division of labor
– It changes the way we think of our work
– It changes the way we think of ourselves
– It changes the reasons why we do our work
– It changes for whom we do it.
1) According to Pierazzo and Stokes, the TEI too prefers a text-oriented approach: «In the case of digital editions, this centrality of the text is encouraged by the structure and principles of the most prestigious standard for text encoding, the one produced and maintained by the Text Encoding Initiative. The approach of the TEI, in fact, forces scholars to consider the text first. The TEI certainly offers a very sophisticated way of describing manuscripts; however, when it comes to transcription, of the two main hierarchies (text and document) the TEI privileges the text, relegating topographical description to empty elements (, , ) or attributes (, ); it is no coincidence, after all, that it is called the Text Encoding Initiative. The TEI does not say that documents are not relevant, but rather that they are less relevant than texts; to use a metaphor from bibliography, texts are “substantial” while documents are “accidental”». E. Pierazzo and P.A. Stokes, ‘Putting the Text back into Context: A Codicological Approach to Manuscript Transcription’, in Kodikologie und Paläographie im digitalen Zeitalter 2 / Codicology and Palaeography in the Digital Age 2, edited by Franz Fischer, Christiane Fritze and Georg Vogeler, in collaboration with Bernhard Assmann, Malte Rehbein and Patrick Sahle, BoD, 2010, pp. 397-430 (p. 399).
3) See Transcribe Bentham: http://blogs.ucl.ac.uk/transcribe-bentham/about/; and “The Devonshire Manuscript / Farewell all my wellfare”: http://en.wikibooks.org/wiki/The_Devonshire_Manuscript.
4) E. Pierazzo and P.A. Stokes, ‘Putting the Text back into Context: A Codicological Approach to Manuscript Transcription’, cit., p. 400: «However, when we are trying to capture what the source document looks like, it is because we believe that this is at least as meaningful as the text it contains: we are documenting our source, not formatting our output, and so our encoding is descriptive, not procedural». See also E. Pierazzo, ‘A Rationale of Digital Documentary Editions’, Literary and Linguistic Computing 26/4 (2011), pp. 463-77.
5) E. Pierazzo and P.A. Stokes, ‘Putting the Text back into Context: A Codicological Approach to Manuscript Transcription’, cit., p. 401: «Compared to more traditional approaches to editing, genetic criticism privileges the analysis of the process, the stratified flow of authoring, as opposed to the “photograph” of the end result which is embodied by traditional diplomatic editions. This is one—but by no means the only—scholarly approach for which the study of the process is relevant, and any understanding of the process must surely begin with the document».