Text Theory, Digital Document, and the Practice of Digital Editions
July 18, 2013, 13:30 | Panel, CBA 143
Subject
Many digital tools have been and are being developed to support the transcription, annotation, and publication of editions of literary or historical texts, often making use of crowdsourcing for collaborative research. This panel discusses how well the digital scholarly editions produced by such tools reflect theoretical notions of the digital scholarly edition, and how this may be assessed on the basis of both empirical examination of current practices and text theory in the digital era.
The practice of preparing and producing digital editions is increasingly supported by purpose-made, specialized digital tools, many of them involving crowdsourcing. Although an exhaustive survey and typology of these tools is still missing, by and large most of them are highly similar in functionality, text model, and editorial process. As such they express a fairly straightforward transformation from the physical book to a digital metaphor of the book, roughly along a trajectory of transcription, annotation, and publication. Quite to the contrary, text theory in the digital era seems to express a different scholarly ideal of the representation of text, one rooted far more in notions of instability (McGann 2001), the fluid text (Bryant 2002), transclusion (Nelson 1982), text as process (Buzzetti 2002, Gabler 1987), transmedialization (Sahle 2010), and distributed editions (Zundert 2011), for instance. Moreover, the usage patterns of emergent digital technologies and their applications, such as Web 2.0/3.0, crowdsourcing, cloud-based services, open notebook science, data as a service, and multi-device layouts (to name but a few), seem to favor shaping the representation of digital text more in line with theory than with the practices of current scholarly digital edition tools.
If we follow Internet pioneer Cailliau (Cailliau 2012), information at our fingertips will become essentially undocumented, in the sense of not being a conventional cover-to-cover document, not even as a metaphor. Rather, specific parts and facets of information will adapt to different devices and contexts, reminiscent of Nelson's envisioned docuverse (McKnight 1991). It should therefore be critically examined whether it is still opportune and adequate to speak of a digital edition as a document, and whether a digital editing process should necessarily lead to a single or physical publication in order to serve maximum scholarly expressiveness. Overall, the tools in use mostly let the metaphor of the book be inferred as the de facto model for digital publication. But as users and scholars choose and adapt new technologies and new forms of engagement with information, should scholarly publishing and digital editing follow these patterns of usage? Does an audience-oriented approach, such as the ideas on minimal and maximal editions expressed by Vanhoutte (Vanhoutte 2011), strike a middle ground between theory and practice? What is the intellectual loss or gain if the metaphor of the book prevails over using the medium of the Internet as an expression of text as process?
It is these kinds of questions about the theoretical underpinnings of the digital scholarly edition that arise at the intersection of shaping technologies, standing scholarly practice, and changing usage. In an effort to establish quality practices, a number of comparative studies of transcription and crowdsourcing tools for digital editions have been conducted, notably by members of this panel. Most of these studies have been based on analysis and comparison of functional requirements, usability aspects, and user feedback. However, a text-theory-based evaluation of such tools and editions is mostly lacking.
This panel will explore the issue of practice- and theory-based quality assessment of digital editions, building on the results of a comparative, text-theory-based empirical survey of tools for digital scholarly editions, the design of which is the subject of our preparatory paper presented at the NeDiMAH Expert Meeting on Digital Scholarly Editions held in conjunction with the 2012 conference of the European Society for Textual Scholarship. The panel focuses on several prominent digital tools and projects for preparing digital scholarly editions with varying approaches. From this broader view, specific themes and issues will be examined, such as:
- The metaphor of the book as enabler or inhibitor of new avenues for research.
- User surveys and feedback as shaping forces of tools for digital editions.
- The role of users, editors, researchers, and funders in determining quality aspects.
- The digital edition as an expression of text in flux versus the iconic object.
- Text models for distributed documents.
- Is the generic or the specific a hallmark of quality of tools for digital editions?
- Crowdsourcing and open notebook science as determining aspects of digital editions.
- Visualization of instability of text as a scholarly quality of the digital edition.
- The relationship between formalization of editorial process and the instability of text.
Organization of the panel
The methodological research program of the Huygens Institute for the History of the Netherlands, part of the Royal Netherlands Academy of Arts and Sciences (KNAW), is the initiator and organizer of this panel. Panel members include e-Humanities researchers from KNAW involved in the development of the transcription and crowdsourcing tool eLaborate (https://www.elaborate.huygens.knaw.nl/); researchers from University College London involved, among other projects, with the Transcribe Bentham project (http://www.ucl.ac.uk/transcribe-bentham/); researchers and editors of digital editions (e.g. http://www.i-d-e.de/); researchers working on open science approaches (e.g. http://editorsnotes.org/ and http://ecai.org/mellon2010/); and developers and researchers of crowdsourcing software (http://manuscripttranscription.blogspot.com). The panelists are engaged in the study and development of different digital humanities tools and projects pertaining to digital scholarly editions, specializing in transcription tools and crowdsourcing projects, which grants this panel a unique opportunity to comparatively explore various strategies for building and using digital editions, and to reflect on both the theoretical and practical concerns of that process. In addition, the panel will critically evaluate the themes and issues listed above.
The panel session will be organized in the following way:
- The panel chair will introduce the main topic, discussion questions, and the panelists; duration: 3 minutes;
- Each of the panelists will give a short presentation (6 minutes), followed by questions from the audience (4 minutes); duration: 60 minutes;
- The themes and questions raised in the presentations will be further discussed in an open forum between the panelists and the audience; duration: 25 minutes;
- The panel chair will briefly reflect on future plans, provide contact information, and close the panel; duration: 2 minutes.
The names and affiliations of confirmed panelists are as follows:
- Ben Brumfield, Independent Digital History Software Services, Austin Texas (US)
- Karina van Dalen-Oskam, Huygens ING (The Netherlands)
- Greta Franzini, UCL Centre for Digital Humanities (UK)
- Patrick Sahle, Institute for Documentology and Scholarly Editing (IDE) / Cologne Center for eHumanities (CCeH) (Germany)
- Ryan Shaw, University of North Carolina, Chapel Hill, School of Information and Library Science (US)
- Melissa Terras, UCL Centre for Digital Humanities (UK)
Moderators for this session are:
- Charles van den Heuvel (panel chair)
Huygens Institute for the History of the Netherlands (Royal Netherlands Academy of Arts and Sciences)
- Joris van Zundert
Huygens Institute for the History of the Netherlands (Royal Netherlands Academy of Arts and Sciences)