=e-Science and the critical edition: a discussion paper=

Stuart Dunn and Tobias Blanke Arts and Humanities e-Science Support Centre, King's College London
At the end of the Nineties, a national e-Science Core Programme was established in the UK. Its agenda was driven by scientists who needed new technologies and concepts to cope with the ever increasing amount of data, both from experiments and simulations as well as knowledge gathering exercises. Faced with this 'data deluge', a new data-driven science was conceptualized with the scientist and research methods at the center of new data technologies. The idea of e-Science and the e-Scientist was accompanied by the development of new high-speed computing networks that promised solutions to a variety of problems in coping with the vast amount of information. 'Grid technologies' were the result of a global effort from computer scientists working together witch practitioners to advance existing network technologies like the internet in order to create a global space of sharing resources and services.

Several e-Science initiatives in the UK aim to advance research work in virtual spaces through advanced computing, in particular network technologies. Technologies and methodologies for the automation and support of research processes are being investigated. Grid technologies and methodologies address how globally distributed data resources can be used in the research process and how computational power can be shared. At the same time, new forms of scholarly communication in 'virtual organizations' are being developed. For example, the Access Grid promises tools to support structured meetings of researchers in group-to-group collaborations, a benefit which will be keenly felt by A&H researchers as they move towards larger and more formal collaborations. The advantages of direct communication in face-to-face meetings are combined with the ability to share digital items instantly among the groups. Grid technologies thus integrate two recent, inseparable developments in research: the new possibilities opened up by improved technologies complement new, highly collaborative ways of working.

E-Science therefore stands for the development and deployment of a networked infrastructure, and a culture, through which resources can be shared in a secure environment. These resources can be anything from processing power to data to expertise. This networked infrastructure fosters a culture of collaboration in which new forms of cooperation can emerge and new, advanced methodologies can be explored.

A key to the success of e-Science is the provision of shared access to research facilities, and thus a response to the increasing globalisation of research. Researchers from around the world can work together and use each other's resources as if they were co-located. Digital knowledge objects are created and (re-)used in virtual collaboration spaces. E-research is about joining things up, not purely about CPU power or computer networking: it is about proactive relationships between server and server, programme and programme, and research practitioner and research practitioner. This global collaboration in a virtual space will be of key significance to what Arts and Humanities (A&H) researchers will be doing over the next ten years, and will fundamentally alter their relationship with the resources they use.

Critical editions provide a key example of such resources. A recent expert seminar convened at the University of Sheffield by the AHDS e-Science Scoping Survey (http://ahds.ac.uk/e-science/e-science-scoping-study.htm) debated the application of e-Science methods and technologies to the critical edition. It was considered that the concepts of the Virtual Research Environment (http://www.ahessc.ac.uk/briefing_papers/VRE_briefing_paper.html) and the Virtual Organization have the potential to enable a paradigm shift from the 'traditional' model of the critical edition, whereby the text is produced by an individual researcher or a small group of scholars and presented to a wider community as a static document, to an alternative whereby texts are produced and owned collaboratively by that community. In the latter case the text is produced as part of an iterative and ongoing process, under the collective influence of a group of researchers. The same principle could apply to elements of the 'digital infrastructure' on which much collaborative work relies: thesauri, dictionaries, lexica and so on. This raises complex issues of academic integrity and trust: the high-profile debate over the applicability of Wikipedia in research contexts is well known, and few would argue that a totally unfettered editorial process is appropriate. However, such methodologies have profound implications for the way humanities research is done, and the challenge is to quantify and qualify the shades of grey between Wikipedia and the traditional critical edition model.

Some key questions are:

 * What technologies are needed to enable the collaborative research environments required for such 'democratization' of the critical edition?
 * Do users need such editions? Will they ever trust them?
 * How should access to the editorial process be managed? Who decides who gets to edit the text? Should it be managed at all?
 * How should version control be maintained?
 * How should annotations and edits be captured, both in terms of the finished article and the workflow process?
 * What kind of peer-review process needs to be in place?
 * How should cataloguing, referencing and citation of such documents be approached?
 * How can such texts fit into existing library and information (infra)structures? Will these need to be rethought?