Report on “Codicology and Palaeography in the Digital Age”, 3–4 July 2009, Munich

(Translation: Catarina Seeger)

Over the last ten years, Europe’s handwritten heritage, too, has found its way onto the internet. The catalogues of the most important European manuscript libraries are now available for online research, and a growing number of projects are striving to make manuscripts comprehensively available on the web. A conference was dedicated to the implications of this new situation for the study of manuscripts, which is increasingly mediated by modern information technology. It was held at the Ludwig-Maximilians-Universität München in collaboration with the Institute for Documentology and Scholarly Editing (IDE), with financial support from the Gerda Henkel Foundation. The conference made clear that the digital catalogues on offer mainly serve codicological research, while manuscript images are used for palaeographic research and teaching, or as material for developing algorithms for automated handwriting recognition and scribal identification. In parallel with the conference, the IDE published an anthology whose authors had been specifically invited.[1] This report focuses on the contributions that did not originate from the anthology.

The conference was opened by Irmgard Fees (Munich), who stressed the motivating combination of the potential shown by new technologies and their integration into the evolution of research. In this way, traditional research questions are neither masked out, nor is the development of research constrained by insistence on established methods. She therefore asked to what extent palaeographic and codicological research are altered by the new circumstances. This central question would prove to be a guiding theme throughout the event. Wernfried Hofmeister, Andrea Hofmeister-Winter and Georg Thallinger (Graz) presented a multitude of computational methods for the identification of scribal hands, using the example of the Database for the Authentication of Medieval Writing Hands (DAmalS). They demonstrated detailed graphetic recording as well as pattern-recognition technologies originating in forensics. A central component was a dedicated visualization environment that makes the data and its assignments interpretable for the user. The contributors emphasized that providing material alone is not sufficient, since the presentation of information plays a central role in the subsequent analysis as well. The following discussion addressed the central question of how the amount of work required for a graphetic detail analysis should be assessed.

Timothy Stinson (Raleigh, North Carolina) reflected on the changes that electronic resources bring to manuscript description, using the manuscript digitization at the Parker Library of Corpus Christi College, Cambridge, and the Roman de la Rose project as examples. He pointed out that the digital representation of a manuscript can take different forms, so that the traditional concept of describing a manuscript in a single catalogue entry could be rendered obsolete by a “one-to-many” relation between the original manuscript, images, transcriptions, classic manuscript descriptions and domain-specific partial descriptions.

Peter Stokes (Cambridge) focused on the role of authority in palaeography. Referring to the methods of forensic script comparison, he showed how strongly the credibility of scholarly work can be impaired when expert judgements cannot be verified. He argued that the acceptance of palaeographic results could be markedly raised by computer-aided methods. This would require a system that makes both the underlying data of an analysis and its specific methods accessible for scholarly verification. The usefulness of such a system, which he presented as a prototype application, was not called into question in the following discussion, though the role of authority in palaeography and its external perception were evaluated differently.

Roland and Gabriel Tomasi (Merenville) presented their software, which is meant to solve the problems of automatic script recognition using modern segmentation methods and intelligent character interpretation. It features a training system, recognition of full words and character groups, and linguistic plausibility checks. The acquired data is useful for palaeographic analyses, especially for scribal identification. In the following debate, the general enthusiasm over the possibilities of automatic text recognition, for example of medieval manuscripts, was dampened by uncertainty over how the system would handle the multitude of late-medieval cursive scripts and how much effort the necessary training would require.

The first day of the conference closed with a panel discussion in which the day’s four speakers discussed the relation of traditional manuscript research to computer-aided research together with Eef Overgaauw (Berlin) and Marc Smith (Paris). The speakers had been selected by the IDE and APICES on the basis of their contributions to the anthology, as outstanding representatives of the spectrum of methods and research. The discussion pointed out in particular that computer-aided approaches currently still concentrate entirely on working through old questions: software for hand identification, the sorting of manuscript collections, and the localization and dating of scripts. At the same time it was noted that the new methods could draw attention to old problems in ways that would not otherwise be possible. It became clear that the debate between “connoisseurship” and “measurement” as instruments of palaeographic analysis has lost the explosiveness it still possessed in the 1970s. Instead, the discussion searched for the right relationship between technical methods and palaeography. The developmental history of scripts in particular is still not addressed by digital methods. There was general consensus that cooperation between palaeographers, codicologists and computer scientists is desirable and productive, without codicology or palaeography having to lose their scholarly core as a consequence, in terms of either content or method. Indeed, both groups have to collaborate closely to develop suitable algorithms.

On the second day of the conference, Daniele Fusi (Rome) presented the concepts behind neural networks for the identification of character forms as used in work with inscriptions. He suggested tackling the frequently small sample sizes and the greater complexity of manuscript features by means of graphical preprocessing and a theoretical framework that supplies the neural system with the properties to be evaluated. This could address, for example, the problems that arise from ligatures and the multitude of nuances.

Mark Aussems and Axel Brink (Edinburgh/Groningen) contrasted techniques for scribal identification originally developed for forensic purposes with methods that rely on measurements and thus mirror traditional palaeographic ideas. Using manuscripts of the works of Christine de Pizan (London, British Library, Harley 4431), they illustrated that methods which measure characteristics of a script that are invisible to the naked eye, and that cannot be derived from traditional, non-digital criteria, yield results just as accurate as those obtained without a computer.

Arianna Ciula (Strasbourg) reported on her collaboration with computer scientists who developed a framework for the palaeographic analysis of the Carolingian minuscule in manuscripts from Siena between the 9th and 13th centuries. The resulting software, the “Software for Paleographical Investigation” (SPI), allowed a new chronological classification of the corpus of script samples. The software is currently being converted into a Java version adapted to modern standards.

Daniel Deckers and Cristina Vertan (Hamburg) outlined the concepts of the TEUCHOS working environment, in which the handwritten transmission of Greek and Latin literature from antiquity is stored and made usable. The contribution introduced the data models and software solutions used for the texts and for the description of the manuscripts.

Marco Palma and Antonio Cartelli (Cassino) reported on their experiences with a palaeographic teaching system that transitioned from instructional to participative learning. The switch was initiated when the steadily growing amount of exercise material motivated the students less than the possibility of contributing collaboratively to the repository of exercise material. The discussion stressed that successful instructional e-learning offerings, concentrating on the acquisition of transcription skills, exist alongside this.


Shorter contributions presented virtual manuscript catalogues, were dedicated to projects that apply sophisticated technology in manuscript research, and introduced research approaches for the automatic classification of scripts. The virtual manuscript library of the Malatestiana, for example, is designed as an open catalogue that functions as a hub for research and can incorporate new contributions by researchers. The regional catalogue of the Veneto integrates the manuscript descriptions of an entire region. The collaborative project TILE (Text-Image Linking Environment) aims to provide a computer-aided working environment that eases the linking of images and image details with texts, automated wherever possible. The French GRAPHEM project aims at the computer-aided classification of script samples into script types. The digital edition of the Sinaitic Glagolitic Sacramentary, developed by the Institute of Slavic Studies in Vienna, uses modern photographic techniques to make scripts visible that are normally unreadable to the naked eye.


For the concluding discussion, Georg Vogeler (Munich) formulated five questions that had arisen in the course of the conference: If computers are used to solve old questions in palaeography and codicology, is the use of digital procedures only a political argument for keeping a traditional discipline attractive, rather than a revolutionary method? The fact that the call for papers yielded no proposals from the fields of quantitative codicology, art history or musicology raises two further questions: First, could the evaluation of databases for the purposes of quantitative codicology be an obsolete research approach, one that does not take the new conditions of digital manuscript representation into account? And is art-historical and musicological research on medieval manuscripts marked by methods that exclude the use of digital aids? Finally, to what extent can and should palaeographers and codicologists trust the results of software solutions that present themselves to them only as a black box?

In the heated debate on this topic, Aliza Cohen-Mushlin elaborated on the disappointments of art-historical research to date, in which computational methods have so far yielded useful results only in the field of iconography. Representatives of the field such as Nataša Golob emphasized once more that art-historical and musicological questions of style, and the empathetic methods applied to them, are inaccessible to computational means. Reflecting on the methodological position of IT in palaeographic and codicological research, Mark Aussems observed that with the help of computers new answers to old questions can be found. Cristina Vertan stressed that the computer is not limited to counting and measuring, but can also collect information and complex correlations and visualize them. Her experience with humanities computing, especially computational linguistics, showed that computers can resolve not only questions posed in binary form, but can also explicitly deliver “grey” results. Then again, many participants emphasized that the machine cannot recognize more in medieval and early modern manuscripts than the researcher instructs it to examine.

Precisely in order to orient IT towards the interests and problems of researchers, Torsten Schaßan advocated open-source principles that give researchers insight into the concepts employed. This, he argued, also holds for software solutions in the humanities, and is of great importance for using the programs reflectively and in a manner suited to the problem at hand. Among others, Arianna Ciula and Wernfried Hofmeister noted during the discussion that the use of computers forces a methodological reflection which not only critically improves old methods but also opens up entirely new methods from related disciplines. That the conference addressed a pressing topic in the scholarly community showed in the response: over 70 participants from all over Europe came together, and researchers from North America and Israel were also present. The need to gather at similar events became abundantly clear, demonstrating that there is still great scholarly potential and a high demand for exchange about codicology and palaeography in the digital age.


Dr. Georg Vogeler

Ludwig-Maximilians-Universität

Historisches Seminar, Historische Grundwissenschaften und Medienkunde

Geschwister-Scholl-Pl. 1

80539 München


http://www.hgw.geschichte.uni-muenchen.de/aktuelles/archiv/tagung_kod_pal/index.html



[1] Kodikologie und Paläographie im digitalen Zeitalter / Codicology and Palaeography in the Digital Age. Edited by Malte Rehbein, Patrick Sahle and Torsten Schaßan, with the collaboration of Bernhard Assmann, Franz Fischer and Christiane Fritze. Norderstedt, 2009 (ISBN 978-3-8370-9842-6).
