About Me

I am Professor of Digital Humanities at the University of Glasgow and Theme Leader Fellow for the 'Digital Transformations' strategic theme of the Arts and Humanities Research Council. I tweet as @ajprescott.

This blog is a riff on digital humanities. A riff is a repeated phrase in music, used by analogy to describe an improvisation or commentary. In the 16th century, the word 'riff' meant a rift; Speed describes riffs in the earth shooting out flames. The poet Jeffrey Robinson points out that riff perhaps derives from riffle, to make rough.

Maybe we need to explore these other meanings of riff in thinking about digital humanities, and seek out rough and broken ground in the digital terrain.

26 July 2015

Digital Humanities and the Future



This was a talk I gave at the University of Sussex on 20 November 2013. Parts of it are now out of date (for example, there is now a lot more to say about the REF as far as the intellectual direction of DH in the UK is concerned), but other sections are perhaps useful, so it may be worth sharing by means of this late blogging. The illustration shows the Banksy mural 'No Future Girl Balloon' which appeared on a house in Southampton in 2010 but was painted over shortly afterwards.

Talking about the future is always a rash endeavour. Charles Henry has described how in 1876 an article in the journal Nature envisaged the value of the telephone chiefly as a new form of home entertainment. It was anticipated that Alexander Graham Bell’s invention would ‘at a distance, repeat on one or more pianos the air played by a similar instrument at the point of departure. There is a possibility here...of a curious use of electricity. When we are going to have a dancing party, there will be no need to provide a musician. By paying a subscription to an enterprising individual who will, no doubt, come forward to work this vein, we can have from him a waltz, a quadrille, or a gallop, just as we desire. Simply turn a bell handle, as we do the cock of a water or gas pipe and we shall be supplied with what we want. Perhaps our children may find the thing simple enough’. While this is interesting as an anticipation of streamed music, as a discussion of the future of the telephone it was wide of the mark.

Dreams of the future frequently drive the way technology develops. H.G. Wells’s dream of a ‘World Brain’, described by him in a lecture in 1936, reflected his own intellectual preoccupation with synthesis and the search for grand narratives rather than any technical possibilities. Yet Wells’s interest in whether microfilm could be used to develop such a world brain inspired subsequent researchers to experiment with new technologies as they appeared, and influenced Arthur C. Clarke when he proposed in 1962 a world library powered by supercomputers. At the recent Digital Economy conference at Media City in Salford, a BBC speaker showed a video describing a vision of future communications technology enunciated by Captain Peter Eckersley, the first Chief Engineer of the BBC, in 1926. Eckersley’s vision of television and pervasive media eerily prefigured the kind of technologies which are only just now, nearly a century later, appearing in a domestic context. When this video was shown, a member of the audience remarked that in a way the video was a condemnation of the BBC, since it suggested that it had not developed its engineering vision since 1926, and had for nearly a hundred years been relentlessly pursuing the realization of the dreams of its first chief engineer. Regardless of how we view this criticism, the examples of Eckersley’s 1926 vision and of Wells’s dream of a world brain illustrate forcefully how the most important driver in technological development can be the human imagination and dreams of a future state.

For the digital humanities, part of its promise is always the claim that it is on the side of the future. The digital native will effortlessly succeed the clumsy digital immigrant, and so technology will pervade all aspects of humanities research. This assumption of the inevitable triumph of digital technology underpins some of the most strident claims made on behalf of digital humanities in recent years. Digital humanities has been claimed as ‘the next big thing’ on the intellectual landscape, the successor to the critical theory which has dominated since the 1950s. In 2009, William Pannapacker wrote, after the MLA Convention, that ‘Among all the contending subfields, the digital humanities seem like the first "next big thing" in a long time, because the implications of digital technology affect every field’. Pannapacker continued: ‘I think we are now realizing that resistance is futile. One convention attendee complained that this MLA seems more like a conference on technology than one on literature’. These assumptions of the inevitable triumph of the digital humanities have fed into a visionary discourse of DH which, stressing its interdisciplinary and collaborative aspirations, sees it as a means of renewing and transforming the academic practice of the arts and humanities. Mark Sample has famously commented that ‘The digital humanities should not be about the digital at all. It’s all about innovation and disruption. The digital humanities is really an insurgent humanities’. Likewise the Digital Humanities Manifesto declared that: ‘the Digital Humanities revolution promotes a fundamental reshaping of the research and teaching landscape’.

This visionary discourse around DH has been immaculately documented and analysed by Patrik Svensson. The way in which the rhetoric of DH frequently becomes suffused with the ‘technological sublime’ has also been emphasized by Paul Gooding, Melissa Terras and Claire Warwick in a recent article. As Patrik Svensson stresses, much of this rhetoric is not so much a comment on the possibilities of digital technologies but rather uses the idea of a digital humanities as a springboard for a debate about the nature of the humanities. Digital humanities has become for some scholars a field in which we can reimagine the humanities, perhaps without reference to the digital at all. Yet there still remains a strong techno-optimistic thread within the digital humanities and an assumption that its time will inevitably come. Patrik Svensson points out how these assumptions echo the theme of the ‘proximate future’ discussed by Paul Dourish and Genevieve Bell in their remarkable book, Divining a Digital Future: Mess and Mythology in Ubiquitous Computing. Dourish and Bell emphasise how governments, corporations and institutions portray the future as a technological utopia which is always just around the corner, and never here.

The commercial and political benefits of this constant claim that we are on the verge of a technological utopia are obvious. A good example of the power of the idea of the proximate future is Singapore, where the government seeks to create ‘a global city, universally recognized as an enviable synthesis of technology, infrastructure, enterprise and manpower [with a] new freedom to connect, innovate, personalize and create’. Dourish and Bell emphasise the disconnect between this digital freedom and restrictions on human rights in Singapore, and suggest that this promise of jam tomorrow helps bolster these restrictions. Rhetoric of the proximate future, in the view of Dourish and Bell, has obscured the fact that the future is already here; technological trends identified and developed in units like the Xerox Palo Alto Research Centre twenty or thirty years ago have moved into everyday life and have effected profound transformations in every aspect of our existence. No doubt changes will continue and we will still see many remarkable innovations, but the digital future arrived some time ago, and it would be better for us to examine more closely, and make better use of, what is already around us. In talking about digital transformations, we are talking about a process which is current and all around us, not about the future.

I am a child of Harold Wilson’s white heat of technological revolution. I must admit that listening for fifty years to speeches advising me that technology is about to unleash a revolution unprecedented in human history is a little wearing and jangling on the nerves. In expectation of the coming technological revolution, I was taught in the 1960s a new type of mathematics which required me to learn to use a slide rule and to perform arithmetic with binary numbers. Although I am now a professor of digital humanities, and have had quite a bit to do with computing, I have never since had to perform calculations with binary numbers. However, the new mathematics somehow left me without a grasp of a number of fundamental mathematical concepts (although I scraped an O level pass), and this has left me feeling disadvantaged as we start to think about new quantitative techniques in various humanities subjects. I fear that the myth of the proximate future has damaged me. If we see the aim of digital humanities as simply being to promote the use of technology in studying arts and humanities subjects, then I suspect that the claim that we are constantly moving towards a new technological revolution has also been unhelpful. The way in which digital humanities is engaged with promulgating this myth of a proximate utopia is apparent from the way in which the subject constantly reinvents and renames itself: from humanities computing to digital humanities, and now e-science, e-research, web science, digital studies, digital culture.

At one level, in accordance with Alan Liu’s Laws of Cool, it is perhaps necessary and unavoidable for digital humanities to propagate the myth of the proximate future. At another, this vacuous myth-making may do digital humanities a disservice. A colleague in America recently forwarded to me a remark by a history undergraduate writing a long essay on ‘digital history’, who wrote that: ‘The digital humanities, of which digital history is a subset, is scary because there is no definition of what is meant by the term. Real historians fear its lack of cohesion’. I’m not sure that is necessarily an argument for a tight definition of DH, but it does suggest that the rhetoric might obscure the substance, and be off-putting to precisely the audiences we should be seeking to enthuse.

Are we overcomplicating DH? I fear so. Let’s return to our roots. In Britain, a key moment in the development of digital humanities took place on the banks of Loch Lomond in September 1996. A meeting was held at the Buchanan Arms Hotel entitled ‘Defining Humanities Computing’. Attending the meeting were representatives of three leading universities which had been involved in the Computers in Teaching Initiative established in Britain in the early 1990s. Many of the names are familiar still: from King’s, Harold Short, Willard McCarty and Marilyn Deegan; from Glasgow, Christian Kay, Jean Anderson and Ann Gow; from Oxford, Stuart Lee, Mike Popham and Mike Fraser. It’s perhaps the nearest thing to a digital humanities summit meeting that has ever taken place in Britain. Among the questions debated were:

  • How should we define Humanities Computing theoretically or pragmatically in terms of current practice? 
  • Where does humanities computing fit within institutions of higher education? How will computing integrate into standard humanities courses? 
  • What should research in humanities computing be about? 

These are questions that are still as pressing as they were twenty years ago, and I fear we still lack cogent answers. It is fair to say that the deliberations on the banks of Loch Lomond were even then heated. For some, computing was something which facilitated and supported academic research, and the role of humanities computing specialists was analogous to that of lab technicians. For others, particularly Willard McCarty, who has been the most persistent and forceful advocate of this view in Britain, it was a field of intellectual endeavour and investigation on a par with more widely recognized academic disciplines such as history, classics or media studies.

In the course of the discussions in Scotland, Willard drafted the following definition of the field as he saw it then:

‘HUMANITIES COMPUTING is an academic field concerned with the application of computing tools to humanities and arts data or their use in the creation of these data. It is methodological in nature and interdisciplinary in scope. It works at the intersection of computing with the other disciplines and focuses both on the pragmatic issues of how computing assists scholarship and teaching in these disciplines, and on the theoretical problems of shift in perspective brought about by computing. It seeks to define the common ground of techniques and approaches to data, and how scholarly processes may be understood and mechanised. It studies the sociology of knowledge as this is affected by computing as well as the fundamental cognitive problem of how we know what we know.'

'Within the institution, humanities computing is manifested in teaching, research, and service. The subject itself is taught, as well as its particular application to another discipline at the invitation of the home department. Practitioners of humanities computing conduct their own research as well as participate by invitation in the projects of others. They take as a basic responsibility collegial service, assisting colleagues in their work and collaborating with them in the training of students.'  

This is a beautifully crafted working definition, which would apply as much to the digital humanities today as to the humanities computing of 1996 (an updated version, supplied by Willard, is available here). The clarity of the definition, however, brings to the forefront a number of issues. The simplicity of the insistence that humanities computing is about using technology in humanities scholarship is important. But in 1996, there was still an air of reticence and passivity about this activity. Could computers model and mechanise what scholars did? The focus is on replicating existing scholarly practice in a digital environment. The idea that computers might create new types of scholarship is implicit here, but not actually stated. Likewise, it is assumed that intellectual disciplines are equated to the administrative structures of universities. Disciplines equal departments, it is suggested, and humanities computing only intervenes (in a collegial fashion) at the request of the home department.

Most of those attending the Loch Lomond event were not members of the academic staff of their respective universities. Most worked in information services or in libraries, in what were in those days in Britain called ‘academic related’ posts. Intellectually and in terms of their academic expertise, these pioneers of humanities computing were without doubt the equals of those in full academic posts. Part of the reason for the meeting at Loch Lomond was to try and create a co-ordinated approach to the anomalous position created by the fact that many of those who were pioneering the use of humanities computing were not themselves academics. Curiously, as far as the UK is concerned, the position of scholars and researchers who do not hold formal academic posts has got worse rather than better. The category of ‘academic-related’ post has been abolished, and Britain has misguidedly emulated North America in insisting on a distinction between academics and professional services staff, who often have significantly poorer career conditions than academic staff. Too often in this process, digital humanities work has been regarded as more appropriate to the professional services. We may trace this diminution in the status of digital humanities practitioners to that very reticence which states that we model the practices and requirements of academics. We shouldn’t. We should be challenging the way in which academic research is conducted, and disrupting cosy disciplinary assumptions. Instead of documenting and modelling what historians have done for generations, we need to show how it could be done differently.

In essence, we use computers at present to undertake humanities research more quickly, conveniently and cheaply. This reflects the way in which all those engaged in developing the infrastructure underpinning humanities research have sought to replicate existing scholarly practice in a more mechanized environment. Very few scholars have tried to break out of these existing models – one such is with us here this evening, Tim Hitchcock. But one Old Bailey exemplar cannot a revolution make. The way in which our digital landscape replicates the older print scholarship reflects the lack of confidence among practitioners of digital humanities in challenging older structures of scholarship and their unwillingness to build really new structures. It is striking how digital projects are often bound by the very old-fashioned structure of the edition. While I was working at King’s College London, much of the Department of Digital Humanities research was about building for individual scholars digital editions of canonical materials (rarely something unfamiliar) ranging from Ben Jonson and Jane Austen to calendars of historical documents. Even in the major prosopographical datasets produced at King’s – some of the most intriguing and potentially transformational work undertaken within the digital humanities – the data is safely locked away behind a web interface which leaves it almost as intractable as if it were printed.

It is difficult to escape the impression that digital methods have hitherto chiefly been used as a means of trying to restore dying and endangered forms of editorial scholarship. A good illustration of this is the calendar. This was from the nineteenth century a major means of publishing archival records for historians. Printed volumes contained short and thoroughly indexed summaries of historical record series. The vast size of the record series justified the production of summaries – even in the abridged form the printed volumes represented a huge series. For many areas of historical research, the calendar was the essential tool and the first step in primary research. But they were enormously expensive to produce and printing costs became increasingly prohibitive. In a desperate attempt to keep the small trickle of calendars flowing, Roy Hunnisett of the Public Record Office produced in 1977 a guide to record publication which gave rules for the preparation of calendars. This is fascinating as a document of late print culture. Hunnisett’s rules are dominated by the need to reduce printing costs and at almost every point are shaped by what proved to be a doomed method of publishing records.

As a historian whose research has been facilitated by series such as the Calendar of Patent Rolls or the Calendar of Close Rolls, I applaud enthusiastically the digital revival of this movement for giving access to archival records. But the historians who have led these projects have generally found it difficult to re-imagine how a calendar might operate in a digital environment. What we have is what Her Majesty’s Stationery Office were doing in 1910, with the additional facility of some images of the records. This problem is exemplified by the way in which Hunnisett’s rules, formulated for print, are still used as the editorial basis of the online calendars, although many of the compromises Hunnisett was forced to make were intended solely to reduce printing costs, and thus do not apply in a digital environment. So, how might we imagine a calendar in an online environment? The concept of a calendar assumes that summaries are the only way to explore the vast quantities of information in archival record series. If we accept that assumption of extracting and abridging historical records as a reasonable way of proceeding, then we could think about different strategies and structures for summarizing these records. We could start to produce a variety of summary tables of information in particular records which could then be displayed and linked in different configurations.

Instead of the standard and restricted chronological structure of the calendar, we could establish open data repositories containing tables summarizing different aspects of the records, linked to images to facilitate verification. I have for many years worked on the records of the Peasants’ Revolt of 1381 and it was an interest in editing these that first really drew me in to digital humanities work. I have recently started to experiment with preparing and sharing data relating to the revolt in this way and I think it has some exciting possibilities. The concept of nanopublications – scholarly statements reduced to their smallest possible component and expressed as RDF triples – might be relevant here, with archival resources being represented by vast linked groups of nanopublications. But this poses many challenges – I would regard my work on the Peasants’ Revolt as my most important scholarly work. I think I have now reached the stage where I would be happy for it to become a large number of digital tables which I share with whoever is interested – losing in the process a lot of the traditional sense of authorship, ownership and acknowledgement – but it’s taken me a long time to reach that stage, and for many younger scholars this poses profound challenges in terms of careers and academic profile.
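To make the idea a little more concrete, the triple-based approach can be sketched in a few lines of ordinary Python. Everything in this sketch – the record references, the predicates, the person and place names – is invented for illustration; a real project would use an RDF library, stable URIs and actual archival references. The point is simply that once each statement about a record is reduced to a subject–predicate–object triple, the same data can be re-assembled by person, place or date, rather than being frozen in the single chronological sequence of a printed calendar.

```python
from collections import defaultdict

# Each scholarly statement reduced to a (subject, predicate, object)
# triple, in the spirit of a nanopublication. All identifiers below
# are hypothetical, made up for this illustration.
triples = [
    ("record/KB27-482-m3", "mentions_person", "person/john-smyth"),
    ("record/KB27-482-m3", "mentions_place", "place/bury-st-edmunds"),
    ("record/KB27-482-m3", "dated", "1381-06-14"),
    ("record/KB27-482-m3", "image", "img/KB27-482-m3.jpg"),
    ("record/KB27-490-m7", "mentions_person", "person/john-smyth"),
    ("record/KB27-490-m7", "dated", "1381-07-02"),
]

# Index the triples by predicate, so the same data can be displayed
# in different configurations (by person, by place, by date ...).
by_predicate = defaultdict(list)
for s, p, o in triples:
    by_predicate[p].append((s, o))

def records_about(entity):
    """All records whose triples point at a given person or place."""
    return sorted({s for s, p, o in triples if o == entity})

# Two separate records mentioning the same (invented) person are
# linked automatically - a connection a fixed chronological calendar
# layout cannot easily surface.
print(records_about("person/john-smyth"))
```

The images linked by the `image` predicate would serve the verification role mentioned above, letting a reader check any summary statement against the record itself.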

The online calendar stands as an indictment of our timorous approach to existing scholarship in developing the digital humanities. I think it will be clear that, while I enthusiastically subscribe to the view that arts and humanities scholarship should deeply engage with the new technological possibilities and facilities which are all around us, I don’t take the view that the triumph of digital humanities is inevitable. In my most dystopian moments, I fear that the kind of creative engagement humanities scholars have had in recent years with digital technology will in future become more difficult as the digital world becomes increasingly commercialized and locked down. In the UK, it’s worth looking at the awful thing, the Research Excellence Framework (probably the most striking example of academic newspeak I have yet encountered – even worse than examples from Soviet bloc universities in the Stalinist era). The REF defines the status of particular types of academic activity in the UK as strongly as the tenure process in North America. Unlike the tenure process, research assessment in the UK has always gone out of its way to accommodate interdisciplinary research and new forms of electronic communication. In the 2008 Research Assessment Exercise, digital humanities formed part of the panel dealing with Library and Information Management, and DH units did very well. King’s College London, although entering this subject in the exercise for the first time, came joint top of the unit of assessment, and in Glasgow the Humanities Advanced Technology and Information Institute was the leading Scottish institution. In order to reduce the breathtaking and grotesque costs of the REF, it was decided to create larger panels this time, so library and information science has been joined with cultural and media studies to form one large panel.
Although the rubric for this panel mentions DH, there is no recognized DH specialist on the panel, although organisations like ADHO made nominations. The rules of the exercise have been changed to exclude many research staff as well as working librarians, archivists and information specialists. In some cases, joint DH-Cultural Studies submissions have been necessary. Of course, we don’t know yet what the outcome of the REF will be (true in November 2013, but of course we now have the results, and I have offered some preliminary reflections on them here), but I think we can already say that, if REF defines the research landscape in the UK, digital humanities does not figure very prominently on it.

Many of the issues about the future of the digital humanities can be traced back to concerns evident in Willard’s definition from Loch Lomond. The Loch Lomond meeting was very much of its time, in the assumption that a small group of enthusiasts from just three universities could shape approaches as to how digital technology would be integrated into arts and humanities provision of British higher education. The 1990s was characterized by a kind of gold rush, in which individuals and groups felt that they could annex parts of the digital future. A couple of medievalists might hope to shape the digital future of medieval studies by establishing a portal; others sought to control future editorial practice by developing appropriate guidelines. This was analogue thinking par excellence, but this mentality of seeking to become recognized as the ‘Mr Digits’ of certain aspects of scholarly activity is still I think evident. And this is true of the digital humanities. Bodies like the Alliance of Digital Humanities Organisations make digital technologies seem safe, familiar, comfortable and (above all) controllable. Much of our literature (such as Melissa Terras’s remarkable and compelling keynote at DH 2010) assumes that, in the arts and humanities, the digital equates to the formally constituted bodies in ADHO. This is clearly wrong, and dangerous. One need only look to HASTAC, which has been far more successful than ADHO in attracting young and digitally committed faculty across a variety of disciplines and interests, to see the danger in clinging to the structures of forty years ago. But it goes much, much further. As humanities computing pursued research funding, and sought to model itself on scientific research institutes, it forgot about pedagogy. As a result the Association for Learning Technology sprang up, which is just as large and active as ADHO, but there appears to be little contact between them.
Likewise, other areas, such as museums and archives, have pursued their own digital paths, with only patchy contact with DH. As a community DH is singularly ill prepared to deal with the digital becoming mainstream. Having spent many years predicting that everyone will absorb digital techniques, we are very uncertain what to do when that actually happens, and we become very small cogs in a huge machine. The growth of areas of academic study like digital culture, web science and digital studies illustrates the issue – these represent the digital achieving recognition from mainstream academia, and those in the DH community aren’t sure how to accommodate this, no matter how wide we make the tent.

This leads to the argument which was my starting point in thinking about this talk, namely that the digital humanities are inherently time-limited and must inevitably disappear. This assumes that, once the tools developed by DH have passed into common use, DH will have done its job and will cease to have a purpose. Once the humanities become digital, there is no further use for the digital humanities. This argument has recently been clearly expressed by Peter Webster of the British Library in a post on ‘Where Should the Digital Humanities Live?’ Peter wrote: ‘The end game for a Faculty of DH should be that the use of the tools becomes so integrated within Classics, French and Theology that it can be disbanded, having done its job. DH isn’t a discipline; it’s a cluster of new techniques that give rise to new questions; but they are still questions of History, or Philosophy, or Classics; and it is in those spaces that the integration needs eventually to take place’. At one level, this might be an argument that DH should then be primarily critical, but I think it ignores the extent to which our engagement with digital technology is a continuum. John Naughton has noted how the humanities is the only area which refers to ‘the digital’ in this way. At one level, it reflects an assumption that ‘the digital’ is in some way alien; at another, it assumes that ‘the digital’ represents a series of techniques which came to maturity with the appearance of the World Wide Web in the mid 1990s (it is this that has led David Berry and others to suggest that we can talk of the ‘post-Digital’). I think it is an oversimplification, however, to see that apotheosis of the 1990s as a single transformational moment which we are in the process of coming to terms with. The changes of the 1990s were part of a continuum of transformation which in my view reaches back to the Industrial Revolution.
We know how to make digital editions of classical texts, but how can the new technologies of making help us study the classical period? What use is the internet of things to classicists? (A lot, I would say.) What about born-digital data – something which could be fitted into Willard’s Loch Lomond definition, but apparently wasn’t at the forefront of thinking at that time. In short, it is clear that there are many new technologies and much new science coming along which will also offer manifold opportunities and challenges to the humanities. The role of the digital humanities is not to continue to crank up the digital photocopier, but rather to explore these innovations and consider how they enable us to engage with the subject areas of the humanities in fresh ways. In order to achieve this – and ensure their own future – digital humanities practitioners need to take more of an intellectual lead in creating projects.

1 comment:

  • José Igartua says:
    31 July 2015 at 06:36

    "... the digital humanities are inherently time-limited and must inevitably disappear." You could cite as an example the Association for History and Computing. Its journal is no longer publishing, and its Wikipedia entry is but a stub. Has the AHC transformed historical scholarship? I've not seen much evidence of it.
