1. The Industrial Revolution as a Point of Comparison

The inaugural Digital Humanities Congress organised by the Humanities Research Institute of the University of Sheffield took place shortly after the 2012 Olympic Games in London. The opening ceremony of the Olympic Games, entitled ‘Isles of Wonder’ and masterminded by the film director Danny Boyle, was a breathtaking mashup of British history.1 The ceremony began with a depiction of rural life in a ‘green and pleasant land’, with village cricket and dancing round a maypole. This bucolic scene was interrupted by the appearance of an unmistakable figure in a tall hat chomping a cigar, the celebrated engineer Isambard Kingdom Brunel, portrayed by Kenneth Branagh. At the command of Brunel and his fellow industrialists, crowds of workers transformed the green and pleasant land into a noisy smoky landscape. Seven chimneystacks rose from the ground together with five beam engines, six looms, a crucible and a water wheel. Danny Boyle described the transformation as ‘the biggest scene change in theatre history’, and had been advised against attempting it.2 This section of the ceremony was named ‘Pandemonium’ after the anthology of descriptions of the Industrial Revolution compiled by the film director Humphrey Jennings. At the climax of this section of the ceremony, workers began casting an iron ring which converged, amidst steam and pyrotechnics, with other rings, to give the impression that the Olympic Rings were being cast in steel.

If the Industrial Revolution can be seen as the ‘biggest scene change in history’, the growth and development of the city of Sheffield was one of its most dramatic manifestations. Sheffield has an industrial tradition as a centre of cutlery manufacture which goes back to the middle ages.3 It was partly the specialised skills available in Sheffield which prompted Benjamin Huntsman to select the town for his experiments in the production of crucible steel which laid the foundations of its later reputation as a steel city. Sheffield’s light trades remained important even after Henry Bessemer’s innovations allowed the production of steel in bulk from the middle of the nineteenth century. As, thanks to Bessemer, huge steel plants were established in the city, making rails, steel plates and armaments, the physical fabric of the city was transformed in a manner that, for many people, indeed represented true pandemonium. As early as 1768, Horace Walpole described Sheffield as ‘One of the foulest towns in England in the most charming situation’, adding that what made it particularly disagreeable ‘is the excessive smoke from the great multitude of forges which the town is crowded with.’4 By 1842 the social reformer Edwin Chadwick declared that ‘Sheffield is one of the dirtiest and smokiest towns I ever saw. One cannot be long in the town without experiencing the necessary inhalation of soot … There are however numbers of persons in Sheffield who think the smoke healthy’.5 In 1909, the political and religious campaigner, Annie Besant, used Sheffield in a lecture as an example of the evils of industrialised society, in terms that again recall the imagery of Danny Boyle’s Olympic ceremony:6

"Go to Sheffield, which was built in what was one of the loveliest valleys of the Midlands; notice, as you come near it, the beauty of the countryside, the wooding of the undulating land, the exquisite beauty of rivulet, of forest, and of grass: and then, out of all that beauty of Nature, you plunge suddenly into the hideousness of Sheffield. You find the atmosphere thick with black smoke. No tree will grow in many of the districts, no flowers even on the sills of the houses of the poor."

It is a commonplace that we are today going through another period of profound economic, social and cultural transformation, our contemporary pandemonium being associated with the use of computers. This digital revolution has, it seems, been going on for a long time. Computer development was a major part of the technological and scientific revolution whose ‘white heat’ Harold Wilson in a celebrated speech in 1963 claimed would reshape the British economy.7 Nevertheless, the importance of digital transformations is still being stressed, fifty years after Wilson’s speech. A recent study by MIT Sloan Management Review and Capgemini Consulting called Embracing Digital Technology declared that:8

"The world is going through a kind of digital transformation as everything — customers and equipment alike — becomes connected. The connected world creates a digital imperative for companies. They must succeed in creating transformation through technology, or they’ll face destruction at the hands of their competitors that do."

Digital technologies are seen as the key to government success, whether it is the British Cabinet Office Minister Francis Maude describing data as "the new raw material of the 21st century"9 or the declaration by the government of Singapore in 2006 that "In less than ten years, every single person and business in Singapore will find the world – and everyday life – transformed by technology".10 Humanities scholarship is not immune to such claims. The Arts and Humanities Research Council in Britain has identified ‘digital transformations’ as a major strategic theme; the Digital Humanities Manifesto 2.0 declares that the second wave of digital humanities will "shape natively digital models of scholarly discourse for the newly emergent public spheres of the present era".11

Within this world of transformation, the most common point of comparison is with the introduction of the printing press with movable type in Western Europe which, following Elizabeth Eisenstein, is assumed to have led to great religious and cultural upheaval. A representative comment is that of Michael Brodie, the Chief Scientist of Network Technologies for Verizon, the American telecommunications company, who declared that "the Gutenberg Bible led to religious reformation while the Web appears to be leading towards social and economic reformation".12 Presumably because of the influence of theorists like Walter Ong and Marshall McLuhan, the digital revolution is seen as a media affair, so that the Digital Humanities Manifesto 2.0 sees digital technologies through the prism of our experience with print:

Like all media revolutions, the first wave of the digital revolution looked backward as it moved forward. Just as early codices mirrored oratorical practices, print initially mirrored the practices of high medieval manuscript culture, and film mirrored the techniques of theatre, the digital first wave replicated the world of scholarly communications that print gradually codified over the course of five centuries: a world where textuality was primary and visuality and sound were secondary (and subordinated to text), even as it vastly accelerated the search and retrieval of documents, enhanced access, and altered mental habits. Now it must shape a future in which the medium-specific features of digital technologies become its core and in which print is absorbed into new hybrid modes of communication.

Computing, however, does not simply affect communication: it also transforms medicine, manufacturing, health, perception and hearing – indeed, virtually every aspect of human endeavour and experience. The wide-ranging nature of the transformations associated with computing is more reminiscent of the effects of steam than of printing – a point shrewdly made by William Gibson and Bruce Sterling in their steampunk novel, The Difference Engine. In this context, it is surprising that the Industrial Revolution, as a moment of major disruption in human society closely associated with technological innovation, is not more frequently used as a point of comparison to analyse and understand the nature and structure of the current changes associated with digital technology. What perspectives does the Industrial Revolution offer in thinking about the nature of technological transformation and, in particular, what insights does it offer to help us understand how the digital humanities should develop? This is an appropriate question to consider in Sheffield, a city, like the Olympic Rings in 2012, cast in steel by the Industrial Revolution.

2. Industrial Continuities and Disruptions

In contemplating a new digital world, it may seem as if the steam and smoke of the mechanical gear-driven world of the Industrial Revolution has little to teach us. The ‘little mesters’ of Sheffield in their tumbledown workshops seem a million miles from the inhabitants of bright new digitally-enabled workspaces. The digital is indeed frequently represented as a means of escape from the industrial. Yet the continuities between digital technology and early industrialisation are profound. Some of the fundamental concepts behind the computer program as a sequence of logical instructions were developed in the nineteenth century from punch card mechanisms used to control mechanical looms. The telegraph was one of the iconic technologies of the period which, by offering improved communications over long distances, facilitated the growth of railways. The very concept of the digital derives from the work of engineers trying in the 1940s to improve the performance of telegraph cables. One of the great icons of the Industrial Revolution, Brunel’s steamship, The Great Eastern, was used to lay the first successful transatlantic telegraph cable, thereby effectively laying the foundations of the internet. Jon Agar in his masterly study, The Government Machine: A Revolutionary History of the Computer, has demonstrated how many of the conceptual foundations of the computer reflected responses to the increased pressures on government as a result of the growth of population and greater complexity of society after industrialisation. One aspect of this was the greater government interest in statistics which encouraged government support for Charles Babbage’s difference engines, one of the most sophisticated products of the new manufacturing technologies.
The use of punched cards for statistical analysis was pioneered by Herman Hollerith (whose company went on to become IBM) for the American census in 1890, while in Britain punched cards were used by the General Register Office for the 1911 census.13 Agar’s account of the sophistication achieved by analogue computing in the 1940s and 1950s is a powerful reminder that the lines of continuity between the industrial and digital revolutions are in many ways stronger than the disruptions.

Disruption is one of the watchwords of the digital revolution. The strapline for one of the first Web 2.0 conferences in 2008 was ‘Design, Develop, Disrupt’, and in his letter to potential investors in Facebook, Mark Zuckerberg declared that one of the mantras at Facebook is ‘Move Fast and Break Things’, the idea being that if you never break anything you are probably not moving fast enough.14 Gutenberg and the development of print are frequently invoked as precedents for these disruptive approaches. To quote Zuckerberg again:

"We often talk about inventions like the printing press and the television — by simply making communication more efficient, they led to a complete transformation of many important parts of society. They gave more people a voice. They encouraged progress. They changed the way society was organised. They brought us closer together."

The blurb for Jeff Jarvis’s 2012 book, Gutenberg the Geek, reads:15

"Johannes Gutenberg was our first geek, the original technology entrepreneur, who had to grapple with all the challenges a Silicon Valley startup faces today. Jeff Jarvis tells Gutenberg's story from an entrepreneurial perspective, examining how he overcame technology hurdles, how he operated with the secrecy of a Steve Jobs but then shifted to openness, how he raised capital and mitigated risk, and how, in the end, his cash flow and equity structure did him in. This is also the inspiring story of a great disruptor. That is what makes Gutenberg the patron saint of entrepreneurs."

Many of the assumptions about the disruptiveness of Gutenberg derive from the work of Elizabeth Eisenstein, who has argued that the role of printing has not been given sufficient weight in accounts of the Renaissance, Reformation or Scientific Revolution.16 Printing was, according to Eisenstein, the 'unacknowledged revolution'. Eisenstein argues that printing helped standardise texts so that knowledge became more settled and easily transmitted. She also suggested that, as large numbers of texts became available, their contradictions became more evident, causing readers to become more sceptical and critical of authority. However, a growing number of historical bibliographers are expressing doubts about Eisenstein’s thesis. States such as Russia and the Ottoman Empire were able effectively to control and restrict the use of printing.17 Moreover, the printing press did not kill off the manuscript.

David McKitterick describes how the manuscript of a treatise by Walter Hilton was copied at Sheen in 1499, despite the fact that the owner of the manuscript already had a copy of the same work printed by Wynkyn de Worde.18 Although the production of printed gazettes flourished in seventeenth-century England, manuscript newsletters were equally important in the dissemination of news, and indeed often regarded as more reliable.19 Above all, printing did not standardise texts. Printing was a craft activity and, just like manuscript copying, there were many points in the production of printed books in which accidents, errors and mistakes were introduced. The picture which emerges from the work of historical bibliographers such as David McKitterick, Adrian Johns and Sabrina Baron is that Gutenberg’s introduction of the press marked one stage in the protracted evolution of printing. Gutenberg was no disruptor; he was part of a process of experiment and extemporisation which lasted for hundreds of years. As Raymond Williams pointed out many years ago, the rise in literacy and access to information was a long revolution in which the appearance of steam-driven presses in the nineteenth century was just as significant as the work of Gutenberg.

The idea that changes in contemporary society will be technologically driven, unexpected and disruptive is at the heart of much current discussion of digital transformation. The invariable leading example of such transformation is the effect on the music industry of online access through services such as iTunes and Spotify. The lessons of this are summed up in a report by the IBM Institute for Business Value:20

"The music industry was one of the first to feel the brunt of the digital revolution. With the standardized mp3 format for digitized music and the availability of broadband connections for Internet distribution, the reality of industry disruption became apparent to all. Traditional music companies are expected to lose more than 35 percent of value between 2003 and 2012, with total revenues for the period expected to drop from US$12 billion to $8 billion. But at the same time, other parts of the music ecosystem – more closely attuned to the customer – experienced significant growth. This includes consumer electronics companies that make digital music players, concert promoters and producers of other live events. The lesson? Industry incumbents that avoid the hard decisions about digital transformation are likely to suffer a fate similar to that of traditional music companies. For companies that stay closer to their customers, digital transformation can create significant new opportunities."

The collapse of high street shops such as HMV, Jessops and Comet is seen as an example of failure to deal with the disruptive effects of digital technologies. Although the spectre of the fate of such firms haunts management literature, the kind of digital transformation which management gurus describe as an appropriate response to these threats turns out to be incremental and small-scale. An example frequently cited of digital transformation is that of Starbucks.21 A new IT strategy for Starbucks included the provision of free wireless access for customers and a new system to speed up card payments. These measures had a marked effect on business, but it is difficult to see them as more than incremental change. This raises questions about the appropriateness of our current rhetoric of disruption. The idea of disruptive technologies owes much of its popularity to the management guru Clayton Christensen, who introduced the concept in a celebrated article in 1995 and developed it in his best-selling book The Innovator’s Dilemma.22 But popular ideas of disruption seem to be based on a superficial reading of Christensen’s work. Christensen sees disruption as being produced by cheaper, better-value products, and suggests that it is precisely the makers of expensive high-end products who are vulnerable. In this reading, it is Toyota which is disruptive to the Detroit car industry, cheap Chinese or Hong Kong goods which disrupt European manufacturers, Samsung which disrupts Apple. It is the cheap imitation which is disruptive: simple products sold to less demanding customers. Companies which focus on high-end technologically innovative products are those which are at risk of disruption.

In Danny Boyle’s depiction, Isambard Kingdom Brunel was the high priest of disruption – the man who changed the face of Britain at a stroke. Yet Brunel himself believed profoundly that innovation was an incremental process. He wrote:23

"I believe that the most useful and novel inventions and improvements of the present day are mere progressive steps in a highly wrought and highly advanced system, suggested by, and dependent on, other previous steps, their whole value and means of their application probably dependent on the success of some or many other inventions, some old, some new… [I]n most cases they result from a demand which circumstances happen to create. The consequence is that most good things are being thought of by many persons at the same time."

The history of steam power itself illustrates Brunel’s proposition that innovation consists of small ‘mere progressive steps’. Watt famously hit upon the idea of a separate condenser when repairing a model of the steam engine devised by Thomas Newcomen earlier in the eighteenth century. Watt’s ability to produce steam engines was dependent on other inventions such as John Wilkinson’s boring machine. The high cost of Watt’s patents encouraged further technological innovation by other inventors, leading, for example, to experiments by Cornish engineers such as Trevithick with high-pressure steam engines which pointed the way towards steam locomotion in a way that Watt’s work did not.24 The steam age appeared in small increments and not as one mighty transformation.

The adoption of new industrial technologies was by no means as disruptive as might at first be imagined. A very famous example of industrial disruption is the fate of the handloom weavers and stockingers, independent artisans who supposedly were suddenly thrown out of work en masse by the arrival of power looms. The fate of the handloom weavers might be taken as an early disruption and the futile protests of the Luddites as an illustration of the impossibility of bucking technological trends. Yet recent studies have revealed a much more nuanced process of change among weavers. A detailed analysis of the number of handloom weavers by Geoffrey Timmins shows that there were still large numbers of hand weavers until the late nineteenth century who continued to make a significant, if diminishing, contribution to textile output.25 Emma Griffin has pointed out that, for hand weavers based in towns, industrialisation was not necessarily a bad thing, since working in factories provided a more steady source of income.26 Griffin suggests that it was weavers in rural areas who encountered greater difficulties. The changes were complex, gradual, uneven in their distribution and impact, and the effects by no means sudden or disruptive.

The Industrial Revolution was a more complex and amorphous process than we might imagine, and this suggests that the present changes associated with digital technology may be more complex than is apparent from the glib language of Silicon Valley. The story of industrialisation in Sheffield exemplifies how innovation is frequently a process of starts, stops and small steps. Henry Bessemer’s invention of a new method of bulk steel production, hailed at the time as the greatest invention since the steam engine, might be seen as a disruptive technology, but the story was more complex.27 Bessemer described his use of a converter to turn 700lb of iron into steel in a famous paper, ‘On the Manufacture of Iron and Steel without Fuel’, at the British Association for the Advancement of Science in 1856. But when the process was first used commercially, the steel was found to be brittle and over-oxidised. The problems were resolved with advice from the metallurgist Robert Mushet and the Swede, Göran Göransson. Without the assistance of these men, Bessemer’s discovery would have been stillborn. Exasperated at the failure of the Sheffield iron masters to recognise his discoveries, Bessemer established his own works in Sheffield, which resulted in other Sheffield works adopting the Bessemer process, beginning with John Brown in 1862. Bessemer mocked the initial failure of the Sheffield iron masters to recognise the potential of his method, but given his early problems, this is perhaps not surprising. Moreover, it was only after 1880, when Sidney Gilchrist Thomas further improved the process to allow the use of pig iron containing phosphorus, that steel production began to reach millions of tons a year at a cost of less than £5 a ton.

The birth of Sheffield as the first ‘steel city’ was a lengthy process, and the triumph of heavy industry there by no means quick or sudden.28 The roots of Sheffield’s industry lie in its development as a centre for the manufacture of cutlery in the middle ages. Huntsman first produced crucible steel there in the 1740s and steam power arrived in the city in 1786, but nevertheless the initial industrial growth was in historic light trades such as the making of tools, cutlery and silver plate. Techniques in Sheffield’s light trades changed very slowly. Before 1850, the only major change was the use of steam instead of water to drive the wheels used by the grinders. The light trades remained dominated by ‘small mesters’ who hired rooms in works with steam-powered wheels. It was only in the 1850s that factory production and mechanisation began to be introduced in the light trades. The growth of heavy industry was a product of the third quarter of the nineteenth century, and was related to the adoption and development of the Bessemer process. Between 1851 and 1891, employment increased over 300% in the heavy trades, compared with 50% in the light trades. The process of industrialisation in Sheffield was a long and incremental one, lasting over four hundred years.

The digital humanities has enthusiastically adopted the disruptive and transformative terminology of Silicon Valley. In a now notorious sound bite, Mark Sample declared: "It's all about innovation and disruption. The digital humanities is really an insurgent humanities".29 Yet, as humanities scholars, perhaps we can point to the historical example of industrialisation to question this adoption of the Silicon Valley language of disruption and transformation. Following Brunel, perhaps we should instead conceptualise digital humanities as a process of incremental development. Maybe we should stop seeking to create the all-embracing system that will disrupt and transform, and focus instead on small improvements. We might also recognise that the important thing is perhaps not to break things but to build things which last. The important thing about the industrial revolution was not its disruptive character but its ability to sustain change. In Joel Mokyr’s words:30

"The Industrial Revolution was ‘revolutionary’ because the technological progress it witnessed and the subsequent transformation of the economy were not ephemeral events and moved society to a permanent different economic trajectory."

In this sense, the digital revolution may not be about innovation and disruption at all, but rather about moving to a situation where we easily accept a process of constant incremental change and development.

3. Measuring a Revolution

The famous economic historian at the University of Sheffield, Sidney Pollard, pointed out how, for many workers, their experience of the Industrial Revolution was far removed from the scenes depicted in the Olympic Opening Ceremony. Pollard described how a "visitor to the metalworking areas of Birmingham or Sheffield in the mid-nineteenth century would have found little to distinguish them superficially from the same industries a hundred years earlier. The men worked as independent contractors in their own or rented workshops using their own or hired equipment … These industries…were still waiting for their Industrial Revolution".31 Yet, as Pollard emphasised, the environment in which these workmen operated had been completely transformed. Their wheels were now powered by steam and there were other gadgets which speeded up minor operations such as stamping and cutting. The workshop might be lit by gas and have a water supply. Railways made distribution easier and cheaper while also giving access to a large labour market. While the ‘small mester’ may have been working in an old-fashioned way, his environment had been completely transformed.

The picture painted by Pollard is fascinating. It may be that the humanities scholar is in the same position with regard to digital technologies as a Sheffield craftsman at the time of the Industrial Revolution. Perhaps the most that digital humanities will ever achieve is a change in the environment of humanities scholarship – quicker methods of undertaking research, improved means of distributing and sharing research, and so on – while the fundamental attributes of humanities scholarship remain unchanged. But Pollard’s picture of the Sheffield worker waiting for the Industrial Revolution also emphasises the difficulty of defining the Industrial Revolution. The enormous complexities of the historiography of the Industrial Revolution and its failure to reach a consensus on many key issues illustrate the difficulties of defining and measuring such enormously complex social and cultural upheavals. The lack of consensus about the nature of the changes which took place in the eighteenth and nineteenth centuries suggests that we should not despair about the difficulty of measuring the impact of digital projects in the humanities. The scholarly impasse on many aspects of the Industrial Revolution also suggests that digital humanities scholars should be cautious about the potential role of quantification in humanities scholarship.

Tony Wrigley has described the Industrial Revolution as one of the two greatest transformations of human society since the days of the hunter-gatherer. However, he points out, many of the most widely used indicators of economic and social change record more rapid change in the 150 years since 1850 than in the preceding period. Moreover, he suggests that despite its great significance the Industrial Revolution was for the most part "curiously and instructively imperceptible to contemporaries".32 While liberals celebrated the potential of steam to promote peace and prosperity, critics such as the Stockport doctor Peter Gaskell complained that a complete revolution had been effected, with the very face of the country re-modelled and whole classes of inhabitants swept away.33 Nevertheless, exactly what had happened was mysterious.34 Technology was seen as a key element, as in the work of Friedrich Engels, who explained how a succession of inventions had changed the manufacturing and transport sectors in ways that had precipitated major social changes which Engels anticipated would cause a political revolution. The idea that these changes represented a single definable event only became widespread, however, when the social reformer Arnold Toynbee popularised the term ‘Industrial Revolution’ in his lectures of the early 1880s. The appearance of this term (which did not finally enter the Oxford English Dictionary until 1926) immediately raised many questions. If this was an ‘industrial’ revolution, how were the contemporary changes in agricultural productivity related to it? Was this really just an industrial revolution? Were the changes of the late eighteenth and early nineteenth centuries really so revolutionary? Could we not find parallels in earlier periods? During the fifty years after Toynbee wrote, scholars reconstructed the Industrial Revolution as a more gradual and piecemeal process. But then in 1960 the economic theorist W.W.
Rostow argued that between 1783 and 1802 the British economy went through a dramatic process of ‘take-off’. In Rostow’s view, any country wanting to develop a modern economy also had to undergo a similar process of ‘take-off’.

Rostow’s model encouraged further investigation of the economic records of Britain during the late eighteenth and early nineteenth century. The economic historians Phyllis Deane and W. A. Cole assembled detailed data on the rate of growth in the economy between the late seventeenth and early twentieth centuries. While suggesting that Rostow’s idea of a ‘take off’ was an exaggeration, Deane and Cole nevertheless found evidence of a period of sustained and rapid growth of 3.4% per year from 1780 to 1801, rising to almost 4% until 1831. The data compiled by Deane and Cole was at first widely accepted by historians, but Crafts and Harley subsequently revisited the figures and suggested that Deane and Cole had significantly overestimated the amount of growth. Deane and Cole suggested industrial production had risen sixfold between 1780 and 1830; Crafts suggested a more modest increase of less than fourfold. This suggested that the economy as a whole was growing more slowly than might at first be imagined. Deane and Cole had British GDP increasing at 2.5% per year between 1780 and 1831; Crafts revised this downwards to just over 1.7% – hardly the sort of figure which would make a modern Chancellor of the Exchequer very excited. Moreover, Crafts also examined productivity in individual sectors and suggested that the main increases were confined to just two, textiles and iron. The findings of Crafts and Harley, which seemed at first sight almost to dissolve the Industrial Revolution as an event, provoked controversy among historians, but the end result has been summarised by Emma Griffin as follows: "It is noticeable that over the last two decades there has been begrudging acceptance of Crafts’ three central arguments: that the mid eighteenth-century economy was considerably larger than previously thought; that subsequent economic growth was probably slower; and that growth was confined to a limited sector of the economy".35

The attempts by historians to quantify and define the nature of the changes to which we have given this convenient label ‘The Industrial Revolution’ confirm once again how amorphous, localised and patchy the changes in the British economy from the late eighteenth century onwards were. While historians agree that there was a major economic, social and cultural transition at that time, its exact nature, and its relationship to the emergence of new technologies, remains very difficult to pin down – Griffin suggests that the vigorous discussion among historians means that "by the end of the 1990s we had more Industrial Revolutions than ever before",36 while David Cannadine suggests that each generation develops its own conception of the Industrial Revolution, moulded by its own economic circumstances.37 In the light of the complexity of the discussion about industrialisation, it seems likely that in retrospect historians will find the digital revolution equally difficult to pin down. However, paradoxically, this is in many ways a reassuring message for those working in the digital humanities. A major issue in the digital humanities over the past ten years has been the evaluation of impact. As research councils and government agencies have poured millions of pounds into digitisation programmes, there has been a demand to show that the investment was a worthwhile one and that these resources are indeed transforming scholarly work. However, too often it has been found that these resources are not actually used very much – in particular there has been great concern about the low levels of reuse of data. A major focus of recent research has been the development of methodologies supporting more balanced evaluations of digital scholarship.
A good example is the work of my colleague at King’s, Simon Tanner, and his development of the Balanced Value Impact Model, which is being widely adopted by many cultural institutions.38 Tanner argues against a purely metric form of evaluation, and the issues reflected by the historiography of the Industrial Revolution reinforce and support his argument. Historians argue that impact cannot be evaluated purely by overarching metrics: "the national accounts approach to economic growth and productivity change is not a good starting point for the analysis of fundamental economic discontinuity".39 Similarly, it is unlikely that metrics will help much in analysing or understanding the cultural discontinuities associated with the spread of digital technologies. In her recent enthralling book, Liberty’s Dawn, Emma Griffin has shown the potential of working-class autobiographies from the period of the Industrial Revolution to produce striking and surprising perspectives on the social and economic changes experienced by ordinary people. Maybe it is only by similar testimonies that we will be able to grasp the significance of the digital revolution.

The prominence in digital humanities of works using statistical and visualisation techniques such as Moretti’s Distant Reading has led to a frequent assumption that the Digital Humanities is intimately linked with quantification, and many of the anxieties about the digital humanities appear to be partly linked with uncertainty about the appropriateness of statistical methods in disciplines such as English which were to some extent formed as a reaction against the growth of a mechanistic and technological science. Digital Humanities is of course much more than quantification. From the perspective of a manuscript scholar or archaeologist, digital imaging or virtual reconstructions loom larger than quantification. But the assumption of a link between digital humanities and numbers won’t go away, and is reinforced by comments such as that in a New York Times article in 2010:40

"Members of a new generation of digitally savvy humanists argue it is time to stop looking for inspiration in the next political or philosophical “ism” and start exploring how technology is changing our understanding of the liberal arts. This latest frontier is about method, they say, using powerful technologies and vast stores of digitized materials that previous humanities scholars did not have."

Moretti has made a similar point more subtly in a recent article in New Left Review which argues that digital tools provide new ways of connecting with empiricism, "by turning concepts into magic spells that can call into being a whole world of empirical data".41 Moretti presents data as an inescapable corrective: "if the data revolt against their creator, then the concept is really in trouble". Yet, as the historical debates about the levels of economic growth in the Industrial Revolution amply illustrate, quantitative research can just as easily reach an impasse as theoretical debates. These debates suggest that, while quantification is an important method, it is far from being the kind of shibboleth sometimes suggested in the Digital Humanities, and those experimenting with statistical methods in literary studies could learn a great deal by contemplating the experiences of colleagues such as economic historians.

While management literature on digital transformations emphasises the urgency of engaging with the potential of new technology, it also suggests a frequent uncertainty among business organisations about what is required. One IT executive describes his anxieties: "this company has spent the past 25 years building its IT and processes, it’s now got about 3 years to reinvent them or we just won’t survive".42 Yet the nature of these changes is very uncertain: another executive states: "If we’re going to be successful in the future then we need to have an infrastructure that allows us to plug things in. And we won’t know now what all those future ‘plug-ins’ will look like".43 In these very speculative environments, it is frequently difficult to measure the business benefits of particular approaches. The MIT-Capgemini report states that "business model transformation is also elusive. A mere 7% of respondents said that their company’s digital initiatives were helping to launch new businesses, and only 15% said new business models were emerging thanks to digital technology".44 There are complaints about uncertainty as to whether new technologies are right for the marketplace, about excessive hype and innovation fatigue.

Again, these comments seem reminiscent of discussions about the Industrial Revolution, where it has frequently been difficult for historians to quantify and measure the extent and character of changes. It may be that transformations of this kind are simply not susceptible to easy measurement and definition. The very interest in quantification, measurement and analytics which digital technologies foster may, ironically, ultimately prove incapable of capturing the essence of the impact of digital transformations.

4. Heroes of Invention

For the city fathers of Victorian Sheffield, their town was the creation of great inventors and visionary entrepreneurs. The imposing town hall of the city incorporates statues of figures representing electricity and steam holding scrolls with the names of such technological pioneers as Watt, Stephenson, Faraday and Davy. The message conveyed by the Victorian iconography of Sheffield is the same as that of the Olympic Ceremony – that these extraordinary changes were the work of engineers such as Brunel, the man in the tall hat who, with a wave of his cigar, brought about the biggest scene change in history. In this sense, Danny Boyle’s narrative of the Industrial Revolution (criticised by some commentators for left-wing bias) echoed Victorian liberal ideology. Christine MacLeod has recently described how, in the period after the end of the Napoleonic Wars, liberals, reacting against conservative claims that the victory was due to the military genius of men like Wellington and Nelson, argued that it was Britain’s economic might that had enabled her to triumph, and that it was the achievements of men such as James Watt that had made the greatest contribution to the defeat of Napoleon. In the words of an obituary of Watt:45

"It is our improved steam-engine that has fought the battles of Europe, and exalted and sustained, though the late tremendous contest, the political greatness of our land. It is the same great power which enables us to pay the interest of our debt, and to maintain the arduous struggle in which we are still engaged, with the skill and capital of countries less oppressed with taxation … It is to the genius of one man, too, that all this is mainly owing; and certainly no man ever before bestowed such a gift on his kind."

Watt was also portrayed as a paragon of personal virtue, with an amazing range of knowledge and accomplishments but completely lacking in arrogance or bluster, a model of enlightened sociability. The God-like status increasingly accorded to Watt is vividly expressed in the monumental statue of him installed in Westminster Abbey in 1834. This invention of Watt as a liberal hero provided a template for the creation of a technological pantheon celebrating the idea of self-made inventors and engineers, such as Arkwright, Stephenson and Brunel father and son, as the heroes of the Pax Britannica. The Victorian celebration of the inventor and industrialist was also apparent in the eulogisation of men like Henry Bessemer and William Armstrong. The idea of the heroic inventor only began to decline at the beginning of the twentieth century, as increased industrial competition from countries such as Germany emphasised the need for a more professional and scholarly approach to scientific research.

Likewise, the message taken up in the digital revolution has been the importance of leadership and vision. The hagiography surrounding Steve Jobs is the leading illustration of this phenomenon. Jobs has become the digital Watt, and is especially revered by those in the arts and humanities. At the launch of the iPad 2, Jobs declared that "It’s in Apple’s DNA that technology alone is not enough—it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing, and nowhere is that more true than in these post-PC devices".46 Yet the iPad can be seen as mounting an assault on the open ideals associated with the humanities. The iPad is an instrument for the more effective commercialisation of the web. The app fragments the web to produce a plethora of services which can be more easily marketed and commercialised. It is a way of taking the unifying vision of the web and splintering it so that business models can emerge more effectively. Not surprisingly, Rupert Murdoch heralded the appearance of the iPad with enthusiasm, describing it as a ‘game-changer’ and the ideal device to ensure that users pay for digital journalism.47 Apple-watchers complain that the magic and creativity have left the firm since Jobs’s death. Some commentators saw the launch of the iPad Air in 2013 as lacking the Jobs sparkle, giving an impression "of staleness, and ossification. Words and illustrations on a canvas, literally replayed, without life, without originality".48

Perhaps the Steve Jobs myth is as exaggerated as that of James Watt. Christine MacLeod traces a process whereby the early heroic ideal of Watt is reconfigured in response to radical criticism so that Watt embodies ideals of democratic invention. Inventions were not the result of the sudden inspirations of genius but rather of methodical and workmanlike investigation. Samuel Smiles reflected this point of view when he wrote that "Arkwright probably stood in relation to the spinning machine that Watt did to the steam–engine and Stephenson to the locomotive. He gathered together the scattered threads of ingenuity which already existed, and wove them, after his new design, into a new and original fabric".49 A similar process can already be seen occurring with the reputation of Steve Jobs, as initial eulogies of the incomparable genius rapidly began to be replaced by a view of Jobs as a supreme tweaker. An appraisal of Jobs by Malcolm Gladwell explicitly compares him with the ‘tweakers’ of the Industrial Revolution,50 noting that Jobs borrowed the mouse and graphical interface from Rank Xerox, introduced the iPod five years after the earliest music players, created the iPhone because he felt existing mobile phones ‘sucked’, and so on. Watt and Jobs were tweakers of genius.

Gladwell cites influential research by Ralf Meisenzahl and Joel Mokyr which argues that the early growth of industrialisation in Britain reflected the availability there of a large cohort of skilled artisans and engineers.51 Mokyr has subsequently developed this thesis on a larger scale, arguing that the Industrial Revolution was forged by the exchange of information among this class of craftsmen, scientists and engineers, and suggesting that the associational culture fostered by the Enlightenment was important in generating the social capital and networks which helped create the Industrial Revolution.52 In such a reading, the Lunar Society was as important in explaining the Industrial Revolution as the coal seams of Wales and Yorkshire. Mokyr’s explanation is of course richer and more multi-layered than this crude summary suggests – the relationship between cost of labour and cost of coal appears for example to have been another factor which may have encouraged the development of new technical solutions. There are issues around Mokyr’s interpretation which require much further analysis, such as the reasons why ‘tweaking’ was more productive in some industries than others. As noted, Sheffield had a strong craft tradition in light trades such as cutlery but, unlike iron and textiles, this did not seem to foster an early culture of technological innovation. Similarly, there were strong pockets of great mechanical expertise in other parts of Europe, but these did not result in an Industrial Revolution.

Mokyr’s emphasis on ‘tweaking’ and the importance of craft exchange of ideas contrasts with those discussions of digital transformation which (perhaps following the Steve Jobs myth) emphasise the importance of strong top-down leadership and disparage the idea of bottom-up change. The MIT Sloan Management Review emphasises that "Digital transformation starts with a vision from top leadership".53 It quotes a senior manager as stating that "This idea that a thousand flowers will bloom and we will all be okay is a great way to get some ideas, but we have not seen any transformations that happen bottom up. They’re all being driven top down". The conclusion is that "Digital transformation needs to come from the top, and companies should designate a specific executive or executive committee to spearhead efforts". Indeed, the message is that any action (exactly what is often left vague) is better than none: "The only wrong move for executives, then, would be not making any move". The lessons of the Industrial Revolution seem to run counter to this view. The ‘tweaking’ culture described by Mokyr was one of letting a thousand flowers bloom and experimenting in all sorts of areas. Versatility and diversity were the key to success. Among Bessemer’s experiments before he went on to the Bessemer Converter were work in electroplating; the embossing of metal cards and fabric; die stamping; making imitation velvet; sugar-refining; the production of plate glass; and pencil manufacture. Apart from his work on the steam engine, Watt created the first mechanical device for duplicating writing, and experimented in creating a sculpture copying machine. While the creation of the factory system by men like Arkwright can be seen as reflecting a bold entrepreneurial vision, the engine of the transformation was very much the product of a collaborative, playful and inquisitive culture.

It is perhaps here that digital humanities has a distinctive contribution to make, in promoting models of collaborative innovation which are closer to those which characterised the Industrial Revolution than the executive-driven, top-down transformation proposed in management literature. Digital humanities is characterised by its adaptive and eclectic technical approach. Rather than develop major new innovative technologies, it picks and chooses from a variety of technical solutions and adapts them for use with humanities source materials which are often complex in nature. While the collaborative rhetoric of digital humanities is often a tactic to conceal strong control by senior academic researchers of projects populated by junior research staff with little opportunity to develop their own research interests, the collaborative ideals of digital humanities could nevertheless form the basis of a strong culture committed to sharing, changing and improving technical solutions, in just the way that many of the craftsmen, inventors and engineers of the Industrial Revolution did. This may prove to be a more effective path to digital transformation than management training seeking to turn every CEO into Steve Jobs.

We may feel that in learned societies like ALLC or ADHO we have the equivalent of a Lunar Society in digital humanities. But the model of something like ALLC is that of a nineteenth-century learned society, and the Lunar Society was more flexible and informal than that. Bodies like the ALLC or ADHO are designed to affirm the respectability and seriousness of their members, to show that they are worthy professional people. But the informal, drunken societies of the eighteenth century show the value of using much looser arrangements to generate social capital. We need to think about how we can recreate that kind of eighteenth century social excitement in the digital sphere. What is particularly important about these eighteenth century clubs is that they operated a spectacularly big tent. There was no set view in the eighteenth century as to whether the engineer or the money man should take the lead. It has been suggested that the key ability was ‘to identify a need or opportunity, then cooperate with others who possessed a different skill to take advantage of it’.54 This description of the skills necessary for success in the eighteenth century is, I would suggest, equally applicable to the digital world. However, in the eighteenth century this also involved an appetite for risk. Watt was constantly terrified by what he saw as Boulton’s imprudence. Two of the greatest engineers and entrepreneurs of the Industrial Revolution, Richard Trevithick and Richard Roberts, died penniless.

5. Spaces of Creation

The factory may seem to be the characteristic spatial expression of the Industrial Revolution, but this was by no means the case. In Sheffield, the spaces of industrial activity varied from the tiny workshops associated with the light trades through to the massive vulcanic spaces of the great forges. What is striking and instructive is the way in which new ideas emerged from liminal spaces. The ideas which drove the Industrial Revolution were to some extent influenced by scientific theories emerging from universities such as Glasgow, but mostly the new inventions were the work of the ‘technically brilliant but basically empirical tinkerers or technical designers’,55 who were to be found in workshops and forges. James Watt’s position in Glasgow again perfectly illustrates this. James Watt is one of the outstanding names associated with the University of Glasgow, but he was never a member of the University’s academic staff. He was employed to repair scientific instruments. It was in the process of repairing a model of a steam engine owned by the University that Watt hit on the idea of a separate condenser. Although Watt was not a lecturer but a mere craftsman, his workshop became the intellectual hub of the University. His friend John Robison, who afterwards became Professor of Chemistry at Glasgow, recalled how: "All the young lads of our little place that were any way remarkable for scientific predilection were acquaintances of Mr Watt; and his parlour was a rendezvous for all of his description. Whenever any puzzle came in the way of any of us, we went to Mr Watt. He needed only to be prompted; everything became to him the beginning of a new and serious study; and we knew that he would not quit it till he had either discovered its insignificance, or had made something of it".56

Watt was not exceptional. In Sheffield, Benjamin Huntsman was also a scientific instrument maker. Sheffield plating was accidentally discovered in 1743 by a Sheffield cutler, Thomas Boulsover, while repairing a customer’s knife. Henry Bessemer received only elementary schooling, preferring to gain practical experience in his father’s type foundry. When Bessemer was invited to describe his steel process to the British Association, he protested that he had ‘never written or read a paper to a learned society’.57 Stainless steel was developed in Sheffield in 1913, not in the University but in the research laboratory of the steel firms Firth and Brown, by Harry Brearley, a self-taught metallurgist who had never received any formal education.58 Indeed, Brearley was scornful of university-based research, describing himself as a breaker of idols and a scorner of cherished reputations, unpopular with writers of textbooks, such as Dr John Arnold of Sheffield University whose work, according to Brearley, was full of mistakes. When setting up the research laboratory at Firth Brown, Brearley was insistent that continuing close contact with the workmen was key to success. A successful research laboratory should be a clearing house for difficulty, and this, according to Brearley, required close contact with factory life. Scholarly literature was for Brearley beyond redemption, undermined by dogma and authority, the twin evils of scientific progress, and full of disagreements between rival professors about issues which could be resolved by simple observation.

One of the great challenges which digital technologies present us with is the need also to develop spaces which allow theory, making and tinkering to collide – a digital equivalent of Watt’s workshop at Glasgow or Brearley’s clearing house for difficulty in Sheffield. Ideally, this would be precisely what a digital humanities centre should be like, but sadly we have rarely achieved this. The pressure of university funding structures means that most digital humanities centres are soft-funded and are on a treadmill of project funding which restricts their ability to act as centres for innovative thinking. Moreover, in Britain at least, universities are increasingly making a stronger distinction between academic and professional staff. This is without doubt a retrograde development, but the political and administrative drivers behind it are formidable. Places such as the Pervasive Media Studio at the Watershed Arts Centre in Bristol give a good idea of what can be achieved, but it is perhaps telling that this is part of an arts centre, and not based on a university campus. Perhaps we should question whether digital humanities, which seeks to build new forms of collaboration and network, should be housed in such formal structures as centres or academic departments at all. It may be that what we should seek to do is to establish networks of like-minded people rather than centres. Mark Sample has suggested “We belong on the margin – not because we've been pushed there, but because that's where the edge is. And when the center expands to swallow the periphery – not in the name of exclusivity, but in the name of incorporation and assimilation – we need to push ourselves further away”.59

The academic margin was precisely where pioneers like James Watt and Harry Brearley belonged, and is perhaps the territory we should be seeking out. In their remarkable book, Divining a Digital Future: Mess and Mythology in Ubiquitous Computing, Paul Dourish and Genevieve Bell have emphasised the importance of the shed as a liminal space within the home which is nevertheless fundamental to the way in which technology is deployed domestically – “spaces through which technologies move both into and, more commonly, out of domestic space”. The shed is an area of technological experiment, and a shrine to the kind of tinkering that was significant in the Industrial Revolution. Perhaps what we need is not digital humanities centres but digital humanities sheds.

6. The New Industrial Revolution?

At the end of his life, James Watt was preoccupied with attempting to create a sculpture copying machine. At the time of his death, his workshop in Birmingham, a legendary ‘magical retreat’, contained a number of plaster cast moulds connected with Watt’s work on sculpture copying. In 1924, the contents of Watt’s workshop were transferred to the Science Museum in London. In 2011, in connection with preparations for an exhibition at the Science Museum, a team led by Professor Stuart Robson of UCL was asked to undertake some work with one of the plaster cast moulds. The mould was scanned in 3D by Professor Robson and his team and the scan was afterwards 3D-printed at the Bartlett School of Architecture. The resulting object, now displayed in the Science Museum, turned out to be a previously unknown bust of Watt himself.60

This experiment represents a fitting closing of the circle, with Watt’s own experiments directly contributing to the development of new forms of manufacturing. The idea that 3D printing will represent ‘a new industrial revolution’ has become commonplace. However, this is not necessarily the only parallel that can be drawn. David Gauntlett has emphasised the way in which the new maker culture also echoes the ideals of William Morris and the Arts and Crafts movement, which were a reaction against the effects of industrialisation.61 It is also striking how quickly the discussion about 3D printing has moved on from the potentially disruptive effects of being able to print car parts, tools or even guns, towards completely new possibilities, such as the printing of body parts and now even new body organs.62

The emergence of this new digital materiality might be taken as a good example of the unpredictable twists and turns of our digital engagement. Just at the point where our rhetoric of the digital sublime was stressing the evanescence and immateriality of data, data suddenly becomes very real: it can be printed and turned into data art. Yet perhaps this latest shift suggests that our disruptive mindset in approaching technology, looking for the next revolution and transformation, is an inappropriate approach. The social, economic, cultural and technological changes from the eighteenth century moved humanity into a new type of existence characterised by a continuum of change and development. The development of digital technology was not a revolution but a further manifestation of the mighty changes which began in the eighteenth century. We are still caught up every day in the throes of the Industrial Revolution.