Past Futures: The Impossible Necessity of History
ISBN 9781442620971


English, 326 pages, 2004



PAST FUTURES: THE IMPOSSIBLE NECESSITY OF HISTORY

By nature, human beings seek to make sense of their past. Paradoxically, true historical explanation is ultimately impossible. Historians never have complete evidence from the past, nor is their methodology rigorous enough to prove causal links. Although it cannot be proven that ‘A caused B,’ by redefining the agenda of historical discourse, scholars can locate events in time and place history once again at the heart of intellectual activity.

In Past Futures, Ged Martin advocates examining the decisions that people make, most of which are not the result of a ‘process,’ but are reached intuitively. Subsequent rationalizations that constitute historical evidence simply mislead. All historians can do is to locate these decisions in time and explain not why they were made, but why then? To illustrate, Martin asks a number of questions: What is a ‘long time’ in history? Are we close to the past or remote from it? Is democracy a recent experiment, or proof of our arrival at the end of a journey through time? Can we engage in a historical dialogue with the past without making clear our own ethical standpoints?

Although explanation is ultimately impossible, humankind can make sense of its location in time through the concept of ‘significance,’ a device for highlighting events and aspects of the past. Through his queries and discussions, Martin is suggesting a radical new approach to historical discourse.

(The Joanne Goodman Lectures)

GED MARTIN formerly held the chair of Canadian Studies at the University of Edinburgh.


PAST FUTURES
The Impossible Necessity of History

Based on the 1996 Joanne Goodman Lectures
GED MARTIN

UNIVERSITY OF TORONTO PRESS
Toronto Buffalo London

© University of Toronto Press Incorporated 2004
Toronto Buffalo London
Printed in Canada

ISBN 0-8020-8979-8 (cloth)
ISBN 0-8020-8645-4 (paper)

Printed on acid-free paper

National Library of Canada Cataloguing in Publication

Martin, Ged
Past futures : the impossible necessity of history / Ged Martin.
‘Based on the 1996 Joanne Goodman lectures.’
Includes bibliographical references and index.
ISBN 0-8020-8979-8 (bound)
ISBN 0-8020-8645-4 (pbk.)
1. History – Philosophy. 2. History – Methodology. I. Title.
II. Series: Joanne Goodman lectures ; 1996.
D16.M365 2004    901    C2003-906043-8

University of Toronto Press acknowledges the financial assistance to its publishing program of the Canada Council for the Arts and the Ontario Arts Council. University of Toronto Press acknowledges the financial support for its publishing activities of the Government of Canada through the Book Publishing Industry Development Program (BPIDP).

The Joanne Goodman Lecture Series has been established by Joanne’s family and friends to perpetuate the memory of her blithe spirit, her quest for knowledge, and the rewarding years she spent at the University of Western Ontario.


Contents

Acknowledgments ix

1 Redefining History at the Centre of Debate 3
2 History versus the Past 11
3 The Impossibility of Explanation 33
4 The Moment of Decision 77
5 Past Futures 109
6 A Long Time in History 149
7 Significance 187
8 Objections, Review, and Tailpiece 243

Notes 263
Index 295


Acknowledgments

The invitation to deliver the 1996 Joanne Goodman Lectures at the University of Western Ontario was a great honour. I wish to thank in particular the Honourable Eddie Goodman OC who established the lecture series in memory of his daughter, who died in tragic circumstances while studying at Western in 1975. I thank Mr Goodman, his family and friends, for the warmth of their welcome. As a commemoration should be, the 1996 Joanne Goodman Lectures were a very happy event. I am also doubly indebted to Dr Neville Thompson and his colleagues in the Department of History, first for their efficient organization and secondly, for their patience. My distinguished forerunners in the series had addressed specific, and frequently very large, issues in their lectures. Boldly, perhaps recklessly, I undertook to talk about History itself. It soon became evident that a version revised for publication would exceed the three-lecture format of the annual Joanne Goodman Lectures. Much has happened in both personal and social time since my visit to the University of Western Ontario. I am especially grateful to Neville Thompson for his quiet confidence that the project would reach fruition. Among friends at Western, I thank in particular Barbara and Leslie Murison, Colin Read, and Gail Thompson. A comprehensive list of acknowledgments would include practically everybody who has helped me to comprehend the historical process, ever since Alan Pender and Ian Webber first encouraged me to reflect upon the relationships of space and time. I wish I could name them all, but I shall try to thank as many as possible in private. For the

past two decades, my main academic endeavour has been within Canadian Studies, where I have heard colleagues who were not historians expound how their disciplines could help throw light upon the Canadian experience. Fascinated and often overawed, I felt challenged to think afresh about my own subject, and I am grateful for that stimulus. Ann Barry, John Laband, Jodie Robson, Ruth Sandwell, and Jim Sturgis made helpful comments on my drafts. As always, Colin Coates and Tony Cohen provided collegial encouragement. Geoffrey Roberts was generous and knowledgeable in his advice. I owe a special debt of thanks to Lennart Husband, Frances Mundy, John St James, and their colleagues at the University of Toronto Press for their encouragement and support. It should be obvious that Past Futures represents the reflections of a jobbing historian upon a worthwhile craft. The examples are drawn from areas of history that happen to interest me (and in which I am not necessarily an ‘expert’). Those who delve into the Notes will find that I have cited many paperback or reprint editions of historical works, simply because these are on my own bookshelves. This reflects my belief that we need a historical methodology that equips ordinary educated people to make sense of the continuum of past, present, and – dare we hope? – future in the world around us. I have touched upon some more fundamental discussions of the nature of History, but this book makes no overall claim to reflect those profound debates. Indeed, it was news to me when a friendly critic told me that my approach is post-structuralist. No doubt there is much that I should have read, and much that I have read that I should have better comprehended – but I doubt that a single working lifetime could ever accommodate such an exhaustive study.
Meanwhile, although I contend that historical explanation is impossible, I am in no doubt that History itself is more necessary than ever, and that historians must find ways of overcoming that paradox. The Joanne Goodman Lectures commemorate an undergraduate who enjoyed her studies. It may be heresy in an era that stresses scholarly productivity, but I believe that, in the last resort, it matters little to the thousands of students who enrol in History courses that their professors have written monographs that change two lines in a standard textbook. What matters is that we, the historians, should find ways to help our students to think intelligently about the world in

which they live. History has never been more necessary. It is up to historians to attempt the paradigm shift that will once again place our craft where it belongs, at the heart of all intellectual discourse.

Ged Martin
Shanacoole, County Waterford
May 2002


1 Redefining History at the Centre of Debate


It is not that historians ask the wrong questions, but rather that we have yet to pose all the right ones. Historical explanation, as we usually practise it, is ultimately impossible, although some approaches are more fallacious than others. Unfortunately, historical explanation is also a social necessity, since in its widest form it is an essential part of everybody’s daily life. Most of us have to take decisions, constantly to adopt strategies that respond to the world around us. So basic is this process that it hardly ever occurs to us to focus on the nature of a decision itself. It is curious that this reluctance should be so marked among historians, since it is a basic function of the craft to examine decisions and locate them in time, another quantity which they only vaguely define. Whether those decisions relate to getting married, taking a job, or moving house, they cannot simply be taken in a frozen present. Rather the decision to take a decision requires us to assess how we have reached the position that obliges us to choose, and so we are all constantly engaged in a historical exercise that underlines the need for sound analytical method. Consequently, some degree of historical education is a basic requirement for citizenship in a participatory democracy. History is no mere antiquarian luxury, for we are ourselves caught up in the march of events, and must make sense of our own responses and of the actions of others if we are to cope with the world around us. Chapters 2 and 3 argue that we can never recover the complete evidence needed to account for any event in the past and that, even if we could, our methods of explanation, of building upon such evidence, are equally unsatisfactory. Historians love paradoxes, but there is something daunting in the paradox that our craft is both socially vital and professionally impossible. Is there a way out of the paradox? The present work argues that historians must expand and recast the questions that we ask. 
Most people would accept that a historian is somebody who seeks to explain the past, someone who asks, ‘Why?’ In order to begin to comprehend the past, we need to refine that question, to ensure that ‘why?’ is always posed in the form ‘why when?’, so that it is firmly tied to the more basic task of locating events in time. To ask ‘why when?’ we must also clarify ‘why what?’ and ‘why who?’ Framing our questions in this way will enable us to focus more clearly on those most fundamental building blocks of the historical edifice, the

decisions taken by people themselves, either as individuals or as part of social and national collectivities. Some will object that such an approach to historical explanation privileges the powerful and the prosperous over the poor and oppressed, and it is undeniable that a far wider range of choice is usually available to the rich than to the poor. None the less, there are important advantages in basing attempts at historical understanding upon the study of the decisions that people take, whether as individuals or in groups. One is simply that to adopt any other approach places us at risk of entrapment in some imposing form of words embodying the determinist fallacy that decisions impose themselves on people through the autonomous rampaging of some theory or abstract process. The other is that the more we attempt to isolate and explain human decision-making, the more it appears to be both highly simple and frighteningly ephemeral. The standard method used by historians to account for a decision is to deduce its explanation from the evidence supplied by the people who took it. Chapter 4 contends that, for most people, the act of decision is so rapid and intuitive that they may be incapable of identifying even the moment at which it happened. It follows, and the implications for accepted historical method are alarming, that those who take decisions may defend them but they cannot necessarily explain them. Their ‘evidence’ gives us not the reasons that led to the action, but the rationalizations subsequently cobbled together in its defence. Even if those decisions are based on some form of rational choice between available options (and available options are rarely perfect), we are still left with an insoluble riddle. There are usually arguments both for and against any action. We have no way of understanding why one course is perceived to be more rational.
The study of any election is enough to demonstrate that highly intelligent people can disagree totally about the best means of advancing both the common welfare and private interest. If we focus upon the decisions taken by individuals, we have at least a chance of assembling a robust account of the events that may flow from them. Although explanations that are 100 per cent satisfactory will still elude us, we can begin to locate events in time, to identify some of the interconnections within the past, and so help to situate ourselves on that moving line called the present that so tenuously divides our past from our future. Two questions help to locate any

decision in time. The first, ‘why when?’, asks why a decision was taken at a particular moment. Most of us are adept at procrastination: we do not tackle issues until the need to respond has become overwhelming. Others enthusiastically embrace the opportunity of taking decisions. (Some of these people become chief executives of organizations, where they sometimes demonstrate that their readiness to take decisions is not always matched by their ability to discern the best solution.) A few, who have a disproportionate drive to seek political power, are cursed with a desire to make decisions on matters which hindsight may conclude they would better have left alone. The second question, ‘why what?’, seeks to understand the options seen to be available to the decision-makers at that ‘why when?’ moment in time. Such options were normally circumscribed both by practical limitations and by the confines of intellectual debate at the time, so that the solution that may seem obvious in hindsight simply did not make it to the historical short list. Thus far, many historians may wonder what the fuss is about. Like the character in Molière who was surprised to discover that he had been talking prose all his life, they will happily agree that much of our work incidentally sets out to locate past decisions in time as a step towards their partial explanation. However, a methodology that concerns itself solely with events at a frozen moment in the past takes us only half-way towards a mature historical comprehension. Such an approach is effective enough at examining an episode that happened in, say, 1850 and sketching how it seems to have been influenced by events that had occurred in 1830, or indeed earlier. But that is only one side of the coin.
Anyone who takes a decision to get married, to take a job, to buy a house is also making certain assumptions, even if intuitively, about the future: that the relationship will last, the employer will not go bankrupt, the housing market will not collapse. The inconvenient problem that the future is unknowable does not excuse historians from exploring the influence of perceived or imagined past futures upon the shaping of the decisions that constitute the building blocks of history. We can at least attempt to classify the attitudes to the future held by the people whom we study: did they look to the short term or the long term, were their perceptions optimistic or pessimistic?

Thus, a reconstruction of an event in 1850 purely in terms of what had happened before 1850 falls short of locating it in the full roundness of time. Historians generally have little to say about these past futures, and this widespread failure to explore a vital dimension of decision-taking is a major area of inadequacy in contemporary historical writing. Thus chapter 5, which discusses how we might categorize past perceptions of the future, represents a fundamental challenge to the way in which many historians carry on their work. Chapter 6 is a more speculative extension of the discussion to an attempt to locate ourselves in time. This is a far more difficult task, since our constantly changing present faces its own future, one (perhaps more than one) of guaranteed, multiple, and frightening change. As a result, some dismiss history, insisting that the past now has no causal or structural relationship to the inherently provisional present in which we live our daily existence. They are wrong. It is the present that has no independent existence, and to deny its relationship with the past leaves us not only bereft of a sense of our location in time but dangerously exposed to the delusion – doubly absurd in a world of change – that our Western civilization is entitled to discount its own future because we represent a logical end point in the human journey through history. To recognize the transience of our present is to underline yet further the impossibility of explanation in history, since all historical reconstruction and analysis take the form of a dialogue between present and past, in which we attempt to reassess their actions and beliefs in the light of our priorities and values.
Chapter 6 suggests that while historians generally profess to recognize the dangers of imposing modern values upon past experiences, most of us are more subtly influenced than we appreciate by a sense of the ethical superiority of our own times, even if this takes the paradoxical form of tolerant neutrality towards the values of those whom we regard as less enlightened. It is only by stepping back from our own values that we can ask why we should assume that our Western, free-enterprise, electoral democracies represent any form of normality. Are they really the logical product of our past? If so, does this entitle them to be assumed to possess either practical durability or moral entitlement to survive? Implicit in this enquiry is a very simple question: what is a ‘long time’ in history? Historians have little to say about the subjective quality of time. Indeed,

the more that professional specialization drives scholars to study narrower issues within shorter periods, the greater is the risk of blocking off earlier centuries as belonging to a world too distant from our own to be of significance. The neglected minefield of the notion of significance is explored in chapter 7. The term is used as a means of locating events in time as a step, not to asking whether we find the past relevant or convenient, but to seeking some broader illumination of our own relationship with it. A significant event, then, must be distinguished from an episode that is merely dramatic or portentous. It is argued that significance itself may be assessed in two ways. Historical events are significant in relation to one another. Just as the essence of music is to be found in the silences between sounds, so the significance of history may be identified in the intervals between events. Equally, what did indeed happen is significant in relation to what might have happened: we can only fully digest the importance of an event by taking account of the possibility that it could have taken place in some other way, or that it might simply never have occurred at all. This is not an excuse for indulgence in counter-factual fantasy, for might-have-beens are even harder to capture than vanished past futures, and there would be far less point in any such exercise. Rather, we must employ an explicit methodology that enables us to identify and elucidate the landmarks and lighthouses of the past if we are to steer history between the delusion of explanation and the monotony of narrative. Cumulatively, the arguments that follow amount to a major challenge to the way in which historians go about their business.
It is important to make clear that the demand for a revision is directed not against such accepted professional practices as archival research and precision in footnoting, but rather towards the interpretation that is made of the scholarly material thus gleaned and presented. At the very least, there is an obligation to make clear that all attempts at explanation are provisional and insecure, that since historians are more often reliant upon inference than certainty, there must be ‘an austere reserve never to pass off one as the other.’1 That much will be uncontroversial. If historians are sometimes tempted to pronounce where they should only suggest, the fundamental inadequacy of the craft lies in the shortcoming that it simply does not go far enough in the more important

direction of locating both past and present in the longer sweep of time. The potential for asserting the intellectual centrality of the discipline is enormous. Every branch of the social sciences and humanities is implicitly historical if only because all of them deal with human experience and its expression within a framework of time. Policy-making in government and in business is essentially an exercise linking past inheritance with future possibility through the fleeting hinge of the present. A historical methodology that focuses upon the nature of decisions and their location in time has the potential to inform and enrich such processes. A methodology that concerns itself with the conundrum of significance has the capacity to make our craft more intellectually democratic, by widening historical debate among the educated and articulate. These are attractive prizes for an increasingly marginalized academic craft. They may be achieved more easily than we think. It is not necessary to be dull to write about history, as Carr and Fischer and Hexter have shown, most notably when they have shouldered the stern duty of pillorying the follies of fellow practitioners.2 None the less, when confronted with the prospect of climbing the North Face of Arnold Toynbee’s twelve-volume Study of History, most working historians prefer to go for a stroll in the archival garden, confident of being able to gather enough pretty blooms for the publishable nosegay of a journal article. The formidable philosopher R.G.
Collingwood insisted that anybody writing history ought to give ‘special attention to the problems of historical thought,’ although he patronizingly added that it was ‘possible to be a quite good historian (though not a historian of the highest rank) without thus reflecting upon one’s own historical thinking.’ Perhaps confusingly, he also pronounced that ‘experience comes first, and reflection upon that experience second.’3 We might turn Collingwood inside out and suggest that it is virtually impossible to write history without subliminally encountering and subconsciously resolving many of the fundamental issues of the craft. The present study represents one historian’s attempt to systematize the assumptions of a working lifetime, in the hope of persuading other practitioners that our intellectual role may be more easily reshaped and extended than we might have thought. It is based not upon historiographical and philosophical theorizing, but is rather the reflection of

the research interests of a jobbing scholar. Since this book has evolved from lectures delivered in Canada by a historian with a special interest in the British empire, many of the episodes discussed are taken from Canadian history. Other examples are also cited, in the hope that there is still sufficient shared knowledge of events such as the two world wars of the twentieth century to render possible some degree of informed historical debate among a wider reading public, thus transcending the narrowness that increasingly characterizes the history profession. More and more, undergraduates study the past only through semesterized glimpses of unrelated gobbets, while young scholars can only hope to make their professional mark by concentrating upon constricted topics within narrowly confined blocks of time. If starting his career nowadays, the Edwardian don who apologized that he could not answer a question about the battle of Waterloo because he ‘only went down to 1485’ would probably be counselled to begin his specialization at 1450.4 Journals proliferate and grow fatter; abstruse (and expensive) monographs pour forth, blink feebly in the face of minutely intolerant reviewers, and are soon remaindered by their publishers. The result of this well-footnoted cacophony is that it is difficult for practitioners to keep abreast of published work in their own sub-fields of research, let alone take account, as was once the case, of interesting and path-breaking work on other countries or centuries or problems. Perhaps the hit-and-run raids of this study upon the specialized subjects of other scholars will do no more than demonstrate the folly of rising to the challenge. The only defence that can be offered is that the aim of the exercise is not merely to halt the marginalization of history but to replace our craft in its rightful position at the centre of all studies of the world around us.


2 History versus the Past


One crucial distinction must be emphasized at the outset. There is an entire difference between ‘history’ and ‘the past.’ The past comprises the totality of the human experience, every step taken behind every plough by every peasant who has ever lived, every credit-card bill or tax return, every human action whether inspired by heroism or spite, every meal, every act of copulation. History, by contrast, is our selective attempt to make some sense of at least one corner of that enormous past, the process by which we put some of its chaos into order and seek to understand how and why it happened the way it did. This distinction immediately points to several reasons for the ultimate futility of historical explanation. Of these, the central and most ineluctable is that, thanks to the selective or accidental survival of evidence, we can never have more than a partial knowledge of everything that happened in the past. Another is that historical analysis is based upon a one-way dialogue between the present and the past, in which we seek to interpret the past through the distorting filter of our own values and concerns. From this judgmental standpoint, it is too easy to assume that the past was merely travelling, while we, for our part, have arrived. Thus, we fall into the trap of elevating our transient present into a triumphant end product, almost a preordained destiny. One aspect of this sense of the ethical superiority of our own time can be seen in the pervasive popular belief that ‘history’ passes judgments on those who have acted in a pageant of time. There was a time when respectable historians also subscribed to this view of their functions. Medieval chroniclers not only took for granted that England’s bad King John, the legendary foe of Robin Hood, had gone to eternal damnation, but felt some sympathy for the occupants of Hell for having to endure his foul presence. This ‘terrible verdict,’ pronounced J.R. 
Green in 1874, has ‘passed into the sober judgement of history.’ By passing themselves off as mediums in communion with the past, some nineteenth-century historians claimed immense authority for their pronouncements. The French scholar Fustel de Coulanges once told an audience that they were listening not to him, but to the voice of history itself.1 Modern scholars have abandoned the practice of consigning their subjects to places either in history or indeed in Hell, although Chapter 6 argues that bloodless neutrality has been taken too far in historical narrative. Historians should not pretend that past actions were

invariably value-free, either in the eyes of contemporaries or in the way in which they are interpreted today. (Indeed, it is tempting to suggest that politicians might look more favourably upon funding our research if we still proclaimed ourselves as the arbiters of their desire for ‘a place in history.’) Historical immortality is a notion that obsesses tyrants and dazzles even elected leaders. Hitler consoled himself that he would be acquitted by ‘the goddess of the eternal court of history’ when the Weimar republic locked him up in 1923. After the Conservative landslide in the Canadian general election of 1958, American Vice-President Richard Nixon assured the victorious John Diefenbaker that ‘history will record that you are one of the truly great political campaigners of our time.’2 Historians, on the other hand, have generally felt obliged to point out that Diefenbaker’s remarkable talents on the stump were not matched by comparable political skills in the exercise and retention of power. Moreover, Nixon’s belief that history was a process of recording and passing judgment backfired badly upon him when he taped his own attempts to cover up the Watergate scandal. In short, any idea that history operates independently of the fallible assessments of individual historians is a notion best left to the cartoons of Doonesbury. Socially necessary, but intellectually impossible – that is the paradox of historical explanation. There are two principal reasons why historians can never fully explain the past. Chapter 3 argues that our methodologies are defective. Yet even if they were watertight, the plain fact is that in most cases we simply lack sufficient evidence to enable us to provide a complete description of past events, let alone explain what happened. Perhaps one day historians will have access to everybody’s credit-card bills and maybe to their tax returns as well.
Even so, it is likely that more archival evidence will survive for heroism than for fornication, and what can be recovered about the latter is probably incomplete and misleadingly sensational. For instance, the records of church courts from seventeenth-century Somerset reveal that heterosexual intercourse was frequently conducted in an upright posture with the female partner propped up against a tree. Does this prove that sex in Stuart England was predominantly a vertical experience?

Probably not: the ecclesiastical courts concerned themselves with extramarital encounters, which were likely to be urgent and passionate. Furthermore, inconvenient witnesses turned up in woodland more often than in bedrooms or haylofts. Some offenders even managed to draw attention to their own fornication. A couple who met at a fair near Taunton in 1633 decided that same night to surrender to their mutual desire. Unluckily the tree against which they chose to express themselves proved to be a much less firmly secured fairground pole. Worse still, it was topped with a bell whose rhythmic clanging aroused the neighbourhood.3 We shall never know the full story of the past, and should never forget that the sensational is more likely to be recorded than the humdrum. Yet this is not to say that the historian must always know less about the past than the contemporary participant.4 In 1849, the towering British politician W.E. Gladstone undertook a campaign in the British parliament in support of Canadian Tories, who complained that they were being taxed to compensate the rebels of 1837–8 for the consequences of their own follies. Unluckily for Gladstone, the Montreal mob displayed its fidelity to the British constitution by burning down the provincial parliament and stoning the Queen’s representative. As a result, he was easily out-manoeuvred in the House of Commons and heavily defeated. The episode merits study, since the British parliament was never again asked to set limits to Canadian self-government.
The Rebellion Losses affair of 1849 is one case where historians have more knowledge of what was going on than was available to any one individual involved at the time, simply because almost all the major players left detailed and accessible personal papers, through which we can trace the evolution of the parliamentary ambush which humiliated the unwitting Gladstone.5 Indeed, simply because contemporary witnesses were not only participants but also partisans, they may prove singularly ill-equipped to appreciate what was happening around them. Looking back over thirty years at the heart of Australian politics, the thoughtful Melbourne intellectual Alfred Deakin reflected in 1913 that he had ‘often been amused by the ... amazing difference between the newspaper stories of the “why,” the “how,” and the “by whom” and the facts as I have known them at first hand.’ Journalists, we should recall, tend to rely upon the briefings they receive from politicians. As Deakin had
written four years earlier, ‘for me the effect of my life experience is to discredit most of the personal estimates of history and many of its interpretations.’6 Sometimes, a historian may mislead, even if inadvertently, by offering an account that seems so beguilingly complete that readers may not suspect that it is incomplete. Theodore H. White’s account of the American presidential election of 1960 remains an outstanding example of the combination of political analysis with ‘fly-on-the-wall’ journalism, so that we feel both that we know what happened and what it was like to be in the victorious Kennedy camp at the moment on election night when Jacqueline whispered to her husband, ‘Oh, Bunny, you’re President now!’ It would be censorious to condemn White, Washington insider though he was, for omitting to tell us that Mrs Kennedy shared her marital intimacy with a whole lot of other women: in the nineteen-sixties, personal lives remained private, and accordingly did not impinge upon the election. A more serious omission related to White’s central thesis, his patriotic invocation of the way in which power passed to a new president through the silent process of millions of citizens putting their votes into thousands of ballot boxes, an inspiring portrait of a mighty democracy holding its breath. Almost sixty-nine million Americans voted and, to add to the drama, Kennedy won by just 112,881 votes. White related the pattern of voting, by region, by religion, by race, to the formal outcome through the Electoral College, that product of the ‘wisdom of the Constitutional Fathers ... which, while preserving free citizen choice, prevents it from degenerating into the violence that can accompany the narrow act of head-counting.’ The praise was overdone, for the Electoral College came within a whisker of misinterpreting the will of the people.
‘If only 4,500 voters in Illinois and 28,000 voters in Texas had changed their minds, the sum of their 32,500 votes would have moved both those states ... into the Nixon column,’ enough to deprive Kennedy of his victory. All this is documented in detail – but at no point in his account did White mention that suspicions hung over the results in both those states. ‘No one will ever know precisely who carried the majority of 1960,’ he acknowledged fourteen years later, ‘for on that night political thieves and vote-stealers were counterfeiting results all across the nation.’ The Making of the President 1960 remains an impressive book, but the immediacy of its narrative and the apparent cogency of its explanation combine to give it a misleading authority, disguising the omission of the key point that it was not only the voters who were filling the ballot boxes on that November day in 1960.7 Occasionally, the omission of a key piece of information leaves an obvious hole in the resulting attempt at explanation. One example of a story that has been vividly rewritten is that of the Battle of the Atlantic in the Second World War. From 1940, when the German navy gained bases on the west coast of France, through to the end of 1942, U-boats scythed through Allied merchant shipping on the North Atlantic: one-quarter of the entire British merchant fleet was destroyed in 1940 alone. The raw statistics of the carnage, rising above 700,000 tons displacement sent to the bottom in the single month of November 1942, hardly capture the human tragedy of thousands of brave men drowning in freezing seas. More to the point, if the submarines had not been checked, the invasion of Normandy would have been impossible and Britain might well have been starved out of the war. One survivor of the Battle of the Atlantic – and it was fortunate for Canada’s war effort that he did survive – was Mackenzie King’s minister of munitions and supply, C.D. Howe, who was sailing to Britain in December 1940 on a ship that was torpedoed three hundred miles off Iceland. As sometimes happened, the U-boat commander decided to surface to check on his kill. As the dark conning tower broke through the icy waters close to his lifeboat, Howe was seen to clench his fists in anger, as if about to attack the submarine with his bare hands. For the next two years, it seemed that the Allies could find no more effective counter-measures than C.D. Howe’s instinctive response. Then the course of the battle began to turn in their favour.
U-boats were tracked down and destroyed until, although they were never entirely eliminated, they ceased to threaten the transatlantic lifeline. Thus, when Peter Calvocoressi published the first edition of his magisterial history of the Second World War in 1972, it was obvious that he would devote attention to the struggle to overcome the U-boat menace. His account was meticulous and gripping: a friendly critic called it ‘splendid’ – but added that it made no sense. For two years, the submarines had caused havoc. Then, suddenly it seemed, they were bombed and depth-charged out of the story. How could this have
come about? As it happened, Calvocoressi knew more than he could reveal. The British had broken the German codes, and could track and so destroy the U-boats. Calvocoressi himself had served at Bletchley Park, the celebrated intelligence unit where the decryption took place. By the nineteen-sixties, Bletchley itself was the subject of many legends. Many distinguished post-war academics were known to have worked there, and stories circulated about the quirks of prominent personalities. But in 1972, what actually went on at Bletchley remained a state secret – so much so that, it must be confessed, a younger generation of academics sometimes treated the war service of its seniors with affectionate disrespect.8 Given the number of people involved, and the notorious tendency to indiscretion in the academic profession, it is astonishing that the secret was kept for so long. The sceptical may object that the example proves precisely the reverse: the pertinent evidence did come to light and, as a result, we now know how the U-boats were beaten in the Battle of the Atlantic. Well, maybe. We still do not know, for instance, how it was that the Germans, who had invented perhaps the most sophisticated system of code ever known, failed to update and protect it even after it became apparent that the Allies knew more about their submarine movements than could ever have been derived from spotter planes. The Battle of the Atlantic, like the ocean itself, keeps its mysteries. More generally, it seems reasonable to conclude that, in most episodes, the ‘real’ story is known, and remains confined, to far fewer participants – and even they probably saw events through a glass, darkly. The survival of so much archival evidence for mid-nineteenth-century British politics owes much to the existence of a confident elite who had no doubt that they were destined to rule and that their careers would be of interest to posterity.
Preserving private documentation was not even a weapon in the political struggle: in 1846, when a flustered Disraeli denied ever having sought office from the Conservative ministry, Peel could have destroyed him by flourishing a begging letter that some said he was actually carrying in his pocket. He chose not to bother. Some later politicians were less squeamish, acting upon the second part of John A. Macdonald’s maxim: ‘Never write a letter if you can help it, and never destroy one.’ Mackenzie King prevailed upon J.L. Ralston, Canada’s army minister, not to leave the government over conscription
in 1942. However, King kept the resignation letter. Two years later, when Ralston confronted him in cabinet over the same issue, King dramatically tabled it and ruthlessly fired his challenger.9 The formation of archives was helped by the fact that Britain’s rulers were rich and owned houses large enough to store their correspondence: of mid-nineteenth-century governors general of Canada, we know far more about Lord Elgin and Lord Monck, who owned country mansions, than we do about Sir Edmund Head, who did not. However, luckily for historians, landowners were not the only people who kept personal papers. If it sometimes seems that the political life of Canada has always been dominated by lawyers, it is partly because lawyers are trained to file documents: Canada’s first prime minister, Sir John A. Macdonald, left over five hundred volumes of papers, while the private archive of the fourth, Sir John Thompson, filled thirty trunks. A third category of archive creators consists of the self-important. Like most New Zealanders, Walter Nash dwelt not in a stately home but in a suburban bungalow. Even when he was prime minister, Nash travelled to work on a Wellington commuter train (just as Wilfrid Laurier took the tram in Ottawa). In describing him as ‘a sincere and Christian gentleman’ and also ‘a bore,’ the patrician Harold Macmillan captured Nash’s belief in himself as New Zealand’s man of destiny. After his death, the Nash family bungalow and its adjoining garage disgorged private papers sufficient to fill two hundred metres of shelf space in the country’s National Archives.10 It can be confidently predicted that Nash will continue to loom large in the writing of New Zealand political history. Accident plays a large role in determining what survives and what is lost.
The personal papers of George Brown of the Toronto Globe were shipped back to Scotland when his widow left Canada, to be found stored in a trunk in a house in the Highlands seventy years later. The Canadian Tory politician Sir Allan MacNab also dispatched his personal archive to Britain, but the papers were lost in a shipwreck in 1861. The papers of Sir Frederic Rogers, the senior civil servant at the Colonial Office at the time of Canadian Confederation, were stolen from a train on Britain’s South Western Railway. Some collections are reprieved from intended destruction. Mackenzie King left instructions in his will that his extensive and revealing diaries should be destroyed.
However, soon after his death, reports of King’s interest in spiritualism began to circulate, and his executors decided that compliance would have looked like a cover-up. The diaries, at least, survived beyond the grave. The only known copy of the 47-page memorandum of 1862 on the Intercolonial Railway written for the cabinet by the Colonial Secretary, the Duke of Newcastle, was snatched in disarray from a fire some years after his death. In supporting a Canadian request for British financial support, Newcastle had clashed with his old friend, W.E. Gladstone, who was by now Chancellor of the Exchequer and famous for resisting expenditure. It was Gladstone who oversaw the rescue operation, and he came close to destroying what was left of the duke’s papers, so great had been the ravages wrought by the flames and the fire hoses. It was only by the narrowest of chances that one of the most revealing sources for British official thinking about the future of Canada should have survived.11 If archives are one source for the past, memoirs are another. Yet here historians are at the mercy of the Ramses the Second factor. Ramses was an Egyptian Pharaoh who reigned three thousand years ago. Judged against the standard pharaonic mission statement, he was a great success. Ramses smote the Hittites, he ravaged the Beka’a valley, and he commemorated his achievements with the usual array of statues and obelisks. Indeed, his statues were the largest in ancient Egypt, while his commemorative obelisks sprouted in plantations of stone. Unfortunately, Ramses was not content with merely erecting his own monuments, but took a chisel to erase the names of his forebears from their obelisks and statues, appropriating their achievements wholesale for himself. Some writers have called him Ramses the Great, which suggests that his strategy worked.12 Autobiographers are prone to the same weakness.
Few people are moved to announce that their lives have been humdrum and without influence. Indeed, there is good reason for suspicion when autobiography cloaks itself in false humility, such as Bishop Hensley Henson’s two-volume Retrospect of an Unimportant Life, published in 1942 during a wartime paper shortage. It is rare but refreshing to encounter a writer like the Canadian journalist J.S. Willison, who frankly disclaimed any political influence, but believed that ‘a close, even if accidental, relation to great men or great events may give equal or better qualifications for
dispassionate dealing with the forces by which events are directed and controlled.’13 If every politician wrote an autobiography, and all gave equal space to every issue, historians might be able to filter out the inevitable element of bias. Unluckily, the political memoir is a relatively recent development in Canadian history. The country’s second prime minister, Alexander Mackenzie, was probably the first major public figure to consider an autobiography, but he decided that he was too old for the task. Sir John A. Macdonald seems to have been planning to devote the summer of 1891 to his memoirs, but he died just a few weeks too soon. The only autobiography from the Fathers of Confederation, Charles Tupper’s Recollections of Sixty Years, was published in 1914 when he was in his forgetful nineties, and is not remarkably indiscreet. (His personal papers are hardly more revealing: the witty and perceptive historian Peter Waite has described them as not so much laundered as starched.) John Hamilton Gray of New Brunswick told almost nothing of the inner history of the Charlottetown and Quebec conferences, and only completed the first of his projected two-volume history of Confederation. Macdonald’s secretary, Joseph Pope, who probably knew much of the inner story, contented himself with the publication of his employer’s archive of Confederation documents without any accompanying commentary.14 ‘What is not disclosed by contemporary writers will never be disclosed,’ wrote Willison, hence ‘history never can be a true record, and the exact relation of public men to the causes in which they are concerned never can be determined.’ Far too often, historians have seized uncritically upon whatever scraps they can find. The Ramses the Second of Canadian Confederation was an English railway entrepreneur, Edward Watkin, who wrote memoirs boastfully claiming a key role in the creation of the Dominion. Watkin was not a modest man.
He even began construction of a Watkin Tower in London, to rival Gustave Eiffel’s creation in Paris. Arthur Lower acknowledged that Watkin was ‘a central figure’ in Confederation, but rightly added that ‘it would hardly be possible for him to have been as central as he himself seemed to think he was.’15 Bereft of alternatives, most historians have refused to look a gift horse like Edward Watkin in the mouth merely because he prances a little too proudly.

The Ramses the Second factor reminds us that historians must not simply embrace with gratitude whatever survives from the past. Rather, the whole process of historical reconstruction is one of dialogue, between our questions and their answers. Sometimes that dialogue requires some sharp interrogation on our part, since there is often a mismatch between our concerns and their evidence. A recent comprehensive textbook of Canadian social history condemns earlier works for their ‘complete silence on homosexuality.’ It is an entirely reasonable complaint: modern understanding of sexuality indicates that there have been gays and lesbians in every age, and it would be a distortion of the past to ignore their existence. Unfortunately, in 1200 pages, the textbook makes only five passing references to homosexuals, for the simple reason that in the nineteenth century ‘the legal system forced them to keep their intimate lives secret,’ and hence ‘little of the history of gays and lesbians of this period has been recorded.’ Consequently, for all the authors’ worthy intentions, the main treatment of the subject is a two-page discussion linking gay liberation with the AIDS crisis, a focus that Canadian homosexuals might not regard as a balanced view of their total experience.16 A related problem, especially for those studying the activities of governments, is that while historians ask questions about ‘why?’, most official records were designed to supply answers about ‘how?’ Rigorously studied, even purely administrative records can be made to yield some remarkable findings. One recent example is the hypothesis, deduced from a study of the accounts of the royal household in fourteenth-century England, that Richard II invented the pocket handkerchief.
It is a cameo that helps us to understand why the king was unpopular with his less fastidious barons, although it does not necessarily tell us why they deposed him in 1399.17 Because history is a process by which the fleeting present interrogates the irrecoverable past, it is easy to overlook the fact that the past may not have contemplated our priorities at all. The documented history of Australia began with a decision by the British government in August 1786 to establish a convict settlement in New South Wales. Australians naturally wish to know more about the reasons for the founding of one of the world’s most democratic and prosperous countries. Their attitudes to Britain have always been more complex and
angst-ridden than those of English Canadians, for whom the proximity of the United States has provided a rival focus for their insecurities. In the first half of the twentieth century, Australia’s historians assumed that the British were simply seeking a dumping ground for their social refuse, just as they had squandered ANZAC soldiers at Gallipoli. The Second World War was an uncomfortable reminder of Australia’s proximity to Asia, and some scholars began to speculate that perhaps New South Wales had been intended as a naval base on a new trade route to China. Later, the geopolitics of the Cold War helped to popularize a theory that the motive for occupying Australia had been a desire to control its strategically useful raw materials.18 The scholarly controversy over the founding of Australia has relied heavily upon a single document, a letter from the Home Secretary, Lord Sydney, directing the Treasury to equip the expedition with, among other items, forty wheelbarrows and 2100 pairs of trousers. Much of the academic debate has been concerned not so much with understanding what Sydney wrote but rather with accounting for the far larger issues that he completely omitted. Intriguingly, his letter was accompanied by a more speculative document called ‘Heads of a Plan’ but, like so many people who send enclosures with their correspondence, Lord Sydney did not explain its provenance.19 ‘Heads of a Plan’ sounded like the project of an enthusiast: some historians wondered whether it formed part of the government scheme at all. One even pointed out that eighteenth-century maps also applied the term ‘New South Wales’ to the Ontario and Manitoba coast of Hudson Bay, a much more terrifying place of punishment, and mischievously suggested that perhaps a bureaucrat had confused the two.20 Naturally, among serious (not to mention nationalistic) historians, the suggestion that Australia might have been founded by mistake has found no favour.
The fundamental problem lies in the dialogue between the scholars and their source. Historians have asked ‘why?’ questions about a ‘how?’ document. Lord Sydney was issuing instructions to the clerks of the Treasury, and people who give orders do not need to explain the reasoning behind them. The unknown author of ‘Heads of a Plan’ certainly took an imaginative view of Australia’s potential, predicting for instance that the colony might supply Britain with tropical spices.
Unfortunately, twenty-first-century students seeking to trace the origins of Australian democracy will not derive much enlightenment from an anonymous vision of New South Wales as an eighteenth-century delicatessen. Sometimes even the apparent similarity between past priorities and present concerns can complicate our ability to interpret a document. Lord Durham’s Report on the Affairs of British North America was published in 1839, and its apparent continuing relevance to the defining issues of Canadian nationhood has encouraged publishers to reissue it, in both official languages, several times since. Lord Durham was an aristocratic liberal, who combined belief in the rights of the people with confidence that he knew what was best for them. He argued for some form of Canadian self-government, but also pronounced that the kindest policy for French Canadians would be to teach them to speak English: as late as 1948, a Quebec scholar warned that the Report was the secret blueprint for Ottawa’s plot to monopolize power.21 Because Durham had written about issues that continued to be central to the Canadian experience, his ideas seemed to remain relevant. Scholars speculated on Durham’s reactions to the challenges of their own times, suggesting that he would have opposed Irish Home Rule in and after 1886, backed the white settlers in Rhodesia in 1965, approved of Canadian independence, and would even have endorsed the Trudeau policy of bilingualism – all of them curious exercises in projection given that Durham had been dead since 1840.22 The bond that united Durham and those who studied his report, so it seemed, was that they were all enthusiastically concerned about the same issue: the future of Canada.
Yet even those who saw Durham as a hero were puzzled that his vision of a free Canada was remarkably limited (notwithstanding their confidence in his postulated, if posthumous, intellectual development). He wanted Britain to control Canadian tariffs and sell Canadian land. After his sweeping and scornful denunciations of the ineptitude of colonial rule, it was anticlimactic for him to conclude that no fundamental change was needed in the structure of government. The key to the mystery may be found in a misleading similarity between past priorities and present concerns. Historians have read Durham’s report purely as a document tackling the issue ‘What is the
future of Canada?’ In fact, Durham was probably responding to a much more important challenge: ‘What will I do if I become prime minister of Great Britain?’ His report was a coded manifesto addressed to the British and aimed at winning power at home. Like Churchill a century later, Durham was making it clear that he did not intend to preside over the dissolution of the British empire. The limits that he placed on colonial self-government were designed to reassure the emerging middle class that their surplus children could make homes on Canadian farms, and that no tariff wall would impede the sale of their manufactured goods to Canadian consumers. Equally, in rejecting structural changes to the creakingly defective Canadian constitution, Durham was signalling to more conservative elements in Britain that he had abandoned his earlier radicalism and had no intention of launching into a fresh round of parliamentary reform. Anyone who argued against remodelling the feeble and despised Legislative Councils of the North American colonies was presumably unlikely to challenge the equally absurd but far more powerful British House of Lords. The apparent community of interest between the author of the Durham Report and the Canadian historians who have studied his ideas has obscured the fundamental importance of its contemporary context: Lord Durham was a British politician addressing a British audience. It happened that he had undertaken a mission to rebellious Canada, but his core message might equally have formed part of a manifesto about Patagonia or Pluto.23 To understand the Durham Report, it is not so much historical dialogue that is needed as an element of literary deconstruction. Durham’s Report was written in robust and flowing English that has remained clear to his twentieth-century disciples.
Its very accessibility should trigger warning signals, for language can change in the meaning that it conveys, such that vocabulary which sounds comprehensible to us may in fact reflect very different underlying values. In some cases, the change in meaning is so recent and obvious that misunderstanding is unlikely. In 1840, the manager of an Australian sheep station noted that one of his stockmen had absconded, abandoning his distraught wife. ‘I suspect however that some gay Lothario about the farm is already consoling her for his loss.’ Obviously, this is not a reference to counselling sessions by an Italian homosexual. Equally,
when Canada’s Methodist leader, Egerton Ryerson, took his son to London’s Haymarket in 1866 ‘to know something of the gay world,’ he did not intend to encourage the lad to experiment with same-sex relationships. The grumbles of linguistic purists are enough to remind us that the word ‘gay’ acquired a whole new meaning in the late twentieth century. The shift in definition came about more slowly than perhaps we now recall. As recently as 1981, a distinguished and progressive McGill University historian could describe the father of George-Etienne Cartier as ‘gay, social and extroverted,’ evidently without any allusion to his sexuality.24 Yet in other instances, changes in meaning have been more subtle and are harder to detect. In 1847, the British prime minister, Lord John Russell, commented on a dispatch to the North American colonies drafted by the Colonial Secretary, Earl Grey, with the words: ‘I wish you would leave out the latter part of your warning.’ It would be all too easy to conclude that Russell was a weak prime minister reduced to impotent wheedling of his colleagues. Indeed, a case can be made that Russell was ineffectual, but he was also hyperactively interventionist. Russell’s wish was, quite simply, Grey’s command: the word has weakened in force since their time. By contrast, ‘want’ has moved in the other direction, from ‘lack’ to ‘desire.’ Queen Victoria’s famously laid-back first prime minister, Lord Melbourne, sounded typically cynical when he remarked, in 1840, that while the clergy should study theology, ‘he did not think it was a thing which they wanted.’ In fact, he intended to pay them a compliment, implying that there was no shortage of such knowledge among men of the cloth, and hence no need for specific training. The transition towards its more modern meaning can be seen in the correspondence of John A. Macdonald.
In 1856, he wrote to flatter a potential political ally who was too sick to take his place in parliament. ‘We want you very much in the House’ – the emphasis conveying the meaning ‘lack’ or ‘need.’ In 1880, news spread that a prominent Toronto lawyer had suffered a mental breakdown and could no longer conduct the legal affairs of a leading Canadian bank. Although he was prime minister of Canada, Macdonald continued to head up the law practice that helped bankroll his political career, and he was blunt in pursuit of his own interests. ‘I want the solicitorship of the Bank of Montreal at Toronto for my own firm,’ he wrote.25 There is
little danger of misunderstanding the vocabulary of the past in that instance. The further back we move into the past, the more we should be on our guard against such misunderstandings: we may smile when Shakespeare uses ‘naughty’ to mean ‘evil,’ but we do not dismiss him as a playwright for that reason. Yet even the exotic language of the sixteenth century can still cause confusion. In 1558, John Knox, the Scots Protestant Reformer, published his Blast of the trumpet against the monstrous regiment of women. Four centuries later, until political correctness drove such language out of fashion, Knox’s phrase was regularly appropriated by men who felt threatened by assertive females, notably those such as headmistresses and hospital matrons who occupied positions of authority subversive of patriarchy. Knox can indeed be classified by the twentieth-century phrase ‘male chauvinist,’ but his condemnation was aimed at a very different manifestation of feminism. He used the term ‘regiment’ not in a military sense, but rather as an abstract noun, a synonym for ‘government.’ Nor was his use of the adjective ‘monstrous’ intended as a denigration of female physical charms. Indeed, for Knox, the trouble with the woman who was his chief target, Mary Queen of Scots, was that she was far too feminine. It was rather that (for Knox) to be ruled by a woman was a distortion of the natural order of the universe, in which (according to Knox) it was decreed that political power should be exercised solely by men.26 The sixteenth century is an alien world, but the closer we come to the present, the more easily we risk shouldering aside the nuances of the people we study. Two and a half centuries after it was written, ‘Rule Britannia’ remains a popular patriotic song, at least among the English. Sung with a certain self-mockery, it has formed part of the ritual culmination every year of London’s major music festival at the Last Night of the Proms.
Britain’s task force sailed off to recapture the Falkland Islands in 1982 with the blessing of its stirring music and defiant words: Britannia rule the waves, Britons never never shall be slaves. Few notice that the words are oddly defensive for the rallying cry of a great power: when James Thomson penned them in 1740, it was France and Spain that dominated Europe. Furthermore, Linda Colley has recently drawn our attention to the first verse, nowadays treated merely as a preliminary to the rousing chorus, which pictures Britain
rising ‘at heaven’s command’ out of the blue ocean (or ‘azure main’). ‘We barely notice that opening all-important reference to Britain’s divine origins, even though for Thomson – a minister’s son from the Scottish Lowlands – it would have meant a great deal.’27 In short, while Thomson’s verse appeared to chime with the great era of British naval supremacy a century and a half later, his own thinking was much closer to the invocation of England as ‘this sceptred isle’ by Shakespeare’s John of Gaunt a century and a half earlier. At least Thomson knew better than to call the whole island ‘England.’ The closer we come to our own times, the more we should be on the lookout for recognizable forms of language that cloak sharply differing patterns of thought. Two examples from mid-twentieth-century Britain illustrate the point. During the Second World War, George VI regularly visited areas of London that had been damaged in air raids. On one occasion, someone in a cheering crowd called out, ‘Thank God for a good King.’ To renewed cheering, George VI replied, ‘Thank God for a good people.’ Half a century later, such an exchange would have been merely comic, a gift for satirists, but in 1940 the idea of an organic union of the people and their sovereign under divine providence still exercised a deep hold. Indeed, one of those who witnessed the exchange, and who was deeply moved by the episode, was an overseas visitor to London, none other than the prime minister of the very modern land of Australia.28 The Britain of George VI, and his Australian and Canadian realms too, drew upon the distant past in another and much grimmer respect. All three relied upon the gallows as the ultimate sanction against murder. As capital punishment became increasingly controversial, by the nineteen-fifties defenders of the death penalty were pressed to articulate its utility.
In 1956, the Archbishop of Canterbury put aside arguments based upon deterrence, claiming instead that homicide was a challenge to the fundamental compact and fabric of society: ‘where murder exists it is the first duty of society to repair the damage done by it to its own integrity and to bear witness to the majesty of its own first principle.’ The Most Reverend Dr Geoffrey Fisher was a formidable intellectual, but in his mindset, this twentieth-century prelate was remarkably close in spirit to the chaplain of Coventry Gaol who, in 1849, had plunged the hand of a woman condemned to hang into the flame of a lighted candle to give her a foretaste of the pains of Hell.29

There are even occasions when the language of relatively recent times conveys an argument acceptable to our enlightened modern values, but barred from us by terminology that we find embarrassing. In 1946, the Manitoba-based historian Arthur Lower likened French Canadians to ‘horsemen trying to keep up with motorists,’ out of breath and resentful. He then varied both image and gender to portray English Canada and Quebec as ‘a badly mated man and woman’: in time of discord, ‘the woman sulks while the man bullies.’ ‘It is as a very womanly woman that such a feminine people as the French should be treated; such a one could be wooed, but all the English Canadian can do is to shout.’30 Surely few short passages could be so economically designed to offend so many groups of people. Yet if Lower can be stripped of his various ethnic and gender prejudices, the passage is not without a core of sound advice to English Canadians. After all, in 1946 social pressures forced most unhappy couples to buckle down and accommodate their differences. Half a century later, both the marital and the political agendas were much more open to the alternative of divorce.

Historians should be thankful for any shreds that survive from the past, but gratitude does not excuse us from the duty to interrogate that evidence. Two fragments from British political history of the nineteen-sixties offer guidance on how to go about this ungracious task. They are the episodes of Mandy’s riposte and Wilson’s heckler.

In 1963, a sex scandal, the Profumo affair, briefly enlivened British politics, bringing celebrity to an eighteen-year-old model, Mandy Rice-Davies, whose glamorous image suggested that the wages of sin were well above the national average. Mandy was accused of exaggerating her claims to familiarity with influential public figures.
When it was put to her that one of her alleged sexual partners denied that he had ever been to bed with her, she disarmingly replied, ‘Well he would, wouldn’t he?’31 Mandy had identified a truth that sometimes eludes historians: often a piece of evidence, particularly in self-defence, has to be doubted simply because it is inconceivable that its source would have said anything else. On the face of it, Mandy Rice-Davies would
have had little to say to George Brown, the Toronto newspaper proprietor who was the stern foe of French Canadians, Catholicism, and the endless iniquities of John A. Macdonald. In 1864, Brown took office alongside the very French and Catholic G.-E. Cartier, not to mention the unrepentant Macdonald, in the Great Coalition that launched Confederation. Brown, of course, explained that he had joined forces with his enemies in the public interest. Indeed, he added in his own praise that to have refused office would have made him ‘one of the vilest hypocrites that ever entered public life.’32 It is a pity that Rice-Davies was born a century too late to comment on that statement. She might have injected an element of disbelief that has been notably lacking in the uncritical pages of Canadian history textbooks.

In the 1964 general election campaign, just a year after the Profumo affair, the Labour leader, Harold Wilson, hardly put a foot wrong. There was, however, one embarrassing exception during a whistlestop speech-making tour of Kent, when he not only made the mistake of asking a rhetorical question, but fatally compounded it by following with a pregnant pause. ‘Why,’ Wilson asked, ‘do I emphasize the importance of the Royal Navy?’ An unknown voice interjected, ‘Because you’re in Chatham.’33 The episode of Wilson’s heckler should remind historians that all evidence must be sifted through its context. To put the matter another way, it is not safe to assume that a politician intended to build battleships merely on the evidence of a speech delivered at a naval base.

Interrogation of the evidence does not imply the suspicion that our sources were out to fool us, even though we should bear in mind that they may have been less than frank with each other. Sad to relate, the closer we come to the present, the more likely we are to encounter documentation that has been designed to obscure and mislead.
A student of the diplomatic archives of Nazi Germany warns that ‘we can never be sure whether a particular document represents a serious transaction or whether it was composed in order to provide evidence for the innocence of its author.’ One of the saddest features of the Watergate scandal was the way in which it encouraged the practice of ‘cover-your-ass memos,’ in which Washington insiders stuffed the files with documents ‘proving’ that they had no part in President Nixon’s crimes and cover-ups.34 Happily, most of the people in the more distant past who
generated the material that we study can be acquitted of any desire to bamboozle posterity, and it is necessary constantly to recall that they were probably hardly aware that they were creating ‘historical evidence’ at all. More often than not, their speeches and editorials and memoranda were aimed at persuading each other. As a young politician, Gladstone was asked to defend government policy in the House of Commons. In a rare nod towards brevity, he asked Sir Robert Peel if he should be short and concise. Peel replied that it was better to be ‘long and diffuse,’ to argue a case ‘in many different ways, so as to produce an effect on men of many ways of thinking.’35

When, say, John A. Macdonald extolled the benefits of Confederation, we should not assume that he was revealing why he personally wanted Confederation, and no more do his arguments necessarily explain why Confederation came about. We are in fact at a mid-point in historical evidence, at which Macdonald was trying to persuade other politicians why they should support Confederation, which might be a very different exercise indeed. The most we can say is that, on balance, he guessed right about the priorities of enough of his fellow legislators to carry his policy. By extension, it follows that speeches which claimed that there were six distinct reasons for Confederation should be regarded as exercises in coalition building, attempts to bring on board six different interest groups. It is dangerous for historians to jump to the conclusion that there were six causes for Confederation, and to tie them into a neatly integrated explanatory textbook package.

It is difficult to imagine that any historian would claim total survival of evidence for any episode of the past.
Indeed, some may wearily conclude that far too much evidence has survived, especially if it consists of archival mounds that they must quarry to ensure that their own research is comprehensive, even though the documents were never designed to help their enquiries in the first place. Some are tempted to defend their own specialized research by insisting that enough of the materials needed to form an explanation have survived. This assertion, which is often the only basis on which the scholar can go to work, ultimately rests upon the internally contradictory premise that we can identify the materials needed for an explanation even though we cannot be sure that we know everything about the problem we seek to explain. More plausible, and very often far more enjoyable for the
spectators, is the assumption that enough survives to make it possible for one scholar to rip apart the theories of another. Indeed, we might make more of the gladiatorial potential of our calling if we adopted the Spanish word for historian, historiador. Others may offer the common-sense caveat that almost all decisions rest upon incomplete evidence: nobody would ever be convicted of murder if it were objected that the court had not heard an account of the crime from the victim.

Indeed, comparison with the legal process is illuminating in highlighting the shortcomings of the craft of history. Courts of law operate according to agreed rules governing the nature of evidence and with accepted procedures for its assessment. Furthermore, lawyers and jurors usually find themselves evaluating evidence that has been generated within a shared system of values, whereas historians have to surmount the barrier of misunderstanding caused by the problem that the past often talked our language without thinking our thoughts. In the English-speaking world, the legal process disdains hearsay, frowns upon absence of corroboration, and insists that the presumption of innocence overrides any hypothesis of guilt. Even so, there are miscarriages of justice. Historians, by contrast, not only have an incomplete knowledge of what happened in the past, but are remarkably cavalier about the way in which they interpret even the partial materials that have survived. ‘The evidence which convinces lawyers often fails to satisfy us,’ commented the gadfly historian A.J.P. Taylor; ‘our methods seem singularly imprecise to them.’36 Chapter 3 suggests that the reservations of our critics are more than justified.

3 The Impossibility of Explanation

If incomplete survival of evidence from the past is a fundamental obstacle to our understanding, bizarre forms of historical argument constitute at least as great a handicap to understanding. These fall into two groups: those which assert wayward causal linkages between events and those which impose dubious theories which turn out to be little more than empty forms of words. While the former generate the more amusing blunders, the latter probably represent the greater threat to the survival of the craft. Perhaps their most pernicious contribution is that their very absurdity disguises the unpalatable truth that perfect historical explanation would be impossible whatever methodology we adopt.

The view of history advanced by the philosopher Michael Oakeshott has been summarized by John Cannon: ‘[T]he only proper historical explanation is a complete account of the antecedent events.’ Cannon added: ‘I approve the theory: the practice is difficult in a normal lifetime.’ In fact, the theory is unsatisfactory since it blurs the distinction between narrative and explanation, between the raw past and its reconstruction in historical guise. A recitation of events based on the assumption that one thing led to another can be used to explain anything from unplanned sex to World War One.

It might seem that Cannon was not altogether fair to Oakeshott, who recognized that ‘mere antecedence’ was not enough, and that historians should ‘distinguish among the antecedents a passage of events which may be recognized to be significantly related to a subsequent [event] ... and thus transform the subsequent into some kind of consequent.’1 However, tackling the problem of the cacophony of a crowded past by insisting upon selection serves only to highlight the subjectivity of the criteria that must be employed.
The historian can only decide which events are genuinely antecedent to an episode by applying some sort of causative hypothesis, thereby allowing the answer to determine the selection of its own evidence.

Cannon’s own investigation of the origins of Britain’s 1832 Reform Act provides an instructive example. Oddly enough, on the eve of such a major crisis, there was little evidence of a demand for constitutional change: no petitions for parliamentary reform were addressed to the House of Commons at all between 1824 and 1829. Does this mean that the historian need not look for antecedent events prior to 1829 or – as Cannon preferred to do – should we leap the fallow years and trace the issue back deep into the seventeenth century?2 Where, in short, does antecedence begin?

The adoption of a short-termist approach denies rather than solves the problem of antecedence. Many historians argue that two essentially external episodes influenced the revival of the Reform issue. The first was the decision in 1829, in the face of a challenge from Daniel O’Connell’s Emancipation movement in Ireland, to remove the barriers that prevented Catholics from sitting in parliament. Although this was one of the wisest decisions taken by the unreformed House of Commons, it was not so viewed by Protestant zealots. Some of them concluded that so craven a parliament required reform, thereby adding a conservative element to the coalition for change. Second, in 1830, the French ousted their reactionary monarch, Charles X, and adopted a slightly liberal constitution. Opponents of Reform in Britain had constantly pointed to the dreadful warning of France in 1789, where political tinkering had escalated into violent revolution. The fact that the French had managed to conduct a revolution in 1830 without recourse to the guillotine emboldened British reformers to show themselves the equals of their Gallic neighbours.3

Does this mean that an explanatory account of the antecedent events of the Reform crisis in Britain must also incorporate antecedent events from the French and Irish pasts? The problem of selection becomes gigantic: Irish patriots claimed that problems with their larger neighbour began with the Norman invasion of 1169; Charles X had signalled his interpretation of royal power by having himself crowned at Reims, reviving a ceremony that dated back to 1131.
Even if we could be assured of the survival of sufficient evidence for every past event that might possibly be included, historians could only identify relevant antecedence by sifting possible candidates for inclusion through the filter of their own causal theories. This is a remarkably circular process. Interrogation of the past is not limited to the selection of appropriate antecedent events. It is also desirable to remain agnostic about their interpretation, avoiding the temptation to overlook the possibility that there may be more than one credible explanation even for the most attractively plausible of causal connections. Take, for example, the response of King James VI of Scotland – later to become James I of
England – to the outbreak of witchcraft at North Berwick in 1590. James was an intellectual monarch, intelligent enough to doubt the existence of witches. On the other hand, since the Devil had reportedly appeared in the kirk at North Berwick and stated that the king was ‘the greatest enemy hee hath in the world,’ caution seemed appropriate. Unfortunately, it was difficult to make much sense of the bizarre collage of stories tortured out of the suspects. They claimed to have had sexual relations with the Devil, to have sailed the waters of the Firth of Forth on sieves, and to have inflicted bizarre acts of cruelty upon cats and toads. The king’s patience snapped when Agnes Sampson of Haddington ‘confessed ... sundrie things, which were so miraculous and strange, as that his majestie saide they were all extreame lyars.’ Stung by this reflection on her satanic honour, Agnes sought a word in private with her sovereign to prove her veracity. Taking James aside, she whispered to him ‘the verie wordes which passed between the kinges majestie and his queene ... the first night of mariage, with their answere ech to other.’ Agnes’s account seems to have been spectacular both in its content and its accuracy, for ‘the kinges majestie wondered greatly, and swore ... that he believed that all the devils in hell could not have discovered the same, acknowledging her words to be most true, and therefore gave the more credit to the rest.’4

In his haste to embrace one possible explanation, the king had failed to consider other possibilities. Judging from the account she gave of the demise of her toad, Agnes possessed a vivid imagination, sufficient to hazard a guess at the kind of discussion that would have been necessary to ease a well-bred princess into the marital implications of a dynastic alliance. Moreover, James was a powerful monarch who reigned over a small country.
It is very likely that many of his female subjects, especially those living within twenty-five kilometres of Scotland’s capital, would have had opportunities to acquire direct and detailed information about their sovereign’s sexual predilections. The attempt by James VI to apply a logical analysis of cause and effect to the phenomenon of witchcraft was merely one example of the techniques adopted, especially in Protestant countries, to reconcile rational argument with supernatural or religious revelation. Essentially, these exercises involved subjecting biblical texts to proto-historical analysis, and they merit a glance as part of the prehistory of the
craft. Simple arithmetic persuaded seventeenth-century scholars that the world had been created sometime in 4004 BC, and most felt that the Creation had coincided with the Equinox. But which Equinox, March or September? The seventeenth-century divine John Lightfoot was sure he had the answer: ‘[T]o me, September beyond all doubt.’ Why was he so sure? Because there were ‘apples ripe, and ready to eat, as is too sadly plain in Adam and Eve’s eating the forbidden fruit.’5 Three centuries later, historians will empathize with Lightfoot’s note of delighted certainty in his own deduction. It seems unkind to point out that it rested upon the twin assumptions that there was only a brief interval between the Creation and the Fall of Man, and that the Garden of Eden was located in the northern hemisphere.

William Paley’s Evidences of Christianity, published in 1794, went beyond textual elucidation of Holy Writ to argue that there was ‘direct historical evidence’ for the truth of revealed religion, a claim that would hardly have astonished Mandy Rice-Davies as the author earned his bread as dean of Carlisle Cathedral. Since Paley’s Evidences remained a textbook for the general Arts degree at Cambridge University for a century after its adoption in 1822, it may be taken as an example of what passed for rigorous causal argument during the era when the modern craft of history was taking shape. In defence of the Resurrection, the central mystery of the Christian faith, Paley essentially advanced two arguments. The first was that it was ‘extremely improbable’ that anyone would have engaged in ‘so dangerous an undertaking’ as to spread the story that Jesus had risen from the dead unless it were true. The problem with this argument is that it was applied to Christianity in isolation.
If accepted as a general principle, it would imply that any persecuted messianic movement (and there have been plenty of them) must be accepted as the true faith: Paley himself had some problems in dismissing the rival claims of the Prophet Mahomet. The second argument claimed that if the body of Jesus ‘could have been found, the Jews would have produced it’ in order to discredit the story of a messiah risen from the dead. This assumed that the authorities in Jerusalem knew or cared about such claims, and specifically involved Paley in dismissing the counter-hypothesis that the body of Jesus was spirited away by the Apostles.6 Paley’s arguments neither prove the truth of Christianity nor do they cast doubts on the validity
of Islam. They are important only in reminding us that one of the most influential examples of nineteenth-century rational analysis rested upon huge leaps in credulity. It was no wonder that at least for one Cambridge undergraduate, the future Irish leader Charles Stewart Parnell, the study of Paley’s text had the effect of shaking his religious faith. If such specious contentions could be advanced as proofs of the Christian religion, it is hardly surprising that the pioneer generation of secular historians endowed the craft with a legacy of equally unsound methodology.

The most extreme form of building an explanatory structure upon the arbitrary interpretation of one single (and often bizarre) element is known as ‘Cleopatra’s Nose.’ According to legend, the Roman empire was plunged into an east-west civil war because Mark Antony began a relationship with the Egyptian queen after being captivated by the beauty of her nose. It does not need an unusually suspicious intellect to conclude that to attribute this massive upheaval to a single human nose is to ignore many other potential causes, not to mention – as one male historian ungallantly pointed out – other attributes of Cleopatra herself.7

A more recent but equally startling example of Cleopatra’s Nose is to be found in A.J.P. Taylor’s claim that the Battle of Britain was won by Sir Hugh Dowding’s pencil, a conclusion likely to baffle those who had formed the impression that a key role was played by the Royal Air Force. In May 1940, as the Germans broke through in the West, the French government demanded that the British commit more fighter squadrons in their support. Although politically attractive as a means of bolstering a faltering ally, the proposal was doggedly opposed by Sir Hugh Dowding, the head of Fighter Command, who warned that he could not guarantee the defence of Britain if his already small force was further depleted.
Armed with sheets of statistics and the pencil of destiny, Dowding attended a cabinet meeting to plead his case. ‘When argument failed,’ Taylor recounted, ‘Dowding laid down his pencil on the cabinet table.’ It was uncharacteristic for Dowding, whose nickname was ‘Stuffy,’ to indulge in even such a mildly histrionic action. Believing that the head of Fighter Command was on the point of resigning, the cabinet ‘cringed’ and agreed to retain the fighters on British soil. Thus ‘Dowding’s pencil won the battle of Britain.’8

There is a very large problem about the evidence for this incident. Surely it ought to have been both sufficiently momentous and recent enough for the survival of eyewitness accounts. Dowding himself did not recall wielding his pencil at all. Churchill did not mention the episode in his war memoirs, nor has his tireless and encyclopaedic biographer, Martin Gilbert. The source was probably the Canadian-born Lord Beaverbrook, for whom Taylor played the role of court jester-cum-historian. Revealingly, Taylor himself became more sceptical about the story after his patron’s death.9 As a press baron, Beaverbrook may well have believed that the pen was mightier than the sword, but as Churchill’s minister of aircraft production, it was surely bizarre to have attributed so much importance to a pencil.

Let us, however, assume that amnesia has cloaked this dramatic episode in British history, and that Dowding did indeed threaten the cabinet with his pencil. To accept Taylor’s interpretation, we must assume that in the greatest crisis in British history, a senior serving officer would have been permitted the luxury of a protest resignation. Air chief marshals were not politicians, but when Beaverbrook himself threatened to leave the cabinet a few weeks later, Churchill curtly told him that it would be out of the question to accept any resignation while the country faced invasion. Indeed, Churchill was unlikely to have been swayed by threats of resignation, as the histrionic Admiral Jacky Fisher had found when his bluff was called in the First World War.10 The truth is that Churchill never regarded his commanders as indispensable. Soon after the Battle of Britain, Dowding was fired anyway. Perhaps Taylor, who had no high opinion of politicians, assumed too easily that they could only be swayed by threats.

In fact, British fighters were short-range aircraft that could only operate effectively from secure airfields.
With the Germans shattering the Allied armies, the conditions for their effective use in France no longer existed. Perhaps Dowding proved to be more persuasive wielding his arguments than his pencil. Thus, it is possible to consider the role of Dowding’s pencil in the Battle of Britain but conclude none the less that we should place more emphasis on the Spitfire and the less glamorous Hurricane. In any case, as contributors to victory, aircraft did not operate in isolation. British fighters could not have resisted the far larger waves of the Luftwaffe without radar to pinpoint the attackers. Radar would have been of limited use without radio-telephone communications to guide RAF pilots to their targets in the air. None of these technological advantages would have been worth a row of beans without the desperate courage of the ‘Few,’ the pilots who included Poles, Canadians, Australians, and even volunteers from neutral Ireland and the United States. Nor should we forget that the British had the advantage of fighting over home territory. This reduced the comparative effect of its losses, since some Allied pilots were rescued and spare parts could be salvaged from crashed aircraft, whereas German losses in English skies could not be retrieved.

Even taking all these elements into account, we might still challenge the assumption that it was the British, with or without Dowding’s pencil, who won the Battle of Britain at all. The RAF was close to abandoning the skies over southern England when the Nazi high command decided to change tactics and threw its bombers against London, thus taking the pressure off the badly mauled fighter force.

It is doubly curious that Churchill did not mention Dowding’s pencil in his war memoirs, since it represented a form of historical explanation that had appealed to him when writing of an earlier period. In 1920, the young King Alexander of Greece died after being bitten by a rabid monkey. The throne passed to the unpopular King Constantine, who sought to bolster his own position by leading his subjects into a bloody war against Turkey. Churchill commented that it was ‘perhaps no exaggeration ... that a quarter of a million people died of this monkey’s bite.’11 In weighing Churchill’s claim, we might notice that no such cataclysm followed the death of Canada’s governor general from the bite of a rabid fox in 1819. Arrogant and avaricious, the Duke of Richmond had imported his dislike of French people direct from the battlefield of Waterloo.
Applying Churchill’s interpretation of the death of King Alexander, we might consider the possibility that if Richmond had lasted any longer in Canada, the rebellions of 1837 would have happened much sooner and with far greater ferocity. As it happened, Richmond’s successor, Lord Dalhousie, eventually embroiled himself in a far more ludicrous quarrel with French Canadian nationalists, refusing in 1828 to accept their leader, L.-J. Papineau, as Speaker of the Assembly. However, even then a further nine years
were to elapse before revolt erupted, so it would seem to exceed the bounds of permissible argument to conclude that half a million Canadians were spared civil war by the bite of a fox. The Canadian comparison suggests that even if we accept that major consequences directly flowed from the death of King Alexander, we must still explain why a mere monkey bite was capable of causing such mayhem. After all, it would hardly be satisfactory to conclude that the Meech Lake Accord was killed by a blow from Elijah Harper’s feather without outlining the knife-edge context that permitted him to block a last-minute attempt to ratify the Accord in the Manitoba legislature. Feathers, pencils, and monkey bites do not operate in historical isolation.

It is usually the case that a multiplicity of causes can be adduced to explain any single event, but that one of them can be designated as the trigger, or the match that ignites the gunpowder. While this form of argument provides a useful way of introducing students to the notion of cause and effect, it is best thought of as the thinking person’s Cleopatra’s Nose, as can be seen from a well-known example, which involved a real-life trigger.

The assassination of the Austrian archduke, Franz Ferdinand, at Sarajevo in Bosnia in June 1914 is often presented as the spark that set off the First World War. The Austrian occupation of the province angered the Bosnian Serbs, who wanted to become part of Serbia. The Austrians held the Belgrade government responsible for the murder, making demands that were an affront to the prestige of Serbia’s Slavic protector, Russia. Neither Germany nor France could afford to permit the humiliation of their weaker allies, Austria and Russia. The Germans could only avoid military encirclement by knocking out the French before turning against the much slower war machine to the east.
This, in turn, required the German army to bypass the heavily fortified French frontier in a flanking march through Belgium. After some hesitation, Asquith’s government advised the king that Britain’s status as a guarantor of Belgian neutrality justified a declaration of war on behalf of the whole of the Empire. Thus it was that, without the slightest consultation, Canada was committed to fight for the freedom of small nations.

Obviously, it would be implausible to claim that so vast a conflict was solely caused by the assassination of a little-known princeling in a Balkan town that Canadians did not hear of again until it became noteworthy as the birthplace of Mila Mulroney. Had the archduke met his end twenty years earlier at the hands of Jack the Ripper in Whitechapel, or twenty years later through an encounter with Al Capone in Chicago, the international community would perhaps briefly have regretted his passing and probably deplored his taste for adventurous travel, but it is unlikely that the death of Franz Ferdinand in itself would have sent millions of men to fight each other in the trenches. Consequently, the sensational murder was hardly the entire cause of the First World War, but it can be portrayed as the trigger that brought other and more potent causes into effect. Thus, the device of the trigger avoids the absurdity of Cleopatra’s Nose while satisfactorily accommodating a single dramatic event within a comprehensive overall explanation.

It is the very neatness of the package that should arouse our suspicions. Closely investigated, the determinist, Greek-tragedy view of the assassination proves less persuasive. ‘There was no reason intrinsically why such an incident should necessitate war between Austria and Serbia.’12 In a third of a century prior to 1914, assassins murdered the crowned heads in Russia, Austria, Italy, Portugal, and Serbia itself, while France lost one president to terrorists and the United States experienced two such assassinations. Although several of these outrages were the work of exiles, none of them brought about war. The archduke was killed on 28 June, but it was not until 23 July that the Austrian government dispatched its menacing ultimatum to Belgrade. Evidently, Gavrilo Princip’s bullet had slowed down since the moment when he had squeezed the trigger three weeks earlier. The death of the archduke touched off confrontation in the Balkans because the Austrians decided that it should do so.
Even then, the Serbs tried to dodge the bullet by sending an unexpectedly conciliatory reply, and for a moment the crisis seemed to have passed. Again, it was the deliberate decision of the Austrians to reject any compromise that brought about their declaration of war against Serbia on 28 July. However, it still did not follow that every barrel of gunpowder across Europe was bound to ignite simply because of a local explosion between Austria and Serbia. While we should not forget that the mobilization schedules of the continental powers – A.J.P. Taylor’s famous railway timetables – constrained their responses, much influential opinion in
Britain remained detached. ‘Happily there seems to be no reason why we should be anything more than spectators,’ the prime minister wrote on 24 July.13 Viewing the events of 1914 through the filter of the disintegration of Yugoslavia eighty years later, it is tempting to conclude that the Austrians were not entirely misguided when they insisted that ultimate responsibility for the activities of the Bosnian Serbs lay in Belgrade. Certainly, there was a strand of opinion in Britain unsympathetic to the turbulent petty nationalisms of the Balkans. Thus, if the First World War was solely the result of the murder of the archduke, Britain and Canada probably fought on the wrong side. The probability is that the murder of Franz Ferdinand was as much the pretext for war as its trigger. When a state is bent upon confrontation, it will find some reason to proclaim that it has no alternative but to respond. Hitler supplied a particularly cynical instance when he manufactured a grievance against Poland in 1939. A dozen convicts were scooped out of Third Reich prisons, dressed in Polish army uniforms, shot dead, and their bodies dumped just inside the frontier.14 It would be unkind to suggest that Vienna deliberately consigned the emperor’s nephew to his grisly fate, but with memories of the way tiny Piedmont had harnessed Italian nationalism against them half a century earlier, there can be little doubt that the Austrians welcomed the opportunity to crush Serbia before it could similarly mobilize the southern Slavs. The apparent merit of the trigger approach is that it insists that a single element, whether it be a royal monkey or an imperial nose, must be assessed in relation to other contributory events. But this merely brings us back to the problem of appropriate antecedence – which events?
In a general sense, we can place the crisis of July 1914 in the context of the aftermath of the Balkan Wars of 1912–13, which had left the Austrians more wary than ever of Serbia and Slav nationalism. But locating the episode in time is not the same as explaining why it happened. The murder of the archduke causally intersected with many other events, but to postulate even this much raises the question of the precise role played by each of them. Hence, the next logical step from identifying the spark is to analyse the nature of the gunpowder. One satisfying but not necessarily satisfactory way of attempting this is to group possible causes into ‘long term’ and ‘short term.’ Something like this was attempted by Arthur Lower in his textbook
account of the coming of Canadian Confederation. Despite its shortcomings, the structure that he employed is still useful in helping to wean undergraduates away from narrative and towards an attempt, however hopeless, at explanation. Lower began with ‘large’ or ‘remote’ factors, forces or events that made Confederation possible even though they were not sufficient in themselves to bring it about. He then moved to ‘intermediate causes,’ and finally took account of the role of personalities. In between, he proclaimed a special role for the short-term factor. ‘Direct occurrences and influences that manifest themselves in the period just preceding the culmination of a historical movement may be denominated “immediate” causes.’ Of course, Lower did not have space to recount every antecedent event. Some plausible episodes were included and others failed to make the grade. One of his ‘immediate’ causes was the first royal tour of British North America, undertaken in 1860 by the future King Edward VII. The tour deserved a place in the list of causes, according to Lower, because it encouraged the Duke of Newcastle, who travelled with the prince, to support Confederation. In fact, if anything, the duke’s travels left him more impressed by the problems of communication inherent in the sheer size of the provinces, reinforcing his natural tendency to hesitation before taking decisions. Lower omitted another interprovincial tour, a fact-finding junket by a party of Canadian journalists around the Maritimes in August 1864, which helped to focus the attention of New Brunswickers and Nova Scotians on the emerging issue of union. Maybe he was right, since the welcome given to the Canadians exceeded the enthusiasm expressed for joining forces with them politically.
On the other hand, Lower’s ‘immediate’ causes did include the 1858 Canadian federal initiative, despite the fact that this had been even more roundly rejected in the Maritimes.15 By the very act of his thoughtful attempt to identify antecedent events as causes of Canadian Confederation and arrange them into an explanatory pattern, Lower demonstrated the impossibility of complete historical explanation. Fundamentally, he could only assume, for he could not prove, the existence of a causal relationship between the historical landmark and the antecedent events in his list. A British historian, Ralph Bennett, has called this ‘the “must-have-been” delusion – that because A preceded B, then B “must have been” caused by A.’ In
February 1914, the British foreign secretary, Sir Edward Grey, succeeded in giving up smoking, an achievement that he confessed left him ‘in a mood of offensive arrogance.’ Six months later, the First World War broke out. Quitting nicotine can be a stressful experience. Was there a causal connection? Common sense would indeed regard it as a delusion to believe that event A, Grey’s self-liberation from tobacco, was in any way linked to event B, the outbreak of war, even though A preceded B in time, and Grey was at the centre of both episodes.16 Yet in a parallel example, the fact that Canadian Confederation occurred in 1867, immediately after the American Civil War of 1861–5, has led historians to consider the possibility that Confederation was ‘caused’ by the conflict between North and South. In this case, the connection between the Canadian event B and the American event A that preceded it is too complex to be shoehorned into the simplistic pattern of explanation that is often called post hoc, ergo propter hoc, a Latin phrase meaning ‘after this, therefore because of this.’ The impossibility of proving a causal link between any two events is compounded, as we have already seen, by the inherent circularity of identifying which of the millions of antecedent events in the human experience can appropriately be adduced to explain any given episode. Why should we select the American Civil War rather than the contemporary French invasion of Mexico, unless we have already determined that the American conflict ‘must have’ been more important than the Mexican? Even to recognize that the American Civil War was far more gruesome and prolonged than most of the comic-opera conflicts of mid-nineteenth-century Europe is not to accept that its logical by-product was the political union of the provinces to the north.
The most fundamental attempt to redesign the Canadian constitution between 1867 and 1987, the Royal Commission on Dominion-Provincial Relations, produced its report in 1940, and it was soon obvious that the crisis of a world war was the worst possible time for a major constitutional change.17 Some of the Fathers of Confederation cited the American conflict as a reason for uniting the provinces, but – they would, wouldn’t they? In some respects, it is striking how little reference they made to the crisis south of the border, perhaps because they did not wish to provoke neighbours who were already angry enough. Judged by their deeds, the founders of modern Canada were almost supine in their complacency. The year 1867 may have marked the establishment of a Canadian nation, if we use the term in a loose sense, but it certainly did not see the creation of a Canadian army or navy. Even in 1864–5, questions such as fortifications, barracks, and local militias had very much taken second place, when they were related to the union of the provinces at all. Confederation was certainly post hoc in chronological relation to the Civil War, but it is impossible to prove that it was propter hoc, that this particular event B was in any way the product of America’s massive event A. The fundamental reason why this is so lies in the difference between history and the experimental sciences. Under laboratory conditions, scientists can endlessly repeat experiments, constantly changing the mix of reagents or particles, varying the temperature, altering the conditions until they can identify, with repeated and hence statistical certainty, the combination of causes for a given phenomenon. Yet while scientific method is theoretically perfectible, this is not to say that the scientific community exists in a state of permanent harmonious agreement on all issues. Medicine, in particular, is characterized by major controversies over cause and effect. Even after decades of scientific and medical controversy over the relationship between smoking and lung cancer, the state of the argument is essentially one of overwhelming probability rather than certain proof. Post hoc evidence tells us that smokers are more likely to die of cancer, but a full-scale propter hoc deduction falters in the face of those smokers who live to a ripe old age, not to mention the non-smokers who do not.
Oblivious both of the shortcomings of scientific method and its entire dissimilarity to historical argument, some historians have sought to ‘explain’ Canadian Confederation by garbing themselves in laboratory coats and pretending to have conducted experimental research. This has enabled them to claim that their list of causes, and only those causes, interacted as if in a scientific reaction. Chester Martin, for instance, likened the British North American provinces to ‘chemical reagents which are inert towards one another under normal conditions of pressure and temperature,’ but which crisis fused together into ‘a permanent chemical compound capable of withstanding the stresses and strains of normal atmospheric conditions.’ Other historians have sought ‘a kind of algebraic formula or polygon of
forces’ to account for Confederation, which has also been described as the result of ‘Newton’s first law of motion; all bodies continue in a state of rest or of uniform motion unless compelled by some force to change their state.’18 One feature of this use of scientific analogy is the primitive nature of the imagery. As Hayden White pointed out as far back as 1966, when historians liken themselves to scientists, they seem to be invoking a nineteenth-century world of test tubes and Bunsen burners, remote from the physics of Einstein and his successors. Perhaps all this represents a shuddering recollection of the high school science classes of yesteryear that most historians abandoned in their teens, sometimes because (to draw upon autobiography) they could not make the grade. It is tempting to agree with Fischer that the real problem is not that history is ‘an inexact science,’ but rather that ‘historians are inexact scientists, who go blundering about their business without a sufficient sense of purpose or procedure.’ In short, when historians pretend to be scientists, far too often they are invoking a hopelessly antiquated notion of scientific endeavour, and one in which they themselves failed to excel or even understand in their own schooldays. The blundering lack of proper procedure may be seen in the highly circular form of historical explanation that it generates. Thus, to Chester Martin, Confederation was the scientific product of the fusion of certain contemporary causes. How do we know that this is so? Because Confederation happened when these causes were present. Post hoc, ergo propter hoc: the hypothesis constitutes its own proof. The pretended methodology breaks down simply because the laboratory conditions employed by experimental scientists are just not available to historians.
Unlike scientists, who can repeat their experiments endlessly, varying the reagents to test their hypotheses, we cannot put the eighteen-sixties back into the test-tube, shortening the American Civil War in one experiment and making railway construction cheaper in another until the sudden bubbling of the chemicals gives us the precise and crucial cause of Confederation. The most we can attempt is a comparison of two similar episodes: to suggest why the proposal to unite the provinces failed in 1858–9 but succeeded in 1864–7. It is tempting to jump to the conclusion that since there was a civil war raging in the United States in 1864 when there had been none in 1858, we have hit upon the smoking gun of propter hoc.19 Not necessarily so: the Civil
War did not prove to be the miracle ingredient in 1862, when the British government gave colonial politicians a heavy hint to couple the Intercolonial Railway with political union. Nor was the American Civil War the only fresh element that crept into the picture between 1858 and 1864. Historians can pretend to be scientists if the delusion amuses them, so long as they accept that they are conducting simplistic experiments, producing unverifiable data from unsterilized equipment. Perhaps the most pernicious aspect of the pseudo-laboratory argument is that it elevates ‘causes’ into independent historical actors, capable of dictating to people. As one of the grandees of the historical profession, G.M. Trevelyan, wisely remarked, ‘There is no way of scientifically deducing causal laws about the action of human beings in the mass.’20 Surely it is not only pointless but also offensive to analyse the causes of Confederation as if there were no distinction between the behaviour of infectious bacteria and the conduct of New Brunswick politicians. The former are governed by scientific laws; the latter enjoyed a considerable measure of freedom to be swayed by the arguments that they found persuasive and to reach their own decisions, so much so that they sometimes arrived at contradictory conclusions on the basis of identical evidence. Paradoxically, the most that can be said in favour of historical method is that its attempts at causal explanation must depend upon something more than statistical correlation based on repeated testing. Alexander Fleming discovered that penicillin was a formidably effective antibiotic because every time he used it, penicillin worked.
He won the Nobel Prize, but he had not the faintest notion how his wonder drug operated.21 The historian who argues that the American Civil War ‘caused’ Canadian Confederation must at the very least supply the deficiency of a proven connection by suggesting how it could have been that British North Americans regarded so potentially provocative a strategy as a timely response to continental crisis. Unfortunately, this strategy of making sense of the past takes us across the line that separates the unmanageable cacophony of antecedent events from the misleading devices of grand theory by which historians obscure and compound the insoluble problem of explanation. If historians are already on insecure ground when they use empirical methods to trace causal connections between events, then the imposition of theories leads them down a treacherous path to the Coldingham Fallacy, through the pompous perils of the Viking Syndrome, onward through the delusions of reification until they finally come face-to-face with the Lobster of Kirriemuir. Each of these horrors must be exposed in turn. The Coldingham Fallacy is a device designed to overawe by apparent profundity of analysis. The monastery at Coldingham, on the east coast of Scotland, was founded in the seventh century, in the days when religious houses accommodated both monks and nuns. Around the year 680, the buildings were swept by fire. Coldingham does not owe its name to its climate, but it is located on the same latitude as James Bay and it does face into a north-east wind. So it is hardly surprising that the monks and nuns lit fires to keep warm, nor that there was a danger that their timber buildings might be accidentally destroyed. Writing in the eighth century, England’s first historian, the Venerable Bede, accepted that the fire was ostensibly caused by carelessness. However, he insisted that this was not the real reason: ‘all who knew the facts’ understood that the fire stemmed from a more fundamental cause, the wickedness of the inmates. A careful historian, Bede duly cited his source (a priest called Edgils) and outlined his evidence. One of the monks, an Irishman called Adamnan, had dedicated his life to the expiation of a terrible (but unspecified) crime. His original intention had been to deny himself food and sleep so that he might pray non-stop night and day. Fortunately, Adamnan’s confessor persuaded him to modify the penance into a rolling program of three-day fasts. He also undertook to review Adamnan’s regime, but unfortunately died soon afterwards.
A lesser spirit might have interpreted this as a divine signal to seek a second opinion, but Adamnan ploughed on with his ordeal, allowing himself only two meals a week and praying around the clock. It was during one of these vigils that a stranger appeared to him. The visionary figure commended Adamnan’s devotion to religion, but warned that the monastery faced a fiery punishment because his fellow inmates were mired in sin. Since Adamnan was probably suffering from both malnutrition and sleep deprivation, his spectral visitor may be regarded as a hallucination, a device that enabled him to vent his resentment against those who enjoyed a more comfortable life at Coldingham. It is revealing that the only sin that his austere but presumably virginal imagination could conjure up was that of midnight dressmaking. However, the evidence was enough to persuade Bede that there was a causal connection between the vision and the subsequent fire. There are several fires in the pages of Bede. Some were God’s punishment. Some stopped abruptly, demonstrating the sanctity of the person who stood in the way. Some were just fires, with no special message. Only at Coldingham did Bede argue that the apparent cause was not the real explanation.22 It may be objected that historical method has moved on since Bede laid down his quill pen back in the year 731, and that it is unfair to a pleasant Scottish village to burden it with a general fallacy of historical explanation. So let us move forward twelve hundred years. In the interval, Metro-Coldingham had spawned a satellite fishing settlement, called St Abbs. There is a curious feel of Newfoundland about St Abbs, with its forlorn breakwater in a tiny cove, surrounded by houses clinging cussedly to unfriendly rock. The sole touch of elegance is supplied by a small mansion overlooking the village. In 1930, this house belonged to a lawyer and politician, William Mackenzie. That same year, he was raised to the peerage as Lord Amulree, and three years later he was sent to Newfoundland at the head of a searching enquiry into the island’s social and economic problems. It is a curious detail that the man who presided over the Newfoundland royal commission of 1933 should have lived in a fishing village himself. Newfoundland was in a grim state as the Depression wiped out income from its staple exports of fish and timber, leaving the island, still a self-governing British colony separate from Canada, overwhelmed with debt and its people sunk in poverty.
It would be pleasant to think that Lord Amulree brought with him a sensitive awareness of the vulnerability of a community reliant on fishing – pleasant but, unhappily, wrong. The man from St Abbs applied the Coldingham Fallacy to the view from the window of his St John’s hotel room. The Amulree report barely acknowledged the possibility that the Depression was to blame for Newfoundland’s problems. Rather, it insisted that Newfoundlanders had plenty of fish to eat along with unlimited timber to build houses and boats and – with what was surely a nod to the fertile lowlands of Scotland – they ought to grow
more potatoes. Indeed, ‘it might be thought that the people of Newfoundland were in a better position than those of many other countries to withstand the ravages of a world-wide economic depression.’23 Unconsciously imitating Bede, Amulree decided that the real explanation for the island’s degradation lay much deeper than its apparent cause. All who knew the facts (and gossip-ridden Newfoundlanders were only too willing to confess each other’s sins) could see that the real problem lay in the colony’s rampantly wicked politics. Newfoundland was obviously reaping a disaster that had been sown over decades, a punishment for its reckless expenditure, its squandering of natural resources, and its endless heaping-up of debt. Hence, the root problems of the island colony were not economic but political. It followed that the cure was to put an end to politics and hand over the government to a commission appointed from Britain. The role of Adamnan was taken by civil servants in London, who had been bemoaning the misgovernment of Newfoundland for decades. Since civil servants suffered neither from malnutrition nor from sleep deprivation, their analysis deserves to be taken seriously. Yet the argument did not convince the Canadian nominee to the Amulree commission, Charles A. Magrath, who, as chairman of Ontario Hydro, was reluctant to blame Newfoundlanders for investing in grandiose projects. However, faced with a call for volunteers to help Newfoundland out of its hole, Ottawa had taken two steps backwards, and so Magrath swallowed his doubts and signed the report. None the less, the subsequent history of Newfoundland suggests that Amulree’s decision to delve below the apparent problems and blame the weaknesses of the island’s economy upon the sins of local politicians was a classic example of the Coldingham Fallacy.24 It is not only inhabitants of Coldingham who are afflicted by the Fallacy.
A notable outbreak occurred in the writings of the Oxford professor Goldwin Smith early in the American Civil War. In November 1861, a Northern warship stopped the Trent, a British steamer, on the high seas and seized two Southern politicians who were on their way to Europe to lobby for the Confederacy. The affront to the Red Ensign brought Britain and the United States to the edge of war, throwing into dramatic relief the vulnerability of Canada to American attack. This, in turn, prompted Goldwin Smith into arguing that it was high time the
province was cut free from Britain. There were plausible arguments both for and against Canadian independence, and the history of the next thirty years would be punctuated by the rhythmic flip-flop of Smith changing his mind about the country’s future. In 1862, however, his problem lay in the first step of the argument, how to connect the case for Canadian nationhood with a crisis that had blown up far out on the Atlantic. The answer lay in the bold invocation of the Coldingham Fallacy. ‘If there had been a war with the United States,’ Smith announced, ‘the Trent would have been the occasion, but Canada would have been the cause.’ How could the Americans be restrained from invading their northern neighbour? ‘There is but one way to make Canada impregnable, and that is to fence her round with the majesty of an independent nation.’ Smith’s intellectual sojourn in Coldingham was short-lived. Three years later, as the Civil War drew to a close with a Union victory, he insisted that the Northern States would not settle their scores with Britain by seeking revenge in Canada. ‘They know that, though a nominal dependency of England, the country is virtually independent.’25 The paradox of the Coldingham Fallacy is that if enough people subscribe to it, for practical purposes it becomes transformed into the Coldingham Fact. It has already been noted that the initial British response to the murder of Archduke Franz Ferdinand was by no means uniformly pro-Serb. British opinion tended to reverence all forms of royalty and to regard nationalism as a nuisance. All this was changed by the Austrian ultimatum of 23 July.
Ignoring the trigger theory, the senior civil servant at the Foreign Office dismissed the Austrian allegations against Belgrade as ‘pretexts.’ The confrontation was not about Serbia, ‘but one between Germany aiming at a political dictatorship in Europe and the Powers who desire to retain individual freedom.’ Thus voices opposing intervention, such as that of the Manchester Guardian, which claimed to care as little about Belgrade as Belgrade cared about Manchester, found themselves subtly missing the point. Once the Coldingham Fallacy had been applied to the Sarajevo tragedy, all who knew the facts determined that the real issue at stake was the threat that military might would dictate the future of Europe.26 The Coldingham Fallacy surfaced again in assessments of the failure
in the late nineteen-eighties of Canada’s attempt at constitutional change, the Meech Lake Accord. At a purely factual level, the Accord expired because it did not secure the necessary provincial ratifications within the required three-year period. Yet this explanation was not enough to satisfy those who sought a more profound analysis. Some objected to the way in which the deal had been stitched together, arguing that the real reason for its failure lay in the fact that it was the work of eleven white male politicians, produced in secret session by an elite out of touch with the people. At first this view seemed to be espoused mainly by those who also concluded that the eleven white male politicians had done a poor job, but when the Accord crashed, it provided a handy excuse for the deal’s disappointed champions, enabling them to blame their defeat upon process rather than content. The ‘crystal-clear lesson’ which Brian Mulroney drew from the failure was that ‘a way must be found to ensure public involvement in the constitutional amendment process.’27 It was equally possible to argue that the Accord had failed not because the political elite was arrogant and remote but rather because it was divided and ineffectual. The country’s elected leaders were unable to control events in the crucial province of Manitoba, nor could they rebut the dire warnings from an unexpected manifestation of Pierre Trudeau that the sin of decentralization would bring down fiery punishment upon the Canadian nation. The sorry sequel of the Charlottetown Accord, a camel-shaped compromise supposedly designed by millions of Canadians, suggests that blaming the collapse of Meech Lake on the wickedness of elite politics was a notable example of the Coldingham Fallacy. The Viking Syndrome is the first cousin of the Coldingham Fallacy.
Whereas the Fallacy appeals to some shred of contemporary evidence, even if only the overheated imagination of Adamnan, the Syndrome intrudes an entirely unproven explanatory hypothesis derived from a completely different discipline. Once the hypothesis has been introduced, it stays around to constitute its own evidence, behaving like a gatecrasher at a party whose presence is assumed by all the other guests to be the result of somebody else’s invitation. Since the Vikings were notoriously unsocial gatecrashers, it seems appropriate to take them as archetypes for the Syndrome. In intermittent bursts spread over three centuries, the Vikings
erupted from their Norwegian fjords and travelled to destinations as disparate as Newfoundland and the Crimea, raiding, trading, and colonizing. To account for these sudden eruptions, some scholars resorted to the hypothesis that they were triggered by a rapid increase in population. The argument sounded impressive, especially since it appealed to an imposing and eminently respectable branch of the social sciences that was called demography. (Actually, the theory was first put forward in 1699 by the pioneer statistician, Sir William Temple, almost two hundred years before the coining of the d-word itself.) Once the population hypothesis was launched into the debate, its spurious nature was camouflaged by portentous comparisons, for instance with the mass exodus from Ireland caused by the Great Famine one thousand years later. Such learned packaging obscured the inconvenient absence of supporting evidence of any kind. There were, for instance, no census statistics and so there could be no certain knowledge of Viking populations at all. It was suggested that polygamy might have been a factor in the postulated population explosion: one Viking chief was recorded as having as many as forty wives. Middle-aged insecurity prompts the reflection that if one male monopolized the reproductive capacities of forty women, condemning thirty-nine rivals to celibacy, the birth rate would have been more likely to fall than to rise. Even if the demographic hypothesis could have been proved, the simplistic causal link between overpopulation and pillaging misses out several stages in the argument. The telescoping of logic is reminiscent of the legendary response in the pages of Irish history made by the Earl of Kildare who, when asked in 1495 to account for the fact that he had set fire to Cashel cathedral, pleaded that he had believed the archbishop was inside at the time.
If the Vikings were hungry and knew how to build boats, why did they not go fishing – the solution to the problem of population pressure that was so successfully adopted by the sixteenth-century Dutch? The analogy with the Irish Famine one thousand years later was massively far-fetched. Hunger did indeed drive the Irish to flee their homeland, but the Famine of 1845–9 did not stimulate large-scale sea-borne pillaging of Boston or Liverpool. Indeed, it would have been just as plausible to have argued that the Viking raids might have been the by-product of a catastrophic fall in the
Norwegian population. Too few to maintain economic and social structures – so the hypothesis might run – surviving Vikings became so lonely that they set out on their boats in a desperate search for human company. (Indeed, something like this scenario may account for the way in which the last Norsemen vanished from Greenland centuries later.) Unfortunately, their dysfunctional experience had so damaged them that the badly scarred Vikings were incapable of forming new relationships and resorted to theft and mayhem instead. Although perverse, such an interpretation is not necessarily inconsistent with the available evidence, and it could be shored up with equally bogus comparisons with psychological studies of modern refugee crises. Gradually, scholars began to recognize that, however neat and convenient the hypothesis of a population explosion in the Viking homeland, the supporting evidence required for its acceptance simply did not exist. As F. Donald Logan remarked in 1983, the eruption of the Norsemen was an episode in which the search for causes vanished into a ‘penumbra where partial evidence and assumptions meet to produce conclusions far from certain.’ He cheerfully accepted that this was ‘an uncharacteristically humble position for the historian.’28 In retrospect, it seems that humility may have been the best policy. More recent archaeological research has thrown doubt upon the likelihood of a population rise of any kind in Viking Age Norway. Many farms had been abandoned between AD 500 and 700 – immediately before the Norsemen took to their boats – and these sites remained unpeopled until the later Middle Ages. ‘The long-held view that the Viking Age began because an expanding population put unbearable pressure on resources ...
is now largely abandoned.’29 As argued in chapter 7, historians are remarkably naive about population, sometimes failing even to establish the basic numbers game underlying the situations they seek to analyse. While population forms part of the problem, it does not necessarily dictate the solution – as can be seen from the fact that European imperial rule could be imposed upon the teeming millions of India but not upon the equally vast numbers of people in China. So far as the Vikings are concerned, if historians had not suffered from a collective inferiority complex, they might have recognized much sooner that guesswork does not become proof simply because it has been wrapped in a fancy social-sciences label.
Demography was a relatively benign discipline compared with the intimidating source quarried by D.C. Moore when he applied the Viking Syndrome to Britain's Great Reform Act of 1832. Traditionally, historians had tended to see the Reform Act as a deft retreat by an enlightened section of Britain's landed aristocracy in the face of demands for a share of political power from new urban and industrial forces. In a provocative challenge, Moore argued that the Reform Act represented not 'concession' but rather 'cure.' The political elite inflicted drastic surgery on the body politic to enhance their own control not by admitting urban voters to a small share of power but rather by excising them from the still-dominant rural areas and hiving them harmlessly into a handful of town constituencies. In a major article published in 1966, Moore claimed to have revealed 'the sociological premises of the First Reform Act.' For the nineteen-sixties, this was heady and heavy stuff. While a traditional scholar such as Canada's Donald Creighton might boldly dismiss the 'analytical tools' of the new social sciences as little more than Neolithic flint implements, most historians were wary about tangling with something as intimidating as sociology.

Despite its overall plausibility, one episode in the Reform crisis did not sit comfortably within Moore's interpretation. This was the passage of the Chandos clause, which gave the vote to large numbers of prosperous farmers who rented their land on very short tenancies. In the days of open voting, possession of the vote made such men vulnerable to pressure, since if they refused to support their landlord's candidate, they would face prompt eviction from their farms. 
In Moore’s unkind phrase, those enfranchized by the Chandos clause were little more than ‘opulent serfs.’ At first sight, the Chandos clause seemed to support Moore’s contention that the Reform Act represented a grand plan and not a measured retreat, since the voters it created certainly strengthened the aristocratic grip on the countryside. However, there were also reasons to doubt the symmetry of the theory. The Chandos clause was not part of the original ministerial scheme, the supposed blueprint designed to secure continued elite dominance. Rather, it was forced upon the government during complicated parliamentary infighting. Inconveniently for Moore’s hypothesis, the Chandos clause was supported by many of the Radicals, the faction most dedicated to
weakening the power of the landowners. Unfazed, Moore stood by his hypothesis: Radicals who voted for the Chandos clause were simply 'sociologically naive.'30 In short, they were too stupid to grasp the complexity of Moore's analysis, and they voted the wrong way. To be dismissed in the pages of history (or, in this case, Cambridge University's Historical Journal) as sociologically naive is a humiliating fate.

However, at this point we need to remember that words can mislead. In his account of the downfall of the Nazis, Chester Wilmot was no doubt unintentionally misleading when he wrote that 'Goebbels took his own life and those of his wife and six of his children.'31 Unless Goebbels was the world's first posthumous mass murderer, the implied order of events cannot conceivably be correct. It is perfectly possible to use the English language to state that a man committed suicide and then killed his family, but the episode itself could never happen. A sociologically naive radical may be another artificial construction of language, invented to shoehorn some annoyingly inconvenient people into a rigid hypothesis.

Recently two historians have established that there was widespread labour unrest in Canada's Maritime provinces between 1917 and 1925, thereby strengthening the revisionist view that the well-known upheaval of the Winnipeg General Strike of 1919 was not an isolated event. The authors conclude that 'the fact that this regional labour revolt could be largely extinguished in collective memory tells us something about the success with which bourgeois hegemony was restored.' The invocation of an essentially sociological concept, 'bourgeois hegemony,' suggests another outbreak of the Viking Syndrome. Folk memories are unusually long and often bitter in Atlantic Canada: perhaps the 'regional labour revolt' was not sufficiently widespread to survive in the vigorous local culture of story and song. 
More to the point, what precisely is ‘bourgeois hegemony,’ and how does it operate? Are Maritime schoolchildren put under deep hypnosis and mentally fumigated? We are not told.32 In assessing the formulations used by historians, it is important to distinguish between simplification and reification. The first is a shorthand telescoping that highlights a basic theme; the second is the use of words to create a Thing, with the attendant risk that the Thing becomes an independent player in the causal account. An acceptable
simplification is the statement that 'slavery caused the American Civil War.' At first sight, this looks like an example of the Coldingham Fallacy: after all, it was not until September 1862 that President Abraham Lincoln issued a threat that he would free slaves in rebel areas or, to put it another way, precisely those slaves whom he could not actually touch. In any case, why do we need such a profound explanation? Historians find little problem in nominating the trigger that touched off the conflict. The Sarajevo of the American Civil War was Fort Sumter in Charleston Harbour, and the secessionist state of South Carolina played the role of Princip by bombarding this symbol of United States authority. However, as with the assassination of the archduke in 1914, we cannot simply identify the match that touched off the conflagration without considering the volatility of the gunpowder. Ten other Southern states supported South Carolina in its armed confrontation with Washington, and these were the states with the largest percentage of slaves in their populations.

Slavery, it is hardly necessary to point out, was not a system in which the slaves themselves willingly participated. A society based on slavery was a community founded upon repression, in which slave-owners sought to conscript other whites as fellow jailers. Hence the timing of the crisis in the immediate aftermath of the election of America's first Republican president. Under Abraham Lincoln, the Northern states could no longer be counted on to function as the outer perimeter of the Dixieland prison. The Fugitive Slave Law would be enforced much less rigorously, if at all. Abolitionists' rights to free speech would be given priority over the imperative need of slave-owners to repress resistance. The annexation of new territories to sustain the ecologically wasteful plantation system would be blocked. 
While by no means all slave-owners subscribed to an immediate doomsday view of Lincoln’s election, the fire-eaters succeeded in forcing the pace of secession, arguing that it was better to gamble on securing immediate full control over their own region. And so the war came. Thus, the statement that ‘slavery caused the American Civil War’ may seem an acceptable simplification of a complex set of explanations. Its shortcoming is that the simplification does not really explain very much. Identifying slavery as the context within which the crucial decisions for war were taken is not in itself an explanation at
all. To impart some form of historical validity to the shorthand statement, we must locate the crisis in time, making sense of it through the fact that it unfolded during the secession winter in which Lincoln was waiting to take office. The most that can be said in defence of the simplification is that it is not intended to imply that slavery was a malign monster that slithered out of a Mississippi swamp and manipulated North and South into murderous conflict.

Unfortunately, the disclaimer cannot be made with the same confidence about the use by historians of the British empire of another abstract term, imperialism, 'a word which tends to confuse rather than clarify.' Historians of the British empire are alternately repelled and seduced by its deathly embrace. A.P. Thornton regarded imperialism as 'a listing in someone else's index, never one's own' – although the index references in C.A. Bayly's Imperial Meridian outweigh the author's sparing use of the term, often with the protectively agnostic device of inverted commas. D.C.M. Platt once suggested that it would be 'more helpful ... to abandon the traditional vocabulary of "economic imperialism."' Ronald Hyam goes further, refusing to use the term at all, although characteristically he admits that his refusal is regarded as 'eccentric.' P.J. Cain and A.G. Hopkins, by contrast, have insisted that 'historians need holistic terms, even if they also need to be wary of them.'33 The problem is that we can only demonstrate our scepticism by checking the theory against the facts, a process that tempts partisans of theory to retort that any mismatch between the two simply proves that we have applied the wrong facts. Its uncertain content is compounded by definitions of imperialism that go beyond the abstract to label it as 'activity let loose.' Cain and Hopkins define it as 'an incursion ... 
into the sovereignty of another state.’34 Unfortunately, only a thin line separates the analytical concept from the autonomous monster. Thus, nineteenth-century manifestations of imperialism have been described as ‘a process whereby agents of an expanding society gain inordinate influence or control over the vitals of a weaker society.’35 Imperialism comes to life, becomes a historical player in its own right, a self-propelled bulldozer that roams the world creating level playing fields for capitalism. Real people and even other processes, such as industrialization, are reduced to the status of ‘agents.’ Even Luke Trainor, who was predominantly cautious in his handling of the
term, could suggest that the movement for Australian federation 'was hijacked by imperialism,' as if the beast had a perverted mind of its own. Furthermore, historians feel able to maintain their claim to ethical neutrality while endowing imperialism with qualities of greed and stupidity which they would never dream of attributing to, say, vegetarianism or Methodism. Why did imperialism annex eastern New Guinea in 1883? It is not enough to reply that the monster is inherently nasty. As Bayly insists, 'an adequate theory of imperialism' must 'explain the timing of European expansion as well as its form and limitations.' The problem is compounded if, like Cain and Hopkins, we regard imperialism as 'an integral part of the configuration of British society': if the dragon is lurking all the time, it becomes all the harder to account for its individual manifestations, to locate its outbursts satisfactorily in time.36

A contributory problem in the controversy over the meaning of imperialism is that the competitive instincts of academics too often polarize their debates into head-to-head conflict between rival hypotheses, both of which may eventually prove to be off-beam. This problem of false antithesis is as old as the identification of the Lobster of Kirriemuir. Before the days of railways, Scotland was served by carriers, whose wagons occasionally shed packages along the potholed roads. Around the year 1800, one such item was discovered by the villagers of Kirriemuir and found to contain a lobster, a creature that none of them had ever seen. They carried it 'reverentially' – perhaps it was still alive – to the local schoolmaster. 
He had never seen a lobster either, but this did not prevent him from setting the terms of debate: it ‘maun either be an eeliphant or a turtle doo,’ he pronounced, ‘for they are the only two beasts we dinna ken by sight.’37 Much historical debate, on topics as varied as the rise of the gentry in seventeenth-century England and the agricultural crisis of early-nineteenth-century Lower Canada, is carried on between protagonists of elephants and turtle doves. Sometimes such debates create a false apposition between strategy and tactics. Confronted with the challenge ‘Joseph Howe: Opportunist or Empire-Builder?’, it is surely possible to advance a compromise view that Howe used opportunistic methods in pursuit of empire-building aims. Although he was no lobster, Joe Howe was a Nova Scotian, endowed with a global outlook but limited resources with which to
pursue his vision. Yet even this compromise view from hindsight depends upon the convenient fact that Howe apparently operated within a political system that immediately seems familiar to us, one in which personal ambition was harnessed to the pursuit of the economic betterment of the community. The Lobster of Kirriemuir can prove to be even more misleading when entirely modern concepts are imposed upon pre-modern situations. Christopher Hill insisted that to ask whether the campaign for a war against Spain in Cromwell's England was motivated by religious fervour or economic self-interest was to pose 'an anachronistic question,' since the implied distinction would have been meaningless in the seventeenth century.38 Unfortunately, we cannot be entirely sure that Joseph Howe belongs to our world rather than that of Cromwell. When we encounter him as a railway promoter, we can at once place him within a political tradition of resource development and regional rivalry characteristic of Canada today. When we find him waxing enthusiastic for empire and monarchy (which latter enthusiasm, it must be admitted, Cromwell did not share), we are forced to appreciate that his world was, in some measure, distant from ours.

In the scholarly debate on the nature of imperialism, longevity would place Jack Gallagher and Ronald Robinson in the elephant camp, while the more recent contenders, Cain and Hopkins, may be considered the turtle doves. The former reinterpreted British expansion in the mid-nineteenth century under the banner of the 'Imperialism of Free Trade,' while the latter have rallied behind 'Gentlemanly Capitalism.' It is noteworthy that both camps should have adopted slogans based on the paradoxical coupling of opposites, for imperialists were not thought to be enthusiasts for free trade and capitalists were certainly not always gentlemen. 
In the circumstances, the wisest of fools may be puzzled to decide which theory is correct. Launching their theory back in 1953, Gallagher and Robinson sought a larger reconciliation still, one that would break down the barriers between the formal and informal spheres of British influence overseas. The contention was that the main determinant of nineteenth-century imperial power was not the shade of red on the map but the extent to which Britain exercised sway on the ground to open the way for British traders. On the whole, they argued, the British preferred informal
control and only hoisted the flag when local conditions so degenerated that a territory ceased to be a reliable market for British exports. Accordingly, the ultimate determinants for formal expansion, the annexation of new colonial territories, should be sought not at the centre of empire but on the periphery.

For Cain and Hopkins, a quarter of a century later, the driving force was to be found not in manufacturing but in investment, manipulated by the City of London and backed by the more traditional capital-forming landowners. Hence the ruthlessness of capitalism, operated at arm's length, was compatible with the code of gentlemanly superiority. Imperialism was thus less concerned with opening markets for British manufacturers than with ensuring that debtors should be required to keep up their payments. Hence, the gentlemanly capitalists of London were perfectly happy to see revenue-generating industrial development in India even if this was at the expense of nouveau-riche Lancashire factory-owners.

A general theory is surely only as useful as the specific examples it can illuminate. Sad to say, neither the elephants nor the turtle doves offer much insight into the history of Canada. Faced with the anomaly of a self-governing dependency, Gallagher and Robinson seemed uncertain whether to regard Canada as part of the formal or informal empire, and simply ignored the problem that the senior colony made life difficult for British manufacturers by pursuing heretical policies of tariff protection.39 It was probably just plain bad luck that caused Cain and Hopkins to refer to 'the French-speaking province of Upper Canada,' but their repeated assertion of 'the importance of the St Lawrence Seaway' as 'a conduit for trade' in the nineteenth century seems puzzling, given that the Seaway did not open until 1959. 
They argued, plausibly enough, that the concession of low-interest guaranteed loans in 1841, to finance canal-building, and in 1862, for the Intercolonial railway, showed that British governments were 'occasionally ... willing to use the leverage which London's financial predominance gave them.'40 However, Cain and Hopkins omitted to comment on the six-year period after 1867, during which Canadian politicians showed themselves to be very ungentlemanly capitalists indeed. Ottawa disregarded the conditions attached to the British-backed loan for the Intercolonial Railway and smartly reinvested the money at a favourable rate. Macdonald then proceeded to blackmail the British government into underwriting the Pacific railway by threatening to torpedo the Treaty of Washington with the United States.41 Most blatantly of all, the Canadian government diverted an imperial loan intended for the fortification of Montreal to a project for deepening the canals. Far from calling the tune, the British government despairingly defended this piece of double-dealing by assuring Parliament that anything which made Canada richer also made it stronger. Appropriately, the minister charged with uttering this specious argument, Edward Knatchbull-Hugessen, devoted his leisure hours to writing fairy stories for children.42

It is only fair to acknowledge that Gallagher and Robinson were mainly interested in Africa, while Cain and Hopkins deserve credit for attempting to accommodate unfashionable Canada within their general theory of imperialism at all. The root problem lies with the application of overarching grand theory as a means of historical explanation in individual cases: sooner or later, pressure for consistency requires that the claws of the lobster must be portrayed as the elephant's trunk or the beak of a dove. Problems are compounded when the rival camps circle around a reified term that can all too easily take on a life of its own. It was for that reason that one of the greatest historians of the Empire, W.K. Hancock, dismissed imperialism as 'no word for scholars.'43 Of course, it is always possible that his desire to excise the term was simply an attempt to maintain bourgeois hegemony.

The academic mind sometimes identifies issues where no real problem exists. 
In theory it may indeed be the case that the past is a seamless web of antecedent events, such that, as Lord Acton put it, ‘we can trace things back uninterruptedly, until we dimly descry the Declaration of Independence in the forests of Germany.’44 In practice, some events turn out to be more antecedent than others, so that it is perfectly possible for historians to construct a persuasive account of the movement towards the independence of the United States by ignoring every episode before the Stamp Act crisis of 1765, including the Teutonic tribal origins of American democracy. Moreover, there is nothing to prevent historians from applying straightforward empirical methods based on common sense, so that Coldingham, Kirriemuir, and North Berwick
can once again become pleasant spots on the map of Scotland rather than traps for the intellectually pretentious.

In short, historians are people who identify episodes and then seek to account for them by asking 'why?' Whatever else is uncertain about the craft, this innocent definition would seem to rest beyond controversy: historians are people who explain what happened in the past. Unfortunately, the more the implications of asking 'why?' are explored, the more apparent it becomes that historians can go no further than locating events in time and place. The very notion of posing 'why?' questions is denounced by David Hackett Fischer in his very enjoyable book Historians' Fallacies, a syllabus of methodological blunders in which no practitioner would wish to be pilloried. Fischer insists that historians should confine themselves to framing questions around 'who?', 'when?', 'where?', 'what?', and 'how?', consigning 'why?' to the realm of the metaphysical. Noting that Benjamin Labaree had criticized a fellow historian for failing to explain the reasons for eighteenth-century American objections to British intervention in their affairs, Fischer scornfully wondered 'what kind of "why" answer would have satisfied Labaree. The hand of God?'45

At the root of Fischer's objection lies a complication in the meaning of the word 'why' itself. Sometime around the end of the sixteenth century, the practical 'why?' shouldered aside the more profoundly questing 'wherefore?' When Juliet appeared on her balcony to ask, 'wherefore art thou Romeo?', she was not trying to spot her lover's hiding place in the garden. Rather she was venting her anger at the cruel fate that the young man with whom she had fallen in love should belong to a family locked in a blood feud with her own. Written about 1595, Romeo and Juliet was one of Shakespeare's earliest plays. 
During the Bard's lifetime, 'why?' was already ousting 'wherefore?' in questions posed about the sort of cosmic injustices that are so hard to cope with when you are – like Juliet – only thirteen years of age. Shakespeare himself posed 'wherefore?' questions in only one play after 1599, even though it was in this phase of his career that he confronted the absurdities of fate in his great tragedies. Another example of the eclipse of 'wherefore?' can be found in the opening verse of Psalm 2, which was couched in the simpler 'why?' form as early as the King James Bible of
1611, 'Why do the heathen rage: and the people imagine a vain thing?' Writing Messiah in a great hurry in 1741, Handel used the slightly extended version of the verse from the 1662 prayer book, but substituted 'nations' for 'heathen.' He also anticipated Fischer by adding the warning that the hand of God would 'dash them in pieces like a potter's vessel.' Thus it was that generalized theoretical questions originally of the 'wherefore' type came to be posed instead through the short and severely practical 'why?'46

So long as we bear in mind that the practical 'why?' has muscled in on the more philosophical 'wherefore?,' there would still seem no reason to ban historians from asking why particular nations have raged together in specified conflicts, or why specified people have imagined vain things. Fischer was, however, on the right lines when he drew attention to the other interrogatives open to historians. If we are to define our search for causal explanation so that our questions do not produce ponderous 'wherefore?' answers, then 'why?' must be coupled with 'who?', 'what?', 'where?', 'how?', and 'when?' to form the simple vocabulary that Rudyard Kipling called his six honest serving men.

To ask 'what?' is a useful check to ensure that a historical episode has been correctly tagged. How should we categorize the turbulent events of 1837 in Upper and Lower Canada? Since the insurgent leaders, Papineau, Mackenzie, and Duncombe, were consciously engaged in challenges to an established political order, it seems reasonable to dispose of the 'what?' question by using the term 'rebellions,' the plural form noting that there were at least three distinct uprisings across the Canadas, centred on Montreal, Toronto, and London. 
By contrast, it is no longer acceptable to refer to the Native counterattack of 1763 against the advancing American frontier as 'Pontiac's rebellion': Pontiac did not see himself as a rebel because he did not accept that he was anybody's subject. (The even more dismissive term 'Pontiac's conspiracy' was at best a grudging tribute to Amerindian generalship.)47 For different reasons, 'rebellion' is not a clear-cut answer to the 'what?' question when applied to the 'troubles' at the Red River in 1869–70. Since there was a degree of ambiguity about both the effectiveness and the legitimacy of established order in the last days of the Hudson's Bay Company regime, the Metis could fairly plead that there was nothing for them to rebel against.
The question 'what?' can be usefully posed on its own, but the same cannot be said of 'why?' Applied to 1837 in Upper Canada, 'why?' has produced a macro-explanation of the rebellions that displays 'remarkable durability, particularly in popular literature and in student essays.'48 This explanation ranges back in time, at least as far as 1791, painting a bleak picture of incompetent government by selfish cliques in both provinces, an explosive time bomb primed with fraud and incompetence. One such irritation was the clergy reserves, the allocation of one-seventh of land for settlement in support of the clergy of the Protestant Established Church, a phrase which pitched Anglicans against Presbyterians, and a principle denounced as unfair by members of most other churches. Even if a precise causal relationship, the 'smoking gun' of detective fiction, cannot be traced to link the clergy reserves with the rebellions, most historians have regarded them, in a slightly misty way, as a factor in the situation. They might have profited from G.R. Elton's stern warning that a factor is no more than an official who manages a Scottish estate.49

It is indeed tempting to draw a causal straight line from the macro-explanation to the rebellions themselves, glossing over the connecting links, just as the Earl of Kildare assumed that his antipathy towards the Archbishop of Cashel was sufficient excuse for setting fire to His Grace's cathedral. If we can persuade ourselves that the government of Upper Canada was unjust and tyrannical, we are entitled to leap in a straight causal line to the fact that there were rebellions. Unfortunately, there are problems with such an analysis. Its factual basis is, to say the least, exaggerated: the government of Upper Canada was often inept but it was hardly despotic. 
Worse still, it strays into the realm of 'wherefore?', coming close to a generalized law of human behaviour stating that bad government provokes popular revolt. Historians, it will be argued in chapter 6, ought to be more open about the ideological baggage that they bring to their analysis, but they should not pass off their political principles as proven laws of causation. More damaging still, the explanation reflects a closed mental process, in which an assumption is used as its own proof: if bad government causes people to rebel, and since there were rebellions in Canada in 1837, then government in Canada must have been very bad indeed. The evidence for the badness of the government is the fact of the rebellions, and the badness of the
government caused the rebellions. This is a circular argument, and a sterile one. If the huge assumption that '1837 was implicit in 1791'50 hardly answers the question 'why what?', it is even less satisfactory in explaining 'why when?', since it glosses over the inconvenient fact that both provinces had managed to escape rebellion during the intervening forty-six years. If arrogant misgovernment explains rebellion, why were there no uprisings in 1809, when the British governor, Sir James Craig, was behaving like a petty tyrant, or in 1826, when Tory louts sent Mackenzie's presses to the bottom of Lake Ontario, or in 1828, when Lord Dalhousie clashed so pompously with Papineau? Once again, the answer is implicit in the assumptions behind the question: the fact that rebellions broke out in 1837 is the proof that it was in 1837 that misgovernment finally became intolerable. As an intellectual argument, this is little better than an exercise in the whereforing of Romeo, lacking even Juliet's excuse of adolescent passion.

The baneful effect of this moralistic macro-explanation can also be seen in its implicit failure to ask 'why where?' The rebellions of 1837 did not take the form of a single massive popular uprising from end to end of the Canadas. Rather, they constituted a scattering of outbreaks in different parts of the two provinces. Thus, the assumption that they were 'caused' by half a century of accumulated misgovernment is forced to treat the actual outbreaks as the tip of the iceberg. The question 'why where?' is tacitly dismissed as irrelevant: it is taken for granted that the colonial regime was obviously so unjust that somebody was going to resist it somewhere. But to dismiss a question is not to answer it. 
If rebellion was a response to repressive government, does this mean that the government was more unjust at Deux-Montagnes, where the habitants rose in rebellion, than it was at Trois-Rivières, where nothing much happened? The exploding time bomb of colonial misgovernment had been ticking under both provinces since 1791: why did it detonate in the delightfully named Upper Canada township of Sodom? If the rebellion to the north of Toronto was in some sense triggered by the uprisings around Montreal, why was there no revolt at Kingston, which was much closer to Canada's largest city and fell more firmly within its economic orbit? 'Why where?' undermines a number of tried but not necessarily true explanations.
Take the clergy reserves, generally held to have been both an obstacle to settlement and a visible symbol of unacceptable privilege that affronted honest freemen. Colin Read identified fifteen townships in western Upper Canada which were caught up in the rebellion, and found that on average each contained over 7000 acres of clergy reserves. However, in nineteen neighbouring townships that stood aloof from the uprising, the average size of the clergy reserves was almost 9000 acres. On the face of it, the British grand strategy of 1791, that the Anglican Church would operate as a force against disorder, would seem to have been working on the ground.51 However, such statistical correlations should be treated with some care. In British general elections, fringe parties opposed to the immigration of nonwhite people poll best in neighbourhoods with sizeable Asian and Afro-Caribbean populations. It would be unsafe to conclude from this apparent correlation that visible minorities vote for racist candidates.

'Why where?' is a step towards posing the central question, 'why who?' Can historians throw light on the hundreds of individual decisions to cross the line from disaffection to outright revolt? Attempts to categorize the rebels of 1837 point to general probabilities, but not to precise and determinant explanations. It appears that the native-born were more likely to have fought, but this does not mean that every rebel was born in North America, still less that everyone whose family originated from the United States took up arms. Even the partial correlation with American birth may be misleading, since many settlers in western Upper Canada had family roots south of the border. Thus, if the 'why where?' question could identify special circumstances leading to conflict in the London district, this in itself would carry implications for the disproportionate presence of Americans in any 'why who?' answer. 
Perhaps the surprising element is the participation alongside the Americans of so many migrants from Britain. If the Upper Canada rebellions were essentially American affairs, we should expect to find American leadership. Yet Colin Read and Ronald Stagg conclude that ‘the uprising at Toronto was uniquely the creation of a frustrated William Lyon Mackenzie,’ and Mackenzie had imported his frustrations from Scotland. Very few of the insurgents were members of the Church of England, but as Anglicans were relatively numerous among recent immigrants from Britain and relatively scarce among the American-born, it is not clear whether this correlation owes more to national origin or to the influence of the clergy reserves.52 The truth is that neither wealth nor religious affiliation nor national origin could permit the historian to look at Upper Canada on the eve of the uprisings and predict with certainty who would rebel and who would not. Trevelyan was surely right when he insisted that there is no scientific way of explaining the behaviour of a mass of human beings. Some historians have flirted with the Viking Syndrome, in this case seeking their explanation in economic causes. It is easy to establish that 1837 was a depression year, and the analysis may be fleshed out by learned allusions to upheavals in the banking system and credit structure sufficient to dislocate fragile frontier communities. Yet to postulate this hypothesis as an answer to the ‘why when?’ question requires the application of chronological blinkers: economic ups and downs were frequent experiences in the colonial world, and they were not all followed by revolts. The economic hypothesis also falters in the face of the question ‘how?’ Food prices were high in 1837, but we cannot convert this descriptive fact into a historical cause. It is true that most Upper Canadian rebels were farmers, but in itself that tells us very little, since most loyalists were farmers too: Upper Canada was a very rural society. In any case, as Douglas McCalla has observed, ‘it is not usual to associate farm discontent with high prices for farm produce.’ In the London district, rebels generally came from prosperous townships containing relatively high proportions of cultivated land, communities well beyond the pioneering stage of frontier life.
Verily, economic motives were moving in mysterious ways.53 In short, the economic hypothesis is not very helpful in providing answers to the question ‘why who?’, not least because there is a considerable difference between economic motivation and economic determinism. Perhaps the most unsteadily tottering debtor in Upper Canada was Allan MacNab, and we might have expected him to seize the opportunity to join the ranks of revolution. Inconveniently, MacNab not only rallied to the government but led his Hamilton volunteers to fight for the defence of Toronto. In any case, the evidence is incomplete. Proceedings for recovery of debts against twelve prominent rebels from western Upper Canada showed that several owed substantial sums of money. Unluckily, this is one of those instances where court records were not generated to help historians, but rather to enable creditors to collect cash owed to them by men who had fled to the United States. Other evidence suggests that the twelve were merchants and speculators, community leaders rather than desperate ne’er-do-wells. In all probability, they were creditors as well as debtors, but those who owed them money could be forgiven if they felt less urgency about making repayment. Moreover, it would surely require a massive dose of bourgeois hegemony to convert twelve rebels into a popular uprising. Even if their own debts had driven the twelve to insurrection, those debts can hardly explain why so many others joined in. Men were hanged for taking part in the Canadian rebellions, and it is hard to see why so many would have risked their necks for somebody else’s overdraft.54 In summary, an earlier generation of Canadian historians laudably attempted to move from narrative to analysis, and embraced a hypothesis that sounded impressive. Closely examined, there is no more reason to assume that the Upper Canadian rebels of 1837 were motivated by economic desperation than there is to believe that Viking raids were the product of over-population. The example of the Canadian rebellions of 1837 demonstrates that the first step in any attempt at historical understanding must be to define the events that are to be studied. Thus, the armed popular movements of 1837 aimed at challenging an established government, unlike the disturbances at the Red River in 1869–70, which were motivated mainly by the desire to fill the vacuum caused by its effective absence, and to influence the nature of the incoming Canadian regime. Next it is necessary to locate the events, first by place, and then, most crucially of all, in time. These are tasks that should be within the competence of established methods of historical research.
To go beyond that, to distill all the individual decisions to support or resist insurrection into a single overarching explanation, the verdict of history, is impossible. A thoughtful discussion of the 1837 rebellion in western Upper Canada leads Read and Stagg to the conclusion that ‘no single explanation – a solitary political grievance, a depressed economy, a general hunger for land or food – suffices to explain why men shouldered arms.’ Of course, there is no inherent reason why a single cause should ever be regarded as a satisfactory explanation for a historical event. Our problem is in fact far more profound and discouraging, for it is rather that the same combination of provocations could produce wholly diverse responses in different individuals. Thus, the problem of accounting for the decisions of some Canadians to rebel in 1837 and of others to support the established order can never be fully solved. Therefore, in the fullest sense, historical explanation is impossible. Worse still, attempts to account for the motives of individual participants are complicated not only by the usual problems of incomplete evidence but by the high degree of confusion that surrounded their understanding of events, thanks to a combination of poor communications and deliberate misinformation. Telling his supporters that ‘Montreal was unquestionably in the hands of Papineau,’ Mackenzie added that ‘he had no more doubt of Quebec’s being in the hands of the [French] Canadians, than he had of his own existence’ – although he had to admit that ‘he had not received positive information of it.’ Rebel leaders in turn then spread word across western Upper Canada that Mackenzie had taken Toronto. Some rebel claims were bizarre. Mackenzie was reported as warning that ‘the Lower Canadians would most undoubtedly turn their victorious arms against Upper Canada,’ a sentiment that casts a touch of doubt on the notion of a united revolutionary movement of the Canadian people.55 Settlers were also warned that a vengeful government was planning to arrest leading reformers and execute them under martial law. Indeed, so great was the confusion at the time that it may be tempting to launch a last-minute counterattack on behalf of historical explanation: if none of the participants really knew what was happening around them, then the historian can step in as an aloof observer and point to an underlying and unifying cause. There is undoubtedly something in McCalla’s contention that an indirect causal role can be attributed to economic factors in bringing about the events of 1837.
It is an approach that at least permits historians to locate the rebellions in time.56 Even so, we have no reason to assume that instability would trigger rebellion, rather than some other widespread social response. Historians appeal to a similar sense of crisis in Nova Scotia sixty years earlier as the ‘cause’ of the religious revival led by Henry Alline, which probably diverted Maritimers away from joining in the armed revolt of their fellow colonists against Britain. The Lower Canadian uprisings were much more clearly grounded in economic distress, although even there by no means all the disadvantaged areas rose in arms. More immediately, the way in which the authorities in Upper Canada had responded to the banking crisis had further undermined confidence in the system of government. A key element here was the unpredictable personality of the governor, Sir Francis Bond Head, whom the British prime minister subsequently called a ‘damned odd fellow.’57 As it happened, Head shared Mackenzie’s opposition to paper money, so it was hardly simple disagreement over economic policy that drove the rebellions. Rather, Head’s two wayward years in office suggested that he was capable of any kind of overreaction, even – as rebel leaders so luridly warned – letting loose the Indians to scalp peaceable settlers. The fact that many of the prisoners rounded up in the aftermath of the rebellions pleaded that they had been duped is not surprising. It was preferable to plead stupidity and survive than to admit treasonable principles and die on the gallows. Yet we should not totally abandon ourselves to the cynicism of Mandy’s riposte. There is no reason to believe that individual rebels made illogical decisions. Rather, their tragedy was to have taken what seemed to be an intensely logical decision to defend themselves on the basis of wholly misleading information. Where, then, does all of this leave historical explanation? If we conclude (as we must) that, ultimately, historical explanation is impossible, we may seem close to accepting that all forms of history are bunk and that historians serve no useful purpose. Daunting though archives and newspaper files may seem to the newly launched doctoral student, the truth is that relatively little evidence has survived for our scrutiny from the totality of the past.
Much of that material we cannot usefully interpret because it was compiled for purposes far removed from the questions we wish to pose, and some of it may even have been consciously designed to mislead. We cannot be sure that we have reconstructed all the antecedent events that may have shaped an episode; indeed, our only means of identifying which events we regard as appropriately antecedent is to filter them through the same causal hypotheses that they are supposed to sustain. To select a single event as the explanatory trigger may well distort the overall picture (or as much of it as we can recover), encouraging us to commit ourselves to one possible explanation to the exclusion of equally plausible alternatives. Once we seek to move from narrative to analysis, the pitfalls are if anything deeper, although they are also less obvious and far more seductive. Abstract concepts almost invariably mislead; erudite theories obfuscate as often as they illuminate. The Coldingham Fallacy and the Viking Syndrome are ever-present dangers; too often, academic debates recall the false antithesis that overtook the Lobster of Kirriemuir. Posing the question ‘why?’ risks stumbling into moralistic ambiguity: to argue, or perhaps worse still, to assume that bad government ‘causes’ rebellions represents not so much an explanation as an ethical judgment. Asking ‘why where?’ and ‘why when?’ can help to locate an episode in space and time, while ‘how?’ is useful in deflating manifestations of the Viking Syndrome. It is when we come to ‘why who?’ that we are brought face to face with the impossibility of explaining the sum total of individual human responses and decisions. At best, we are left with partial explanation, dependent upon incomplete and provisional causal inferences. In the laboratory sense, we cannot prove the accuracy of such explanations, simply because we lack the power to rerun events in the crucible of time past in order to vary the mixture of ‘factors’ and ‘causes.’ Yet to dismiss all study of the past simply on the grounds of the inadequacy of standard forms of historical explanation would merely be to leave a dangerously present-focused intellectual and social vacuum. (Even more basic, to argue for the impossibility of any form of explanation of events past would also destroy the intellectual underpinning of two of the foundations of the modern world, the legal system and the insurance industry.)
In any case, it is impossible to evade the paradox that some form of historical explanation is necessary, that it is an inescapable part of the human condition to seek to make some sense out of our past. Should we, then, be satisfied with the partial and incomplete explanations that so laboriously emerge from scholarly tomes? At one level, this compromise may go some way towards solving our dilemma, provided we remain faithful to Ralph Bennett’s injunction and resist the temptation to present inference as certainty. This is not always easy, as John Lightfoot showed in his triumphant and delighted proclamation of an unravelling of a biblical story that now seems nonsense from start to finish. To advance our theorizing as merely tentative and provisional may be an uplifting exercise in scholarly humility, but it cannot come close to resolving the central issue. Historical explanation could only be satisfactory if it were entire and complete, both in its ability to reconstruct events in the past and in its capacity to prove causal connections among them. Since our knowledge of the one is inadequate, and our ability to tackle the second no more than inferential, true and complete historical explanation must necessarily be impossible. What is needed is a strategy that enables historians to regroup and redirect their craft, seeking not so much to ‘explain’ the past in the mechanical terms of cause-and-effect as to locate events in relation to one another within the sweep of time. It may be objected, however, that the claimed impossibility of historical explanation represents nothing more than admission of the failure to secure total comprehension of the past in a single intellectual leap. Accept partiality as a first stage, it might be argued, and it becomes possible to move to a second, far more comprehensive and satisfactory stage of analysis. At the core of the partial explanation available to us lies one simple and all-embracing causal statement: most events happen as a result of the decisions that people take. Those decisions do not always consciously aim at the results they bring about. Drunk drivers probably do not intend to kill anyone else on the road, but the law and society will rightly conclude that their decision to get behind the wheel while impaired by alcohol renders them responsible for the tragedies that ensue. Even those great and impersonal forces of history, such as famine and industrialization, owe something of their timing and impact to human failures and human achievements.
Thus, while historical explanation derived from the autonomous interplay of abstract causes may sound very impressive, we have a better chance of understanding the past if we adopt a two-tier approach, starting with the multiplicity of individual human decisions. Once we have identified the decision-makers and the nature of the decisions they take, it should be possible to explore the rationality that lies behind them. That way, we can bypass the problem of the incompleteness of our own knowledge of the past, and arrive at patterns of logical underlying explanation through the medium of human choice. Unhappily, this approach merely transfers historical explanation to a higher threshold of impossibility. Historians study the nature, timing, and outcome of human decisions, but somehow a core issue has been overlooked. What is a decision, and how does it come about? Even if we accept that a decision is logical within its own terms of reference, there remains an unanswerable question of the selection of rationality. In most circumstances, alternative courses of action present themselves, most basically in the choice between ‘yes’ and ‘no.’ There are usually plausible arguments for saying ‘yes’ and there are almost always reassuring reasons for saying ‘no.’ To say that people make rational decisions is one thing. To understand why different people adopt entirely opposed responses to the same challenge is quite another. Chapter 4 suggests that a focus on the definition and unravelling of human decisions offers no easy answers.

4 The Moment of Decision


In the pages of Canadian history, Arthur Gordon is remembered (if at all) as the governor of New Brunswick whose prickly sense of self-importance complicated the province’s reluctant adherence to Confederation in 1867. Despite his remarkable ability to irritate both his colonial subjects and his imperial masters, Gordon continued to serve the British Empire in different parts of the globe. As the son of a former prime minister, Lord Aberdeen, he had the social influence to secure prestigious employment, and it seems to have suited his patrons to keep him as far from Britain as possible. A decade after he left the snows of New Brunswick, Gordon found himself with the splendid title (he liked titles) of British High Commissioner and Consul-General for the Western Pacific, with a roving commission over Polynesian island groups that were on the brink of being swallowed up by the imperial powers. In 1878, he visited Samoa to conclude a trade agreement with a deputation of local chiefs. Gordon’s mission was potentially a delicate one: the British had already annexed nearby Fiji; the Americans and especially the Germans were squaring up to pounce upon Samoa. The chiefs duly called upon the High Commissioner, and speeches were delivered and translated. To Gordon’s surprise, he discovered that the Samoans were going far beyond acceptance of a trade treaty. Distrustful of both the Americans and the Germans, they were offering him the unconditional cession of their islands, begging to be allowed to join the British Empire. It was a very tempting offer, but even while the orations were in progress, Gordon knew that he had reached two instinctive conclusions. The first was that he would have to say ‘no’: the prize was not worth the resentment that its snatching would cause to the Germans.
The second was that he would have to hightail it to the kaiser’s agents at the local German consulate as soon as he could decently conclude the meeting, to ensure that ‘no distorted report of what had happened should reach them before they heard it from myself.’ It was an exciting episode, one that reminded Gordon of the days when he had taken charge of the defence of New Brunswick in the face of a threat of invasion from the Fenians. He reflected too on the speed with which his mind had reacted. ‘It is curious with what clearness, rapidity, and decision mental operations are carried on at a moment of crisis.’ Historians attempt to explain why decisions are taken and study the consequences that flow from them. Yet, paradoxically, they rarely reflect, in the way that Gordon did in Samoa, on the nature of a decision itself. Perhaps it is that we take decisions just as we breathe and speak and walk, each of them basic attributes that we carry out unconsciously but could not easily explain to somebody else. If we are to move away from asserted causal links and misleading theories of explanation to view history as a collage of individual choices, we must begin by focusing upon the circumstances in which those individuals reach their decisions. We must also seek to understand how some propositions emerge as matters for choice, while others that may in retrospect seem equally plausible did not qualify for the contemporary agenda. Thus, for instance, a historian studying Canada in the eighteen-sixties should not only seek to understand how it was that Confederation was presented as a choice for the future of the provinces, but must also account for the fact that annexation to the United States was not presented as a practical alternative. When Arthur Gordon praised himself for his quick response to the offer of Samoa, he used the word ‘decision’ in a specialized sense, one that nowadays would be more often rendered as ‘decisiveness.’ As we shall see, by no means every historical personality was capable of reaching a decision so rapidly. Nor does the quality of decisiveness necessarily guarantee the wisdom of the response. Despite his previous lack of familiarity with international affairs, Harry Truman impressed his advisers with his ability to take a firm lead on foreign policy issues when he became president of the United States on the death of Franklin Roosevelt in 1945. ‘You could go into his office with a question and come out with a decision from him more swiftly than any man I have ever known,’ one of them recalled.
It was a quality less admired by the British, who were not always the beneficiaries of Truman’s fresh view of world affairs. They grumbled about the new president’s ‘snap judgements.’ Equally important, historians should be able to distinguish between those liable to waver and recall their decisions, and those who stood by them. In the former group was the Pakistani dictator Zia-ul-Huq, who styled himself ‘Chief Martial Law Administrator’ and signed his numerous decrees with the initials ‘CMLA.’ His detractors claimed that the letters stood for ‘Cancel My Last Announcement.’ Perhaps surprisingly, the latter group includes W.E. Gladstone, whose labyrinthine mental processes preceding the point of decision usually baffle historians. ‘It was not in the habit of his mind to go back on decisions once reached,’ recalled one of his ministers. ‘On the contrary, he was always disposed to repel doubts and hesitations, even those which he had felt before.’ In this, Gladstone seems to have been the antithesis of his own father, a wealthy Liverpool merchant. ‘I am seldom long in making up my mind,’ John Gladstone once wrote, ‘though I have sometimes had cause to regret having done so too quickly.’1 Evidently the quality was not hereditary. It is too easy to assume that everyone responds to challenges in exactly the same way. Failure to focus upon the nature of decisions, and the relative decisiveness of individuals, can produce historical writing in which ‘causes’ take control, and dictate events to the people involved. Identifying people’s decisions provides the key to presenting history, in the words of Maurice Mandelbaum, as ‘a linear sequence of intelligible human actions.’2 Decisions may be classified into three overlapping manifestations: mass, collective, and individual. Mass decision-making represents the sum total of millions of personal responses, but those individual choices can be subject to shared influences, such as bandwagon effects at elections or health scares in consumer behaviour, thus acquiring something of a communal quality. Collective decision-making occurs when people respond not individually or severally, but to determine the policy of an institution, such as a committee or a cabinet. It is this version that is usually subjected to studies of the ‘decision-making process.’ Perhaps it is the fastening of the notion of ‘process’ upon intuitive and instantaneous individual responses that explains why governments and corporations so often commit themselves to crazy schemes. Occasionally, collective decisions are entirely divorced from the opinions of contributing individuals.
In 1730, an Essex jury was unable to agree on a verdict in an assault case. Eventually, the foreman proposed ‘to put twelve shillings in a hat and hussell [sic] most heads or tails whether guilty or not guilty.’3 Happily for the cause of abstract justice, the accused was acquitted, although on the basis of a decision taken not by weighing the arguments but rather through counting the coins. Collective decisions do not always reflect the preferences of any of the contributing individuals, a truism recognized in the exasperated witticism that a camel is a horse designed by a committee. When the British cabinet faced the need for naval rearmament in 1909, the Admiralty argued for six new battleships while the parsimonious Treasury would only pay for four. Churchill recalled that in the face of a massive public campaign, ‘we finally compromised on eight.’4 On other occasions, a collective decision represents a consensus of individual decisions reached at convoy speed. A classic example of this can be found in the deliberations that led the British cabinet, in the week following 28 July 1914, to accept the case for war against Germany. As Hazlehurst’s detailed account makes clear, it was not collective decision-making focused upon a single problem that had a range of possible solutions. Rather it involved the reverse – a range of issues, each of them posing the narrow choice between peace and war. The same politicians who were averse to becoming involved in a Balkan war and resistant to the claims of an informal alliance with France gradually found themselves constrained by the implied obligations of that entente to defend the English Channel and committed to resist the violation of Belgian neutrality. In 1914, decision-making centred not so much upon the question of peace or war as upon selecting which of the potential issues would compel British intervention. The ‘supreme decisions’ that took Britain into war, Churchill recalled, ‘were never taken at any Cabinet.’5 However, even when collective decisions detach themselves from individual opinions, they do so as a result of a tacit agreement by the participants to set aside their own common-sense opinions (as they saw them), to put the coins into the hat, or be swept along by a public outcry or a tide of events. Thus, even apparently illogical collective decisions depend ultimately upon individual decisions to bow to a group dynamic. Before examining these individual responses, however, it is necessary to confront one massive objection to the strategy.
This is the contention that to study history in this way privileges those who have enjoyed the freedom to choose and the opportunity to record, thus taking the craft back to the bad old days of concentration upon elites, and especially of elites within liberal Western societies. Even then, social historians may object that even in those communities and throughout most centuries, the mass of people have lacked the wealth and the education necessary for the luxury of commanding any aspect of their own destinies. A determinist might go yet further and insist that as a result of either economic imperatives or social exclusion, many people have been incapable of exercising any influence over their personal destinies at all. Paradoxically, these objections, although weighty, do not get to grips with the real problem of treating history as a pattern of decisions. Since all survivals of the past are patchy and many are misleading, an approach based upon the scrutiny of decisions is neither more nor less plausible than any other. Indeed, by forcing us to focus upon the contrast between the instantaneous and often intuitive nature of the decision and the constructed rationalizations that constitute historical ‘evidence,’ it is an approach that calls into question all assumptions about the causal function of motivation. Hence it utterly undermines most forms of historical explanation. In any case, the decisions of the rich and powerful have also been narrowed by constraints (until the twentieth century, diplomatic history usually revolved around loveless dynastic marriages), while even the poor and the helpless usually have some opportunity to manipulate their environment. Studies of the antebellum American South have done little to mitigate the horror of slavery, but they have shown that even people deprived of the most basic of all human rights found ways of responding to the system, from low-level sabotage to running away or even occasional acts of mass resistance. All of these responses were the product of personal decisions, decisions which differed only from those taken by the powerful in that the consequences of an unsatisfactory choice were likely to be far more unpleasant. One of the paradoxes of the modern world is the high degree of intellectual ambiguity in the ways in which advanced countries run their affairs. Most societies operate everything from judicial systems to free markets on an assumption of some degree of individual responsibility, which implies an ability in each and all of us to make relatively informed and unconstrained decisions.
Yet much of the discourse that frames government policy, particularly that contributed by social scientists, rests upon generalizations that deny individual autonomy, such as ‘poverty causes crime.’ At the very least, this is a manifestation of the Coldingham Fallacy, if not the outright imposition of the Viking Syndrome. Without doubt, there is a correlation between poverty and crime, but the proposition has limited predictive value, since by no means all poor people commit crimes. To elevate it into a law of human behaviour adds insult to injury, the more so as it is honest people living in socially deprived areas who are most often the victims of crime. Nor does it begin to account for the inconvenient fact that some of the most breathtakingly avaricious crimes are committed by the very rich. None the less, the basic institutions of society assume individual freedom to decide, while much of the policy guiding those institutions is shaped by a determinist belief that people are not responsible for their own actions. The failure of historians to confront the nature of decisions is at least partly responsible for this confusion. There are social scientists who are aware of the paradox. Diego Gambetta sought to understand why some working-class teenagers in Italy remained in the education system while others dropped out. When he surveyed the academic literature, Gambetta found a conceptual split. Broadly, sociologists had confined themselves to one of two approaches. Some had concentrated on explaining why so many working-class children allowed themselves to be condemned to remain in the same social group and occupational categories. Others preferred to ‘ask how it comes about that many manage to escape the forces of social reproduction and the destinations that their ascribed status would predict.’ Few studies had bothered to take account of both possibilities, strongly suggesting that the spirit in which the question was posed tended to determine the answer that would be generated. Calling his book Were They Pushed or Did They Jump? Gambetta answered his own question: ‘If anything, they jumped.’ He did not disregard the obstacles that faced working-class school-leavers, nor the many constraints operating upon them.
‘They jumped as much as they could and as much as they perceived it was worth jumping.’6 Even in the face of social and economic hurdles, individuals can retain some degree of autonomy in making decisions. The familiar image that decision-making is a ‘process’ is helpful only insofar as it explains the narrowing-down of available options. A decision itself is an instantaneous choice of whether to say ‘yes’ or ‘no’ to a particular proposition. An illuminating simile may be found in the way in which computers operate, reacting positively or negatively with immense rapidity to a whole host of propositions. Of course people are not machines, but the responses of human beings are arguably just as instantaneous as those of computers, a fact captured in the conventional phrase ‘the moment of decision.’ Unlike computers, people do not face an infinity of choices, but are usually obliged to select from a finite range of available options. The real task for historians is to locate that moment in time and to account for the range of options on offer – in short, to ask ‘why when?’ and ‘why what?’ Explaining the decision itself, asking ‘why?’, is a much more perplexing exercise. Standard historical methodology seeks evidence from the speeches or writings of the person who made that decision in the hope of discovering an analysis of the case for and against sufficient to identify the crucial argument that determined the outcome. The problem with this approach is partly evidential: it privileges the articulate minority of humanity. They may be particularly adept at defending themselves. In any case, the process is as much one of rationalization as of explanation. Few politicians have provided us with more interminable speeches and more extensive writings, including his intensely private and self-exculpatory diaries, than Mackenzie King. Yet, according to his biographer, King ‘arrived at his decisions almost intuitively.’ The challenge for the historian is to determine how far King correctly interpreted his own intuitive motives when he compiled his own private record, and to what extent he disclosed them in his public utterances. It is unlikely that Mackenzie King was unique in his tendency to rationalize his decisions to himself and present them to others in a manner designed to make them palatable. Such accounts cannot automatically translate into ‘explanations’ in the sense that historians use the term. Among the twentieth-century historians of Canada, Arthur Lower stands out as one of the most argumentative, a scholar who was not afraid to proffer bold explanations of events and motives.
In 1933, Lower looked back upon his decision to fight in the First World War, admitting that ‘he did not enlist on matters of high principle so much as on an emotional impulse.’ Lester Pearson similarly recalled that his motives for joining an army medical unit in 1915 were ‘mixed and not particularly noble.’ His diary entry at the time offers little help, since it merely described his enlistment as ‘very sudden.’7 Indeed, many people seem unable to recapture even the moment at which they reached a decision, let alone the precise balance of arguments that swayed their choice. Of course there are examples in the history of intellectual speculation where the moment of decision comes
with startling clarity, as in the famous tale of the idea of gravity coming upon Newton in a flash as he watched an apple falling from a tree. Bertrand Russell could recall ‘the exact moment’ during his studies at Cambridge when he adopted the philosophy of Hegel. ‘I had gone out to buy a tin of tobacco, and was going back with it along Trinity Lane, when suddenly I threw it up in the air and exclaimed: “Great God in boots! – the ontological argument is sound!”’ Russell was an unusually loquacious decision-taker. Yet there are occasions when even the monosyllabic ‘Yes’ or ‘No’ does not capture the fact and moment of decision. As war loomed in July 1914, Winston Churchill knew that it was his duty as First Lord of the Admiralty to move the British fleet to battle stations in the North Sea in order to forestall the possibility of sudden German attack. He knew, too, that the cabinet was still divided over its response to the European crisis. Consequently, Churchill went direct to the prime minister, Asquith, and boldly informed him that he intended to order the warships into position to defend Britain’s east coast. ‘He looked at me with a hard stare and gave a sort of grunt,’ Churchill recalled. ‘I did not require anything else.’ Asquith’s grunt did more to protect Britain from invasion than Dowding’s pencil, but it is difficult to analyse a grunt as a decision-making process.8

The focus upon a decision as an instantaneous moment of stark choice sits uncomfortably with the literary, even wordy, techniques used by both historians and politicians, to say nothing of the analytical methods of social scientists. There is something anticlimactic about one of the most momentous decisions in the history of the twentieth century, Eisenhower’s unleashing of the Allied invasion of Normandy.
With the greatest armada in history poised on high alert to cross the Channel in the first days of June 1944, the weather suddenly turned foul. For two fraught days, Eisenhower authorized continual short-term postponements. Then, at 4:15 a.m. on 5 June, updated meteorological reports predicted calm waters and clear skies for just long enough to get the troops ashore. It is a story of high drama and tense excitement, but the denouement was almost ephemeral. Eisenhower’s decision took the form of just three words: ‘OK, let’s go.’ So fleeting was the instant that some who were present remembered the phrase as ‘O.K. We’ll go.’ Eisenhower’s staff were acutely conscious of its world-shaking implications.9

The problem for historians is even worse: the recorded moment may not represent the decision at all, but merely its retrospective articulation. Whichever three words Ike used in the small hours of 5 June 1944, they expressed a confirmation of a decision virtually taken several hours earlier, when he had provisionally set in motion naval movements which could not be delayed if troops were to hit the beaches around dawn on D-Day. ‘I don’t see how we can possibly do anything else,’ one participant remembered him saying shortly before the fateful moment.

It is permissible to suspect that many people do not formally take a decision on important matters so much as realize that, for them, a decision has been reached. In 1900, Edith Lyttelton had to choose between accompanying her husband, a prominent politician, on a mission to war-torn South Africa or remaining behind in England to look after their three small children, the youngest of them only two months old. Although she wished to be with her spouse, she ‘could not bear’ to leave her baby and ‘dreaded the sea journey inexpressibly.’ Thus far, her account suggested that the balance of argument would lead her to stay at home, but ‘one morning I woke up to find I had decided,’ and off to South Africa she went. This story has disquieting implications for historical explanation: if decisions are indeed taken intuitively, subsequent accounts of the motives behind them are likely to consist of rationalizations rather than reasons. For rational people, it may indeed be easier to imply control over key decisions rather than to admit that we are at the mercy of bewildering uncertainty.
‘I do not remember exactly what we did,’ wrote the founder of Ireland’s Gaelic League, Douglas Hyde, of the exciting climax of his romance with Lucy Kurtz: ‘[T]hings took their course, between hope and despair, certainty and uncertainty, doubt and assurance, anxiety and confidence, but each day the net was closing around my neck until we decided firmly and finally that we were going to get married.’10 Conventional phraseology may mislead us into claiming that we ‘take’ decisions. Sometimes, decisions take us. For William Coaker, a member of Newfoundland’s wartime coalition, reaching a decision to support conscription in 1918 was even more agonizing. ‘I never want to live those days over again,’ he recalled. ‘Every hour during day and night was to me torture. I could not eat or sleep.’ Although Coaker vividly remembered his mental ordeal, he does not
seem to have recalled the balance of argument that eventually determined his decision. All he could relate was that the issue settled itself in his mind the day before the crucial cabinet meeting. ‘I resolved to vote for Conscription, and the weights were lifted, and refreshing sleep came to the rescue. The issue was no longer uncertain to me. I realized what my duty was and resolved to do it, come what may.’11 There was a moment of decision (‘I resolved to vote for Conscription’), but the relief that followed suggests that it represented not so much a formal choice between options as a recognition that the decision had somehow taken itself. (‘I realized what my duty was ...’)

Is such a decision susceptible to the type of explanatory analysis practised by historians? Coaker’s biographer, Ian McDonald, believed that the decision could be explained by the need to maintain the coalition government in office, in the interests of Coaker’s own political movement, the Fishermen’s Protective Union. ‘Rejection of conscription would isolate the FPU and destroy Coaker’s chance of implementing an FPU fishery reform programme.’ No doubt such a consideration was among those present in Coaker’s mind, but this does not prove that it was crucial in swaying a decision that he claimed to have found so traumatic. Indeed, the almost flippant plausibility of the purported explanation might lead us to wonder why Coaker went through such an ordeal over the issue at all.
The root problem with McDonald’s analysis was that, from his standpoint as biographer, he thought it ‘strange’ that Newfoundland should even have considered compulsory military service.12 Coaker agonized in the face of pressure to send young Newfoundlanders, including his beloved fishermen, to ‘face German shell and poison gas and the hell of war.’13 Half a century later McDonald, an idealistic young scholar, concluded that, according to his own values, the answer should have been ‘no.’ Since, perversely, Coaker had opted for ‘yes,’ an explanation had to be sought in the squalid realities of partisan politics. It is easy for historians to impress by confident guesswork, but our chances of understanding the pressures upon William Coaker are improved if we focus upon an instant of response within a context of constraint. The truth is that we can never know precisely what determined Coaker’s interpretation of his duty because he was evidently incapable himself of analysing his own decision into minute parts.

For the obvious reason that a mistake could not be rectified, capital punishment was an issue, both in Canada and in Britain, which demonstrated that some people found decisions easier than others. The death penalty was generally supported in both countries until the middle of the twentieth century. The Toronto newspaper magnate George Brown, for instance, regarded hanging as ‘a grand thing for the criminal classes,’ but Brown was never called upon to confirm a death sentence. The governor general, Sir Edmund Head, who did, felt ‘the greatest horror’ towards the death penalty; ‘it used to depress him for weeks, and make him utterly wretched,’ so much so that his attorney general, John A. Macdonald, ‘used to dread having to announce to him when it was necessary.’ As Colonial Secretary, Lord Granville found it ‘a most painful thing ... to decide upon the life of a fellow creature’ when he found himself the final arbiter of a case in Prince Edward Island in 1869. Sir William Joynson-Hicks, the British Home Secretary in the nineteen-twenties who was seen as a fervent hanger and flogger, acknowledged that confirming the death sentence in one controversial case was ‘one of the most terrible moments of my life.’14 An earlier Home Secretary, H.H. Asquith, had been tormented by the fear that he had once sent the wrong man to the gallows until (so it was claimed) a prison chaplain bent the seal of the Catholic confessional to assure him that his conscience should be clear. As dawn approached on the morning of the hanging of two murderers, Sir Edward Grey, who had confirmed the executions, ‘kept meditating upon the sort of night they were having, till I felt as if I ought not to let them hang unless I went to be hung too.’15

By contrast, others took the issue in their stride. The commander of the British garrison in Canada occasionally deputized for the governor general.
Accustomed to the anguished response of Sir Edmund Head, Macdonald was ‘electrified’ when General Sir Fenwick Williams enthusiastically confirmed a death sentence, booming ‘of course he must be hung; hang, hang, hang them all when they deserve it.’ Herbert Morrison, Home Secretary in Britain’s postwar Labour government, was a stern believer in retribution, and comforted himself with the belief that the men he had sent to the gallows were ‘very bad, nasty people ... [S]ociety did not lose by their departure.’ He was even reported to have expressed a wish to watch the execution of a woman.
None the less, Morrison took his duty to review hanging cases very seriously.16 One of his predecessors, Sir John Simon, had placed on the wall of the Home Secretary’s office a Latin tag warning that no delay could ever be too long before taking a human life. It was removed in 1957 by R.A. Butler, who found the motto gruesome and the task ‘hideous.’ Butler would lock himself away for up to two days at a time considering all aspects of a case before confirming a death sentence. In Canada, in the same period, John Diefenbaker made capital sentences a subject for retrial by the cabinet, in debates that sometimes continued for hours.17

Where judicial sentence of death was subject to administrative and political confirmation, it is often impossible to disentangle the precise balance of arguments behind the final decision. Joynson-Hicks felt that he was ‘not bound by any technicalities, by any laws or any rules.’ Sir John Simon resisted any suggestion that Home Secretaries should state their grounds for confirming death sentences. ‘As long as we have men with a high sense of responsibility to hold the office, it is better for the public to put confidence in their conclusion rather than to start a debate about reasons.’ The wisdom of Simon’s position, although not of the death penalty, was shown in the controversy in 1955 over the execution at London’s Holloway Prison of Ruth Ellis. Ellis had recently suffered a miscarriage, probably as a result of physical abuse from a violent and faithless lover. Armed with a gun, she had confronted him in a histrionic scene and killed him in a wild fusillade of shots: her mental state was best captured in the fact that she refused advice to appeal against her sentence and announced that she wished to die.
Unfortunately, she had also slightly wounded a passer-by, who was understandably indignant and publicly poured scorn upon the notion that any English person could commit a so-called crime of passion. It was generally believed that the decision to refuse this sick woman a reprieve was driven by a concern to protect the general public from gun crime. If so, the irresistible conclusion must be that Ruth Ellis was hanged for being a poor shot – hardly the most obvious way to deter people from committing murder. This was almost certainly an example where analysis of the arguments distorted the nature of the decision, however wrong-headed and barbaric that decision proved to be.18

In the Ellis case, it is possible to hazard some sort of a guess at the balance of arguments that sent her to the gallows. Yet in other instances, the more earth-shaking the decision, the harder it is to pin down the precise reason that swayed it. Perhaps the most celebrated nineteenth-century account of an intellectual odyssey – one that remains impressive to this day – is John Henry Newman’s Apologia pro Vita Sua, in which he defended his decision to join the Catholic Church. A prominent figure in the Oxford Movement that sought to revive Catholic practices within the nominally Protestant Church of England, Newman gradually found himself unable to accept his own compromise between the two traditions. In 1845, after much agonizing, he went over to Rome. His detractors suspected that Newman had in fact been secretly received into the Catholic Church during a visit to Italy twelve years earlier. According to this magnificent conspiracy theory, he had been given a papal dispensation to lie and dissemble in order to create confusion in the Anglican camp before leading those he had deluded into the Catholic fold. To defend himself, Newman ransacked his own writings to produce an account of his changing religious opinions for a quarter of a century before 1845. Yet, remarkably, at no point in this account of one hundred thousand words of his search for religious certainty did he identify, still less describe, the precise moment at which he chose between ‘yes’ and ‘no’ on the issue of his own conversion.

In retrospect, Newman came to realize that by 1841 he was on his ‘death-bed, as regards membership with the Anglican Church,’ deliberately choosing this image to draw a tasteful veil over a period of ‘tedious decline.’ In 1843, he disavowed all former criticisms of Catholicism and resigned from his post as vicar of a prominent Oxford church; but even after this major step, ‘two years yet intervened’ before the final step.
‘I find great difficulty in fixing dates precisely; but it must have been some way into 1844, before I thought not only that the Anglican Church was certainly wrong, but that Rome was right.’ Still he hesitated. ‘To be certain is to know that one knows; what test had I, that I should not change again, after that I had become a Catholic?’ He sought to clear his mind by writing a book about the development of Christian doctrine. The book was never finished. ‘Before I got to the end, I resolved to be received.’ The greatest intellectual autobiography
of the nineteenth century was unable to identify the precise moment of its central event. In the aftermath of the decision, Newman was conscious of ‘perfect peace and contentment ... like coming into port after a rough sea.’ Understandably, he attributed this calm to his discovery of the true faith. More likely, it was the resolution of the issue itself that created this sense of peace. Harriet Martineau experienced very similar emotions (‘something more like peace than I had ever yet known settled down upon my anxious mind’) after becoming an atheist. So too did William Coaker once he had accepted the imperative of conscription, while a later Newfoundland politician, Clyde Wells, reported a similar lifting of his burden on resolving to torpedo the Meech Lake Accord. (‘Having made that decision, I felt for the first time at my ease.’)19

John Henry Newman’s relationship to the most important decision in his life seems almost passive, as if it were something that occurred externally to him. At the other extreme, Fenwick Williams took a workmanlike grasp upon life-and-death decisions, even if the life belonged to someone else. In a third category, Edmund Head confirmed death sentences but found the experience agonizing.

Procrastination is a natural response to the demand to decide, but one mid-Victorian Colonial Secretary, the Duke of Newcastle, took the process to extremes. The British civil servant Frederic Rogers noted that Newcastle was thrown off balance whenever Colonial Office policy-makers disagreed, ‘and rather catches at the notion of getting fresh advice, as if the bulk of advice made it easier to decide,’ a failing that also characterizes the modern ‘decision-making process.’ The duke was ‘oddly dilatory ...
even when he has substantially made up his mind’ about an issue: ‘he seems to hesitate at making the plunge, and goes on letting the idea simmer in his mind.’20 It was characteristic of Newcastle to write in 1853 that it was ‘impossible to delay any longer a decision’ on whether to reform Canada’s Legislative Council. ‘Decision, however protracted, is at last inevitable,’ he announced when he extended the term of office of an increasingly angry Lord Elgin, who after seven successful years as governor general could not see any reason why the matter should be spun out at all. Newcastle played a key role in adjusting British policy towards accommodating a British North American push for Confederation when it would eventually come. His cautious refusal to rush his fences irritated his contemporaries, but contributed an unobtrusive and positive quality to the founding of modern Canada.21

Every decision is a double decision, requiring us to ask, first, ‘why when?’ and only then ‘why what?’ The operative second part is the action of saying ‘yes’ or ‘no’ to the available options. The historical core is to be found in that crucial first part, the decision to take a decision. We should begin by seeking to understand not simply why a decision was taken but when it was taken. This helps us to appreciate why, at that time, some options but not others were available to those making the decision.

If, for the Duke of Newcastle, procrastination was a personal characteristic, for many politicians it has proved to be a tactical approach to policy-making. Detractors claim that it is the working principle of the British Foreign Office: ‘All decisions have consequences, all consequences are unpredictable, therefore take no decisions.’22 In an era of slow communications, it was certainly an attractive way of administering the colonies. When Lord John Russell became Colonial Secretary in 1839, he was advised by his predecessor, Lord Normanby, that the real art of the job lay in deciding whether problems ‘press for immediate decision or ... might by postponement dispose of themselves.’ The latter, Normanby cynically advised, was ‘a process to which ... many Colonial questions are not unapt to yield.’23 Joseph Pope defended John A. Macdonald’s nickname, ‘Old Tomorrow,’ on similar grounds. ‘He was a firm believer in the efficacy of time as a solvent of many difficulties which beset his path, and his wisdom in this regard has time and again been exemplified.’ ‘Postpone, postpone, abstain,’ was F.R. Scott’s scornful epitaph on Mackenzie King.
His Australian counterpart, Robert Menzies, remarked in 1943 that ‘the popular device in politics is postponement.’ Even those politicians who seem determined to seize destiny by the scruff of its neck were not always averse to procrastination. ‘Sometimes there are great advantages in letting things slide for a while,’ wrote Winston Churchill in May 1945 as he faced the chaos in defeated Germany. ‘I’ve learned one thing in politics,’ Margaret Thatcher remarked to an astonished supporter in 1986. ‘You don’t take a decision until you have to.’24 The instinctive strategy of a cautious individual can easily be grasped at as a constructive evasion by an irresolute group. Thus, Asquith’s cabinet discussed the issue of peace or war on 29 July 1914 and, as one of them put it, ‘decided not to decide.’25

Asking ‘why when?’ to locate decision-making in time is the essential preliminary to posing the question ‘why what?’ to account for the propositions that were available for a yes/no response. The narrow range of practical alternatives that were available may sometimes puzzle later observers, just as Ian McDonald found it ‘strange’ that Newfoundlanders should be obliged even to consider compulsory military service in 1918. The familiar phrase ‘a choice of evils’ reminds us that we do not always have the option of saying ‘yes’ or ‘no’ to a range of ideal choices. In 1849, Canada’s governor general, Lord Elgin, had to face a public-order crisis in Montreal where, in the absence of effective local policing, anglophone Tory mobs repeatedly staged riots against what they perceived as ‘French domination.’ Effectively, Elgin had two options: he could either allow the Tories to rampage and (as it proved) eventually destroy themselves politically, or he could bring in the military, shoot down the protesters and so turn irritating thugs into unforgettable martyrs. ‘My choice was not between a clearly right and clearly wrong course,’ he later recalled, adding the heartfelt comment: ‘how easy it is to deal with such cases, and how rare they are in life!’ In 1898, Bernhard Wise urged the people of New South Wales to ignore objections to the proposed federation of the Australian colonies: ‘until we learn the secret of obtaining absolute perfection in human affairs, a political choice must always lie between the lesser and the greater of two evils.’ This constraint operates with particular inflexibility in military matters.
As Kitchener put it in 1915, ‘[U]nfortunately we had to make war as we must, and not as we should like to.’ More succinctly, that same year Churchill remarked that ‘it must be “Aye” or “No” in war.’26 Hence, in Newfoundland, Coaker was faced with one particular proposition, the imposition of compulsory military service, thanks to the ‘why when?’ imperative of a war in Europe that seemed on the point of being lost. In an ideal universe, Coaker might have invited the Kaiser to Newfoundland and dramatically negotiated a lasting peace. Or he might have re-equipped his fish plant to invent a death ray that would kill Germans and leave Allied soldiers unharmed. In the real
world, he had neither the opportunity to propose alternatives nor the possibility of asking for delay. Eventually, it seems, he realized that the issue had resolved itself in his mind by some process that transcended rule-of-thumb rationality.

One of the most central landmarks in Canadian history was the decision to adopt Confederation in the eighteen-sixties. Why should this decision have forced itself in that decade, and why was it the union of the provinces that presented itself as the matter for choice? Historians have usually taken the situation of the British North American provinces in 1864 as the starting point for explanations of the proposal for Confederation, thereby deducing causes from circumstances. This is a misguided approach, although it is easily sustained from contemporary evidence. The politicians defended their actions with a plentiful supply of reasons culled from the needs and challenges faced by the provinces. Well, they would, wouldn’t they? One of the most obvious of those challenges was the underlying political crisis in the province of Canada which consumed two ministries within three months of 1864. A fundamental error in the received historical explanation has been the assumption that the Canadian political crisis provides an answer to the question ‘why what?’, that only through a union of all the provinces could Canada’s sectional pressures be dissolved. In reality, the political problems of the province of Canada answer only the prior question, ‘why when?’ The equal representation of Upper and Lower Canada within the united legislature was becoming increasingly hard to defend. Some historians have slipped into the easy verbal trap of using the term ‘deadlock’ to describe Canadian politics in 1864. The image is unhelpful to an attempt to locate the Confederation initiative in time, since by its nature it suggests total inability to move.
A far more illuminating metaphor, as well as one that is more appropriately Canadian, would be that of a logjam. Moreover, the logjam was about to break: the parliamentary correspondent of the Montreal Gazette was ‘surprised to find’ how many Lower Canada members were prepared to concede the Upper Canadian case. Hence the importance of George Brown’s committee of twenty leading politicians who met in secret sessions during May and June of 1864, and whose French-Canadian members proved to be unexpectedly keen to find a way out of the
province’s problems.27 It is noteworthy that a number of opponents of Confederation concerned themselves less with the ‘what?’ of the proposal – indeed, many of them accepted it as a reasonable long-term aim – than with the ‘when?’ of raising it in 1864. Their response was to deny the existence of any fundamental problem that could require so large an immediate solution.28 They proved to be a minority. Thus, we can locate in time ‘why when?’: the decision to take a decision about the future structure of British North America can be traced to the urgently felt need for radical change in the political system of the largest province.

However, the first stage of the decision, the ‘why when?’ of Canadian Confederation, does not necessarily help us in tackling the second, ‘why what?’ What propositions for change presented themselves? If the provinces had embarked on a modern-day ‘decision-making process,’ they would perhaps have commissioned a management team to review their position and prospects. The outcome of such a consultancy exercise might even have been a recommendation to seek union with the United States: one critic warned that the alleged arguments for uniting with the Maritime provinces would be ten times more alluring if applied to annexation.29 Such a process has occurred once in Canada’s recent history, with the emergence of the free-trade option in the nineteen-eighties. ‘This country could not survive with a policy of unfettered free trade,’ Brian Mulroney had pronounced in 1983. ‘We’d be swamped.’30 Within two years, opinions on the issue – including those of Mulroney himself – were shifting.
The deliberations of Donald Macdonald’s royal commission on Canada’s economic future were partly responsible for ventilating the issue, although it is possible to invert the causal hypothesis and suggest that the Macdonald enquiry was created in response to uncertainty about the continued viability of a Canadian economy unless it became integrated on a continental scale. By contrast, back in 1864, George Brown’s committee commanded neither the research facilities nor the time available to a modern royal commission, and consequently it concentrated on the two alternatives that already formed part of Canadian political debate. Thus, it would be an exaggeration to conclude that British North Americans rejected annexation in the eighteen-sixties. The truth is that the available options were narrowed in such a way that they were
never seriously invited to consider the theoretically plausible possibility of union with the United States at all. In retrospect, one answer looms above all others, the creation of a union of all the existing provinces with the potential to expand and embrace the whole of British North America. Although glamorous, this solution faced one large handicap: it would require the consent of those other provinces, whose politicians, Maritimers and Newfoundlanders alike, might be more inclined to say ‘no’ to Canada’s problems, rather than ‘yes’ to Canada’s opportunities. There was, however, an alternative, smaller-scale proposition, the reconstruction of the province of Canada as an Ontario-Quebec federation. This could combine a guarantee of local autonomy for the predominantly French and Catholic community of Lower Canada with representation by population in the common legislature that would place the dynamic force of Upper Canada in the saddle. This solution of a purely Canadian federation would have suited Cartier, the leader of the Bleus, and certainly appealed to Brown, the most intolerant voice of Upper Canadian Reform. Against this could be set a potential problem at Westminster, where British politicians were unlikely to hold their noses and ratify a constitution based on the embarrassingly democratic principle that political power should necessarily follow mere weight of numbers.

It is easier to identify the two options available to Canada’s Great Coalition (even if modern textbooks sometimes pass over the smaller union in silence) than it is to account for their presence on the table in June 1864. Neither policy aim can be seen as the product of a massive groundswell in popular opinion. The idea of general union had received a muted response when it was raised by the Canadians in 1858, and Christopher Dunkin alleged that politicians ‘did not waste a word or a thought upon this gigantic question’ in the six years that followed.
This was not quite true: John A. Macdonald had devoted a whole sentence to the issue in a lengthy election manifesto in 1861. In May 1864 he defended himself against the charge of inactivity by pleading that it was necessary to wait until the other provinces were interested.31 The restructuring of the province of Canada was if anything even less actively canvassed. It was the preferred solution of the most underrated creative political thinker of the Confederation period, A.-A. Dorion, but he had failed both in 1858 and 1863–4 to deliver his section of the province in government. Upper Canadian Reformers had committed themselves in 1859 to the creation of 'some joint authority' to link the two parts of Canada, but the formula had been adopted not for any merits of its own but rather as a device to head off a radical demand for an outright break-up of the Canadian Union. However, even if the smaller federation was hardly a prominent cause, by 1864 it had some attractions for French Canadians as a way of protecting their culture while giving ground to the demand for change. One of the less noticed achievements of the Canadian Union was about to bear fruit: the codification of French civil law was almost complete, adding to the attractions of creating a local assembly for its management and development. As Joseph Cauchon told the readers of his Quebec City newspaper, Lower Canadian rights could be protected by federating the two Canadas just as effectively as they might within Confederation.32

It was the collective decision of Canada's coalition cabinet at its formation in June 1864 which determined that these two forms of political reconstruction, and these two alone, were the practical options for reform. The immediate choice was further narrowed down to the single solution of Confederation by the intercolonial conferences held at Charlottetown and Quebec later that year. In this, the politicians took advantage of a latent notion that British North America would one day unite, as will be discussed in chapter 5. However, with far less comradely spirit than most historians have observed, Canada's Great Coalition remained pledged to devote just twelve months to the pursuit of the larger union of the provinces before falling back on the smaller federation of the two Canadas. It was the existence of the smaller scheme as an alternative proposition that prompted the decision of Nova Scotians and New Brunswickers (or, at least, of enough of them) to accept Confederation.
If they ever wished to become part of a larger union, the decision was now or never, however unpalatable some of the terms. If the Great Coalition could not secure the larger union within a year, ‘we Canadians must address ourselves to the alternative and reconstruct our Government,’ Macdonald warned. ‘Once driven to that, it will be too late for a general federation.’33 The collective acceptance of Confederation by the provinces of Canada, New Brunswick, and Nova Scotia in 1867 was an amalgam of
many individual decisions. By presenting neat patterns of interconnecting causes for any historical event, those who write textbooks gloss over the probability that some of its supporters may have subscribed to the solution not because they were persuaded by the package as a whole but rather because it was compatible with their particular priorities, interests, or ideologies. At best, the bundles of explanations fashioned by historians only approximately relate to the totality of individual decisions.

Is it possible to pare down the three million people of British North America to identify the key decision-makers of the eighteen-sixties and account for their actions? At first sight, the task seems daunting but not impossible. In 1865, there were 286 members of the six colonial assemblies. The total number of elected legislators in the Confederation period was slightly larger thanks to the normal wear-and-tear of politics caused by death and defeat at the polls. In addition, some politicians, such as Etienne Taché, exercised influence from the less powerful upper houses: for one glorious moment in 1866, the destiny of British North America was in the hands of the nominated placemen of the Legislative Council of New Brunswick. Newspaper editors were generally little more than mouthpieces: the most outspoken, such as Brown and Cauchon, were politicians in their own right. Church leaders, especially Catholic bishops, could also command political support, and there was an occasional public figure, such as Nova Scotia's Joseph Howe, operating outside the colonial legislatures. On realizing that we are talking of perhaps three hundred people, our historical spirits ought to rise a little. In theory, it ought not to be an impossible task to unravel the reasons that triggered three hundred decisions for or against Confederation.
It has already been suggested that the massive amounts of oratorical evidence provided by those decision-makers were intended not to help historians but rather to bombard one another. We cannot even be sure that a decision to endorse Confederation was motivated by any of the perceived or alleged advantages of the scheme itself at all. Some Conservatives in June 1864 regarded the whole exercise as a brilliant manoeuvre by Macdonald to trap the Grits. From New Brunswick, Arthur Gordon reported 'an indisposition to believe that the change is seriously meditated and an inclination to regard the plan rather as intended to produce by its agitation some immediate effect on the condition of existing political parties than as designed to inaugurate a new Colonial system.'34 The Confederation issue certainly did generate some remarkable cross-party alliances in both Canada and Nova Scotia, while New Brunswick's Tilley started the eighteen-sixties as an advanced Liberal in Fredericton and managed to end the decade as a Macdonald Conservative (in all but name) in Ottawa. Thus, perhaps some politicians backed Confederation for reasons that may have been largely unrelated to the causes listed in the historical textbooks.

One of the first clerics to support the project in Lower Canada was Monsignor Louis Laflèche, an intellectual priest who was anything but afraid of intervention in the public sphere. Having ministered to the Catholic people of the Red River for twelve years, Laflèche was well equipped to think in British North American terms and to articulate measured support for the creation of a transcontinental nation. In reality, he did no such thing. Without a trace of embarrassment, he explained that he did not have time to undertake minute study of the complex scheme proposed. He said 'yes' to Confederation simply because the alternative was a sectional conflict that would end either in civil war or Upper Canadian domination.35 Laflèche is a reminder of the shortcomings of any form of historical explanation that assumes the operation of logical causes independently of human decision-making. Nor should we discount the possibility that some embraced Confederation not because they subscribed to the whole package of arguments that constitute a textbook explanation, but rather because just one of them happened to intersect with their own political agenda.
A revealing example of the intellectual modus operandi of a colonial politician was supplied by the legendary member for Charlotte County in New Brunswick, who considered each proposal for legislation purely in terms of its effect on his riding. ‘I ain’t got anything to do with the Province,’ he was said to have explained. ‘I sits here for Charlotte and if they tells me it’ll do good for the Province but do harm to Charlotte then says I “I go in for Charlotte.”’ Conversely, he would support legislation even ‘if they tells me that it’ll harm the Province but do good for Charlotte.’36 Fortunately, the needs of Charlotte County were basic and simple. Tucked away in the far southwest of New Brunswick, it was hardly located on the high road of transcontinental destiny. Indeed, after a brief moment in the limelight of the Fenian raids, Charlotte County vanished from the mainstream of Canadian history until
1985, when the special need to sell substandard tuna from the StarKist canning plant indicated that there were still those who were prepared to go in for Charlotte at the expense of the wider public interest.

British North America was a patchwork of Charlottes, each of them electing representatives whose first duty was to secure local interests. 'We are all yet mere provincial politicians,' Macdonald admitted in 1869. James Dickson was the member for the newly settled counties of Huron and Bruce, a burgeoning part of Upper Canada which was seriously under-represented in the Assembly. Dickson was so single-minded in pressing his constituents' grievance that he was jocularly accused of threatening secession. Federation of the provinces was all very well, he argued, but 'Huron and Bruce were nearly the same size' as the smallest of the Maritime provinces 'and contained about the same population, and yet Prince Edward Island had an Executive Council, a Governor and a Legislature.' Since Dickson voted for Confederation, it seems that he felt that the project was compatible with the achievement of justice for his riding, even though there was no proposal to confer provincial status upon the Bruce peninsula.37

However, that inference does not entitle us to jump to the conclusion that any politician who supported one of the textbook causes of Confederation would necessarily support the scheme itself. J.B. Pouliot was a fervent supporter of the Intercolonial Railway, one of the workhorse causes that features in any historical explanation of Confederation. The Intercolonial would have to pass through his riding, the narrow neck of Témiscouata that lay between the St Lawrence and the United States border, and sluggish Témiscouata badly needed the economic stimulus that a railway would bring. Yet in Pouliot's case, enthusiasm for one part of the Confederation package did not extend to support for the whole scheme.
Unlike Monsignor Laflèche, he saw the union of the provinces as a threat to the cultural survival of French Canada.38 The inclusion in the Confederation package of a major incentive that appealed to local interests may have persuaded Dickson but did not capture Pouliot. The behaviour of human beings, as Trevelyan said, cannot be reduced to scientific laws. Historical 'causes' are one thing; personal responses quite another. The conclusion would seem to point inescapably to the impossibility of historical explanation in the rational manner in which it is usually practised. Because individual responses are so unpredictable, there is no way of distilling them into a credible summary that begins: 'British North Americans accepted Confederation for the following reasons ...' Even the relationship between popular opinion and the decision-making politicians is opaque. Macdonald, who was no woolly democrat, dismissed the idea of consulting the voters through a general election as an 'obvious absurdity.'39 The masterful Reformer George Brown favoured democracy so long as the people supported him, in which case it hardly seemed worth the bother of asking them. Insisting that he had 'no fears whatever of an appeal to the people' on the Confederation issue, Brown poured scorn on the idea of holding an election 'at a vast cost to the exchequer, and at the risk of allowing political partisanship to dash the fruit from our hands at the very moment when we are about to grasp it!'40 It was enough for Brown that the political elite was virtually united in supporting Confederation. Yet in Nova Scotia, R.G. Haliburton found that 'the combination of political leaders, so far from recommending the scheme, filled their partizans with as much dismay, as if the powers of light and darkness were plotting together against the public safety.'41 As Canada's politicians were to discover afresh with the rejection of the Charlottetown Accord in 1992, an elite consensus was as likely to provoke popular mistrust as general acquiescence.

Even when an issue is (apparently) endorsed by voters, it remains dangerous to generalize about their motives. Prince Edward Islanders stood aloof in 1867, but the Dominion was to gain its seventh and smallest province after the Island elections of 1873. Does this mean that they had actively embraced Confederation?
Pro-Confederation politicians insisted that the issue had been fully debated at the polls, but opponents alleged that Islanders had not been given enough information to make an informed judgment.42 Well, they would, wouldn’t they? In short, elections represent a moment of collective choice in which the generalizations of causal analysis almost certainly fail to capture the complexity of a mosaic of individual decisions. Thus, the textbook commonplace that New Brunswick voted against Confederation in 1865 but in its favour in 1866 can be proved only in terms of outcome, of the use that politicians made of those election results. Who can say that this necessarily reflected the intentions of those who voted
for those politicians? A key milestone between the two appeals to the voters was the victory of Charles Fisher in the York County by-election of November 1865. However, insofar as the York County vote was a turning point in New Brunswick politics, it was probably as much because it coincided with shifting attitudes towards Confederation within the provincial legislature as because of any movement in popular opinion. Fisher, for instance, carefully distanced himself from the issue, insisting that the voters should be consulted on any future scheme for intercolonial union. Consequently, to convert the York County by-election into a referendum on Confederation, the Saint John Globe was obliged to invoke the Coldingham Fallacy: 'The real issue is between Confederation and anti-Confederation.' York County voters, who had given large majorities to anti-Confederates eight months earlier, now returned Fisher by a handsome margin. Had they changed their minds about Confederation? The more New Brunswick voting is examined at local level, the clearer it becomes that we shall never know precisely what motivated individuals as they cast their ballots. Merely because an election was held at the time of Confederation, we are not entitled to assume that voters regarded it as about Confederation.43

Indeed, a general election is a democratic ritual of mass decision-making, in which millions of people pause to cast a vote that effectively says 'yes' or 'no' to the government that rules over them. (Some citizens, of course, vote for, or against, particular candidates, regardless of party affiliation. Others cast their ballots in support of single-issue campaigns. As always, individual motives may vary, but their cumulative outcome is the same: a decision whether or not to fire the government.) It is hard to better the comment of the Winnipeg newspaperman J.W.
Dafoe on the Canadian election of 1896, which he called 'the classic example of a logical and inevitable end being reached by illogical and almost inexplicable popular processes.'44 By this, Dafoe, who was a devout Grit, meant that virtue triumphed in the election of Laurier, even though a disturbingly large number of Canadians had persisted in the error of voting Conservative. Even more perverse was the response of voters in different parts of the country to what the politicians claimed was the central issue of the election, the rights of Catholics in the Manitoba school system. Manitoba voters failed to punish the reluctant Conservatives for seeking to impose an unwanted solution upon them, while Quebec's Catholic electorate spectacularly failed to reward them. Indeed, to add to the inscrutability of the outcome, across the Dominion the Conservatives polled over nine thousand votes more than the Liberals. It was the lottery of a single-member, first-past-the-post voting system that placed them thirty seats adrift of Laurier's Liberals. The only possible conclusion is that voters did not define the issues in the same way as contemporary commentators and subsequent historians: perhaps they were less impressed by the way in which the Conservative government had dithered and bickered than by the solution that it was finally driven to adopt. In the face of tens of thousands of individual decisions in the polling booths, we simply cannot solve the mystery. Indeed, we cannot even pose the question in the form, 'Why did the voters prefer Laurier to Tupper in 1896?', since the Liberals did not even win a majority of the popular vote.

In a sense, Macdonald was right to dismiss the idea of consulting the voters on Confederation as an 'absurdity' since general elections are rarely single-issue affairs, however much it may suit politicians to proclaim that a particular issue is at stake. Individual voters may accept that the election must decide the issue that the politicians have placed before them but refuse to endorse their remedy. Equally, they may determine that wholly different considerations should sway their votes. Even when one issue does tower above all others, it cannot be guaranteed to produce overwhelming agreement among voters.
The most convincing candidate for a single-issue poll in Canadian history is the general election of 1917, usually referred to as the 'conscription election.' The mere fact that the Toronto Mail and Empire declared that a vote for Laurier was a vote for the sinking of the Lusitania would not justify us in concluding that the 40 per cent of voters who cast their ballots against the government were endorsing unrestricted submarine warfare. Few modern campaigns could have seemed more dominated by a single issue than the 1987 provincial election in New Brunswick, which was overshadowed by the bizarre lifestyle of Conservative premier Richard Hatfield. The result, a Liberal sweep of all 58 ridings, would seem to justify a shorthand explanation that New Brunswickers decided upon a complete change – but even so, a remarkable 28 per cent of voters, more than one New Brunswicker in every four, managed to find reasons to swallow their embarrassment and vote for candidates pledged to Disco Dick.45 Faced with a miners' strike in 1974, the British prime minister, Edward Heath, went to the polls, offering voters 'a choice between moderation and extremism.' It would be unfair to conclude from the fact that he lost that voters opted for extremism, not least because the outcome was a near dead heat and a minority government. None the less, it would appear that Heath had been unable to impose his own chosen issue upon the electorate.46

Yet even to conclude that the loser got the issue wrong does not justify us in asserting that the winner got it right. Mackenzie King chose to fight the Canadian election of 1926 by asking the Canadian people to agree that the governor general, Lord Byng, had been wrong to instal a Conservative minority government after the Liberals had failed to win an outright majority the previous year. (Sad to relate, this solemn issue of principle has in recent years been dismissed as 'the King-Byng wing-ding.') Persuaded that he had been chosen by God to battle for the rights of the people, the Liberal leader devoted the evening on which the election was called to singing hymns. He then proceeded to throw himself into a campaign for the people's rights, ignoring the pleas of his advisers to 'go easy' on constitutional niceties. His Conservative opponent, Arthur Meighen, on the other hand, sought to fight the election on grounds of his choosing, emphasizing scandals under Liberal rule and insisting that 'the one outstanding issue ... is clean government.'47 The fact that Mackenzie King won the 1926 election with a handy majority proves neither that Canadians rejected clean government nor that King was correct in identifying the role of the governor general as the key issue. Although the Liberals did poll better, there was no massive shift in the popular vote between 1925 and 1926.
The main reason why King won enough seats to form a majority government second time around was that his party managers succeeded in negotiating electoral pacts in Ontario and Manitoba with the Progressives, the small third party which held the balance of power. Yet even this explanation takes for granted the individual decisions of thousands of Ontario and Manitoba voters to endorse that electoral deal, and we cannot know precisely why they chose to do so. Blair Neatby argues that the constitutional issue has acquired a retrospective prominence
because politicians and historians have needed a reason to explain why King did better in 1926 than in either of his two previous campaigns. 'There had been no such controversy in the previous campaigns; it seemed logical to assume that the new issue accounted for the Liberal victory.' Neatby condemned such reasoning as fallacious. 'No election in Canada has ever been decided on a single issue.'48 With electoral pacts in two key provinces, Mackenzie King might well have secured the same result had he wandered across Canada reciting 'post hoc, ergo propter hoc.' We shall never know why a Dominion-wide majority of twenty-five thousand Canadian voters preferred King (or King's candidates) over Meighen and his promise to root out corruption. By focusing upon the yes/no of individual decision-making, we identify the key question in historical explanation, but condemn ourselves to the frustration of being unable satisfactorily to answer it.

From this survey of individual, mass, and collective decision-making, it would be all too easy to conclude that people operate entirely beyond the realms of logic, that every human decision represents nothing more than a random and entirely arbitrary dip into the options on offer. Such a conclusion would be at least an exaggeration and probably an outright distortion. Once we think of a decision as a momentary matter rather than a process, we may indeed be doubtful of the subsequent structures of rationalization offered by that minority of people sufficiently self-analytical and articulate to explore and express their motivation. None the less, we have no reason to dismiss the assumption of rationality at that moment when the decision was taken, or was recognized as having happened. It may be conceded that even if decisions are instantaneous and intuitive, this does not prevent them from representing instinctive perceptions of personal values or intelligent self-interest.
Given that most issues involve some form of choice, either among a range of options or more simply between ‘yes’ and ‘no,’ our problem as historians lies rather in our inability to penetrate the reasons why Rational Course A was perceived by the person taking that decision to be more persuasive than Rational Course B, C, and so on. Our incomprehension can only deepen when, as Ian McDonald found in accounting for Coaker’s endorsement of conscription, we regard the option chosen as utterly irrational.
While we cannot fully explain the rationality behind the decisions that shaped past events, we can at least attempt a different form of historical understanding by locating them in time. Every decision has its own formative past, the process that determined the 'why what?' of options available for the yes/no response, and can help us to grasp the 'why when?', the culmination of a need to choose at a particular moment in time. That much is implicit in most historical writing and requires merely to be brought into sharper focus. There is, however, a dimension that is too often missing. Decisions were not simply taken in the frozen moments of time, however brief, that historians place under the microscope. Even if decisions themselves are instantaneous and hence historically irrecoverable in their occurrence, they do not come about solely within a balance of momentary forces. Rather they are taken with an eye to consequences, an awareness of ends that might be achieved or problems that might be prevented. 'To take a rational political decision,' wrote George Orwell, 'one must have a picture of the future.'49 The most vital context of any historical decision was not its potentially recoverable past present but its far more speculative past future, the spirit which asked, 'What's in it for Charlotte?' To convert the raw past into history, to locate events in time, we need to analyse and classify those past futures.


5 Past Futures


‘If we could first know where we are, and whither we are tending, we could then better judge what to do, and how to do it.’1 The opening words of Abraham Lincoln’s ‘House Divided’ speech in 1858 contain the key to the task of locating events in time: present decisions are taken with an eye to future consequences. As Oakeshott pointed out, ‘[T]he present we occupy in practical understanding evokes future. Indeed, it evokes a variety of futures ... [I]n every action we seek a future condition of things, uncertain of achievement and sure only of its transience.’ In every area of life, our present is constantly overshadowed by the future. Investment strategies explicitly aim to profit by second-guessing what may come: there is even a section of the financial services industry that trades in ‘futures.’ Public health policy urges us to change our behaviour in the present to prevent epidemics that may lie decades ahead; advances in medicine unleash treatments with no sure knowledge of eventual side-effects.2 As individuals, we enter into marriage or start pension plans on the basis of assumptions about our own distant futures. Historians have not always explicitly recognized that the people who shaped the past were themselves influenced by notions of what was to come. Our craft is moderately effective at taking an episode that happened in, say, 1850 and imposing certain preconceptions about relevant antecedence to trace its origins back to 1830, or earlier. Yet that constitutes only half the task of locating an event in time. For a full understanding, it is necessary to re-create how the people of 1850 imagined that their world would develop by 1870, or 1900. Past futures came in different forms, optimistic and pessimistic, long-term and short-term. 
Just as some people find the taking of decisions harder than others, so there are those whose ideas of the future are long-term and precise, while others are reluctant to speculate beyond the present, or perhaps incapable of accepting that their world may change. Futures are uncertain and unknowable, but since people speculate about them intellectually and speculate on them financially, the attempt to analyse and classify them is vital to the extension of the process of historical understanding. ‘The future is beyond our ken,’ remarked the Irish-born journalist W.H. Russell in 1865, after visiting Canada. Consequently, for some it was always tempting at any given moment in the transient present
merely to 'do what is right and leave the future to work out its destiny.' Obviously, it is a fallacy to assume that people willed the consequences that hindsight can trace as the inexorable outcome of their actions: the Japanese warlords who bombed Pearl Harbor may have been morally responsible for the retaliation that fell upon them, but it would be absurd to claim that they consciously willed Hiroshima. Conversely, Oliver Cromwell once remarked that the man who rose highest was the man who did not know where he was going.3 Abraham Lincoln applied the point to the whole of American society when he reviewed the course of the Civil War and the fate of slavery in 1865: 'Neither party expected for the war, the magnitude, or the duration, which it has already attained. Neither anticipated that the cause of the conflict might cease with, or even before, the conflict itself should cease. Each looked for an easier triumph, and a result less fundamental and astounding.'4 Ignorance of the problems that would arise helps to explain the launching of massive projects from the Crusades to the Canadian Pacific Railway, although it is harder to understand why, when gigantic obstacles became apparent, those involved did not heed the sage advice of Stephen Leacock: if at first you don't succeed, quit.

People have responded to the problem of the unknown future in different ways. Some have believed that prediction could fill the void. 'Ignorance of the future can hardly be good for any man or nation,' warned Goldwin Smith in 1877; he filled the void with forecasts that were as confidently delivered as they were frequently reversed.5 Another Victorian professor, J.R. Seeley, claimed that 'history ought surely in some degree, if it is worth anything, to anticipate the lessons of time.
We shall all no doubt be wise after the event; we study history that we may be wise before the event.’ For all his bombast, Seeley seemed unsure which discipline would provide that crystal ball, since he also argued that ‘students of political science ought to be able to foresee, at least in outline, the event while it is still future.’6 In its most extreme manifestation, confidence in prediction takes the form of a claim to ownership over destiny. ‘You cannot fight against the future,’ Gladstone told the House of Commons when it rejected parliamentary reform in 1866. ‘Time is on our side.’7 As it happened, he was right: a Reform Bill was passed within a year. In a television interview in 1957, Nikita Khrushchev assured Americans that their
grandchildren would be Communists. A commentator at the time noted that this was a modification of his earlier view, that the cause would triumph in a single generation, but that it was none the less a sincerely held key to Soviet long-term strategy.8 Conversely, some claims to ownership of the future may in reality represent not so much confidence in inevitability as attempts to find consolation for isolation in the present. In 1972, the left-wing Labour MP Ian Mikardo announced that the British monarchy would be abolished within a generation, a handily vague measure of time although one which seems to have elapsed. On the other side of the world, Donald Horne made the mistake of committing himself to a time span that was both shorter and more precise, announcing in 1976 that Australia would be a republic within ten years.9

A more adaptable attitude to the future is fundamental to understanding the success of some political adventurers. 'What we anticipate seldom occurs,' wrote Disraeli at the close of a career that proved his own dictum; 'what we least expect generally happens.'10 In a friendly exchange in 1933 with Lord Linlithgow, the future Viceroy of India, over his opposition to political concessions to Gandhi's nationalist movement, Winston Churchill parried the accusation that he sought to return to the long-lost world of 1900: '[Y]ou assume the future is a mere extension of the past whereas I find history full of unexpected turns and retrogressions.'11 Churchill's view of the future reflected his opportunistic personality, but arguably prevented him from constructing a larger picture. His chief of staff, Sir Alan Brooke, once tried to restrain one of Churchill's wartime initiatives by enquiring how it fitted into overall strategy.
In reply, ‘he shook his fist in my face, saying, “I do not want any of your long-term policies, they only cripple initiative!”’12 Brooke’s point was that Churchill ‘must know where he was going, to which he replied that he did not want to know!’ He was at least confident in his own destiny. Early in his career, Churchill was impressed by the fatalistic motto of a South African leader, M.T. Steyn: ‘Alles sal reg kom.’ Churchill quoted it frequently, both in Dutch and English: ‘All will come right.’13 Others have tried to ignore, even deny, the future, adopting responses that were either fatalistic or downright irresponsible. The previous chapter noted the curious fact that John Henry Newman wrote over one hundred thousand words of intensely per-

114 / Past Futures: The Impossible Necessity of History sonal spiritual history without actually identifying the moment of his conversion to Rome. His Protestant critics assumed that Newman must have been well aware of the path that he was treading, and consequently accused him of operating as a Trojan horse during his last years within the Anglican Church. They were wrong to do so. Newman’s attitude to the future was caught in the lines of his famous hymn, Lead Kindly Light: ‘I do not ask to see / The distant scene; one step enough for me.’ Fearfully watching his friend’s gradual defection from the Protestant cause, Gladstone perceptively identified Newman’s likely conversion to Rome as ‘not a conclusion finally reached in his mind, but one which he sees advancing upon him without the means of resistance or escape.’14 Another Oxford graduate, Lester B. Pearson, seems to have adopted a similar attitude. It is generally assumed that he was a relatively ineffective prime minister because throughout his five years in office Canadian voters denied his Liberal party a parliamentary majority. However, whenever he was warned by advisers of possible pitfalls ahead, he would airily respond: ‘We’ll jump off that bridge when we come to it.’15 Minority government may not have been the only constraint upon Pearsonian achievement. Refusal to look to the long term can lead governments step-by-step into eventual disaster. 
Looking back in 1824 on the failure of British colonial policy in the previous century, the prime minister, Lord Liverpool, concluded that ‘we got into our difficulties in North America from not looking forward, from the hope that what was likely to take place never would take place.’ As a result, the British government became trapped by policy positions ‘on which we should never have committed ourselves if we had in the first instance a large and extensive view of the consequences of what we were doing.’16 On the other hand, there may be times when it does not make sense to think too far ahead. In 1904, Canada’s opposition leader, Robert Borden, believed that Laurier’s policy of building a second transcontinental railway would be ruinous to the country, but he found that even his own Conservative supporters were reluctant to back him. One commented that the scheme ‘will give us good times for at least ten years and after that I do not care.’17 As a point of view adopted just a decade before the outbreak of the First World War, this might even be regarded as unconsciously prescient.
Arguably, a more disingenuous attempt to dismiss the future was produced by Sir Edward Grey, who had been Britain’s foreign secretary at the outbreak of war in 1914. In his memoirs a decade later, he denied that long-term considerations affected British policy at all. The inspiration behind British diplomacy was ‘not to be found in farsighted views or large conceptions or great schemes.’ Rather, his predecessors at the Foreign Office had been ‘guided by what seemed to them to be the immediate interest of this country without making elaborate calculations for the future.’ This was special pleading, a response to those critics who had censured Grey for failing in those crucial summer days of 1914 to make it crystal-clear to Berlin that, if pushed to the limit, Britain would fight alongside the French. Hence his insistence that foreign secretaries had always sought to avoid ‘creating expectations that they might not be able to fulfil.’ Of course, this picture of short-term opportunism hardly explained why in 1914 Britain should have gone to war in support of a guarantee given to Belgium as far back as 1839. It was also in marked contrast to Grey’s bleak evocation of the domino theory in his speech to the House of Commons on 3 August 1914, in which he warned that Holland and Denmark would follow Belgium in losing their independence, to the long-term detriment of Britain’s position. There was no focusing upon ‘the immediate interests of this country’ in his bleak warning to MPs: ‘[D]o not believe, whether a great Power stands outside this war or not, it is going to be in a position at the end of it to exert its superior strength.’ Perhaps Grey sought to deny that policy evolved within a continuum of expectation in order to forget that he had offered the consolation of what proved to be one of the most unreliable predictions of all time.
Thanks to the power of the British navy, he had assured MPs on that August day, ‘if we are engaged in war, we shall suffer but little more than we shall suffer even if we stand aside.’18 Denial of the future has been a special characteristic of privileged but insecure minorities who doubted their long-term ability to hang on to power. Ireland’s Protestants provide one example. Replying in the Irish parliament to a proposal aimed at benefiting generations to come, the eighteenth-century politician Sir Boyle Roche could not see ‘why should we put ourselves out of our way to do anything for posterity,’ adding triumphantly, ‘What has posterity ever done for us?’ Roche’s
oratorical absurdities were capable of reducing the legislators of College Green to hysterics, as on this occasion when he reportedly added the clarification that ‘by posterity, he did not at all mean our ancestors, but those who were to come immediately after them.’ He supported the Union with Great Britain as the best hope for long-term Protestant security, arguing that the two countries were sisters who ought to embrace as one brother. One of his successors placed a similar emphasis upon Ireland’s place within the United Kingdom, and was equally hostile towards policies aimed at the future. Offered in 1914 the compromise of the simultaneous enactment of Home Rule but postponement of its operation until 1920, Sir Edward Carson dismissed it as a stay of execution on a sentence of death.19 An inverse but similar process can be seen in British attitudes towards the future of their empire in India, which began in optimism, declined into pessimism, and ultimately took refuge in denial. ‘The sceptre may pass away from us,’ the historian and politician T.B. Macaulay had said in 1833, claiming that the independence of India under British parliamentary institutions would be ‘the proudest day in English history.’ Thirty years later, after the terrible upheaval of the Mutiny, the essayist Walter Bagehot noted the death of ‘a notion – not so much widely asserted as deeply implanted ... that in a little while, perhaps ten years or so, all human beings might ... be brought up to the same level.’20 The implantation of Western-style institutions in India now seemed less likely. Could the British hold on indefinitely? Professor Seeley insisted that since the Indian empire defied rationality by existing at all, there was no reason why it should not continue to do so indefinitely. For all the stiffness of his upper lip, that most grandiose of viceroys, Lord Curzon, was less sure.
He banned the hymn Onward Christian Soldiers from the church service at the Delhi Durbar of 1902 as ‘particularly inappropriate.’ It was not the affront to the feelings of the Hindu and Muslim soldiers of the Raj that troubled him. Rather he objected to the lines ‘Thrones and crowns may perish, / Kingdoms rise and wane.’21 The result was that by the early twentieth century, British governments were making policy for India with only a vague idea of the subcontinent’s eventual destination. ‘What will be the future of India, fifty, sixty, or a hundred years hence,’ a British minister assured Parliament in 1909, ‘need not ... trouble us.’ Britain’s duty was ‘to
provide, as best as we can, for the conditions of the moment.’ It was not just coincidence that Lord Crewe should have quoted Sheridan, an Irish Protestant, in refusing to contemplate the fate of India in distant times, when ‘all of us are dead and most of us are forgotten.’22 Thus, one way in which the analysis of past futures can assist historians is by establishing whether the people we study had a structural attitude to the unfolding of events, or dealt with the uncertainties of their own location in time by attempting to ignore what might come afterwards. Identifying and categorizing such past futures is at the very least an antidote to the inconveniently omniscient disadvantages of our own hindsight. What matters is where they thought they were heading, not where we know they would eventually arrive. Yet in one important dimension, it is hard for people reared in a modern secular culture to enter into the universe of most English-speaking people until recent times. ‘We do believe,’ wrote the robustly worldly Anthony Trollope in 1874, ‘that if we live and die in sin we shall after some fashion come to great punishment.’ Precisely how potent was the belief in the afterlife may be open to doubt. John Lightfoot in the mid-seventeenth century observed that although many believed the Day of Judgment to be imminent, ‘yet few prepare for the end, which they think, is so near.’ James Stephen, later to become the kingpin of the Colonial Office, concluded in 1810 that ‘nine-tenths of the hurly-burly of London’ would cease if ‘each individual really believed that in the lapse of a few years at farthest, he should commence an endless course of agony or ecstasy’ based upon divine judgment of one’s conduct in this life.23 By 1896, Gladstone could regretfully conclude that the notion of posthumous punishments ‘appears to be silently passing out of view.’ Gladstone had made his own choice between the eternal and temporal two decades earlier.
During his brief retirement from active politics, he had begun to write on Future Retribution, but in 1877 he was ‘called away’ (Gladstone’s decisions were often forced upon him by a higher power) to campaign against Turkish rule in Bulgaria, a hell very much of this world.24 Attitudes towards past futures can sometimes be more easily recoverable from unspoken assumptions than from explicit statements. The point was borne upon one of Sir Robert Peel’s reactionary allies during
the Reform crisis of 1831, when the disintegration of civil society seemed a real possibility. ‘Look at any country house which has been built in England for three hundred years,’ wrote J.C. Hobhouse in some bitterness, ‘and you will see that the owners have never dreamed of being obliged to provide against the attack of a Mob.’25 The peaceful English country houses that we now interpret as evidence of a golden past were once powerful statements of a confident belief in an endlessly comfortable future. It is no accident that the unfortified country house, one of the features of sixteenth-century England, took much longer to arrive in more turbulent Scotland. As late as 1632, the Lowland laird Sir Robert Kerr urged his son to be cautious in converting the family castle into domestic accommodation: the building, he advised, should be ‘strong on the out syde because the world may change agayn.’26 Political gesture may be as effective as architecture in conveying complex insights into past futures. In the crisis of 1940, the Canterbury School Board in distant New Zealand ordered children to begin classes each day by singing ‘There’ll always be an England.’ There was a tense ambiguity even in this explicit invocation of the survival of an ‘England strong and free’ at a moment of desperate danger, but the gesture also revealed far deeper assumptions about New Zealand’s eternal place in the world: ‘If England means as much to you / As England means to me.’27 Once past futures have been recovered from prediction or assumption, how can they be used to enrich historical study? We should start with the stern methodological point laid down by the historian Ron Norris in the debate over the decision of Australians to vote for federation at referenda held in 1898 and 1899.
Since federation would remove intercolonial tariffs, some historians felt that the outcome of those popular votes could be explained in terms of winners and losers from free trade. However, in a study of South Australia, Geoffrey Blainey dismissed the argument altogether, since 40 per cent of the ‘Yes’ vote came from wheat-growing districts, even though wheat farmers ‘could, and did, gain little from the removal of colonial tariffs.’ Norris was devastating in his response. What happened to wheat exports after 1901 was ‘not relevant.’ All that mattered ‘is what was thought likely to happen at the time of the referendum.’ South Australia’s wheat lands had plenty of other economic interests, and even farmers
who specialized in wheat were pinning their hopes, not without reason, on more open markets for flour.28 The rebuttal is a reminder that it is unwise for historians to assume that merely because people lived a long time ago, they did not know their own business. The point made by Norris is so obvious that it hardly seems necessary that it should be spelled out. Yet the start-and-finish dates of some major episodes are so deeply engraved in our memories that it requires a conscious effort to appreciate that contemporaries were aware only of the first and could not foresee the second. Everyone knows that the First World War began on 4 August 1914 and ended at the eleventh hour of the eleventh day of the eleventh month of 1918. Contemporaries, of course, were aware of the first date but could not foretell the second. ‘I am afraid the war will go on for ever at this rate,’ Clementine Churchill wrote to her famous husband in January 1916. It is through this filter of pessimistic expectation that we should seek to understand an apparently mysterious episode in British politics, the Maurice Debate of May 1918. Historians have generally criticized Asquith, who had been ousted as prime minister eighteen months earlier, for challenging his successor, Lloyd George, on military policy just six months before the end of the war. In the aftermath of heavy German attacks two months earlier, General Maurice had publicly accused Lloyd George of lying about the strength of the British army in France. Asquith, who had remained silent since his own downfall, chose to take up Maurice’s allegations.
Even more puzzling, Asquith chose not to mount a frontal attack but instead delivered a low-key demand for a committee of enquiry.29 Asquith’s tactics have been condemned not in their own terms but because they have been filtered through two huge chunks of hindsight: first, that the war would soon end in an Allied victory and, second, that Asquith never thereafter recovered his previous political stature. Both considerations are totally irrelevant to an assessment of his actions in May 1918, the first especially so. Nobody knew that the war was going to end so soon. Only a few months earlier, a former governor general of Canada, Lord Lansdowne, had courageously called for a negotiated end to the fighting, since Allied leaders ‘tell us that they scan the horizon in vain for the prospect of a lasting peace.’30 Thus, in interpreting the politics of May 1918, two months after the British
commander-in-chief in France had described his troops as having their ‘backs to the wall,’ it makes more sense to assess Asquith’s political skill in terms of his pre-war mastery rather than his post-war decline. Indeed, his decision to press for a committee of enquiry was evidence not of any loss of political grip but the product of his enduring political cunning. With the Germans shifting soldiers to the West from the collapsing Russian front, the most likely past future immediately looming over the Maurice Debate was a further series of hammer blows in France that would expose Lloyd George’s lies far more effectively than any amount of Asquithian oratory. Far too often, historians allow their retrospective knowledge of events, or even their own crotchets, to obscure understanding of how the people they study reacted in the freeze-frame of a past present. Frederick the Great’s sudden seizure of Silesia in 1740 forms a dramatic event in eighteenth-century diplomatic history. Silesia stuck out at right-angles from his Prussian kingdom, dividing Saxony from Poland. The Elector of Saxony was king of Poland, but the powerful Polish nobility ignored their sovereign, since he could not get at them. If Saxony grabbed Silesia first, Poland would be tamed and Prussia hemmed in. In a biography first published in 1904 that long remained an authority in the English-speaking world, W.F. Reddaway recognized that the Elector was eyeing the prize, but simply overrode Frederick’s past future.
‘Had we no other guide than the map, we might be tempted to guess that it was to avert this peril that Frederick seized Silesia.’ Ignoring the absurdity of Austria-Hungary, still sprawling across the map in 1904, Reddaway pronounced that differences in language, religion, and national temperament would have made it ‘impossible’ for Saxons and Poles to form a single state.31 The annexation of Silesia proved to be a milestone in the rise of Prussia, relegating Saxony to a minor role in German history. Equally, another central European problem, the Munich crisis of 1938, looms large in the story of the march of Hitler. Verdicts on British and French policy at Munich have been distorted by an unresolved glitch in historical methodology. To assess what happened in 1938, historians turn first to the opinions of the people of 1938. Unluckily, most of the people of 1938 who troubled to comment were ferocious partisans, whose objectivity may be gauged from the titles of books such as Guilty Men.
It was all too easy for a more detached second generation of analysts to confer evidential status on the polemic of the first. This process creates a consensus, in the tendentious form of the judgment of history (or of historians), that becomes difficult to shift. In the case of Munich, the conventional wisdom has taken the form of condemnation of the Western democracies for abandoning Czechoslovakia in a craven response to Hitler’s bidding that still endows the term ‘appeasement’ with a pejorative meaning. The root problem with this condemnation is that it depends upon just one alternative past future, the assertion that Hitler could have been stopped in his tracks had the leaders of Britain and France, Chamberlain and Daladier, called his bluff. The various shortcomings in the assumption are easily summarized. For the British, who were still rearming, an eyeballing confrontation with Hitler would have been a high-risk strategy indeed. Nor was it certain that the factious French would stand firm. In any case, there was little that either country could have done to save the Czechs. The real weakness in the scornful denigration of the Men of Munich lies in the problem that it entirely ignores another alternative past future: what would have happened to Czechoslovakia if the Western powers had indeed succeeded in calling Hitler’s bluff in 1938? The Czechs and Slovaks in their fragile partnership constituted barely two-thirds of the republic’s population. About a quarter of the citizens of Czechoslovakia were German-speaking, concentrated in local majorities in border areas. It may seem astonishing to us, but they voted for politicians who campaigned to become part of Nazi Germany. Critics of the Munich agreement were reduced to denying that the people of the Sudetenland were Germans at all, portraying them rather as Teutons trapped in a temporary identity crisis from which they would emerge once their aspirations were firmly blocked.
Churchill appealed to the ‘Sudeten Deutsch’ to ‘become trusted and honoured partners’ in a free Czechoslovakia. As a solution to a European crisis, this was fantasy on a grand scale. More realistically, one of Chamberlain’s supporters pointed out that there was ‘nothing more ridiculous than a guarantee that the frontiers of Czechoslovakia should not be violated when half the people in that country could not be relied upon.’ Perhaps Hitler could have been faced down and even ousted altogether in 1938, but international brinkmanship alone would not have stilled the complaints of Czechoslovakia’s
restive minorities, for we should not forget that the country’s Poles and Hungarians were equally discontented. Above all, as Taylor rightly pronounced, ‘the German national movement ... was a fact; and those who wanted to “stand by Czechoslovakia” never explained how they would deal with this fact.’32 It is far more likely that a tough stance by Chamberlain at Munich in 1938 would have committed Britain to backing the Czech government in an unpalatable and futile confrontation with its own unwilling citizens in 1940 or 1942. Hindsight has discounted this particular lost past future largely because of the engulfing effect of the Second World War upon our view of all preceding episodes. Moreover, the Sudeten problem, like eighteenth-century Saxony and Poland, has vanished from the modern map, because the Czechs after 1945 drove the German-speaking population out of the Sudetenland altogether.33 By contrast, Chamberlain’s preference for shifting the boundary seems both far-sighted and humane. Thus, in seeking to understand any single episode in the past, we have to remember that for those participants who shaped the outcome, it represented just one step in a range of probable continuums. This may be seen in three events that were highly specific in time, the outbreak of the American Civil War in 1861, of the Anglo-Boer War in 1899, and of the First World War in 1914. In each case, hindsight may tempt historians to suggest that compromise could have averted the conflict. Unfortunately, such a view would overlook a basic consideration, one that was all too obvious to those forced to take decisions from a narrowing range of options: the mere avoidance of war would not have constituted the solution of the underlying problems. Thus, a compromise in 1861 or 1899 or 1914 could offer no guarantee that war would not break out in 1862 or 1900 or 1915, when conditions for victory might be less favourable.
Compromise schemes aplenty were dangled before President-elect Abraham Lincoln during the secession winter of 1860–1, but in a laconic comment he set his face against further concessions to the slave-owners. ‘The tug has to come, & better now, than any time hereafter.’34 Of course, Lincoln wanted a political showdown, not a shooting war. Alfred Milner, Britain’s High Commissioner in South Africa on the eve of the Boer War, was more cold-blooded in his rejection of concession. In May 1899 he declared that ‘it would be better to fight now than 5 or 10 years hence, when the Transvaal ... will be stronger and more hostile than ever.’35 The German ambassador to London, Lichnowski, claimed in July 1914 that a similar mood of ‘pessimism’ existed in Berlin, where fatalistic diplomats were arguing that ‘trouble was bound to come and therefore it would be better not to restrain Austria and let trouble come now rather than later.’36 Thus far it has been argued that to understand the people of the past we must discover whether, like Seeley or Gladstone, they believed they could foresee and even control the future, or whether they adopted the fatalistic alternative view of Newman that they were passive in the grip of unknowable destiny. If neither they nor we can fully explain the decisions they took, we can at least seek to locate their actions in time by reconstructing the anticipated past futures to which they tried to respond, and to bear in mind the stern reprimand of Ron Norris as we explore the options open to Frederick the Great. Is it possible to go further, and to categorize futures themselves in order to suggest how they influenced human behaviour in the past presents that we study? At the very least, any given moment in the past can only be understood by establishing whether its predominant past futures were optimistic or pessimistic, long-term or short-term. By and large, this exercise is absent from most historical analysis. Remedying that defect is the key to placing our craft at the heart of all social and political discourse. Perhaps the simplest way of envisaging the future has been to assume that it will be a continuation of the recent past. ‘I can only look forward to the future by the lights given me by the past,’ remarked Christopher Dunkin in 1865 as he dissected the Confederation scheme in terms both savage and distressingly prescient. A century later, his words were echoed by Donald Creighton.
‘Of course, the trends and tendencies of contemporary Canada are not infallible pointers towards its future; but they are the only indications we have got.’37 Sometimes, this approach can be reactionary rather than progressive. Premier Maurice Duplessis, no enthusiast for change himself, used to joke in the nineteen-fifties that the people of Quebec fitted rear-view mirrors to their automobiles to see where they were going.38 A decade later, Quebecers proved more clear-sighted. The Churchill Falls agreement of 1969 entitled them to buy hydro-electric power from Labrador at a
price fixed for forty years. Their Newfoundland neighbours took for granted that the low inflation rates of the post-war period could be projected forward indefinitely. They soon found themselves giving their power away. A variant of the future as continuity is the future seen as the reenactment of specific events in the recent past: generals, it is said, are good at fighting the last war. By the early twentieth century, many in Britain not only expected conflict with Germany, but assumed that it would resemble the War of 1870, a short campaign of rapid movement culminating in a knock-out blow. Hence the Punch cartoon of 1909, entitled ‘The Great War of 19––.’ ‘It’s pretty certain we shall have to fight ’em in the course of the next few years,’ remarks an intimidating major, in a clear but unspoken allusion to the Germans. A naive subaltern replies: ‘Well, let’s hope it’ll come between the polo and the huntin’.’39 When the crisis did come in August 1914, it was widely assumed that the war would be over by Christmas. One motive for the initial rush to enlist, especially in distant Canada, was fear of arriving in France too late for the fun. Closely related is the domino theory, widely seen as the driving force behind American policy in the Vietnam war: the fall of one small southeast Asian country would touch off the collapse of the next. This was not a doctrine invented in the Pentagon. It had been invoked by Sir Edward Grey when he warned that Holland and Denmark would follow Belgium into the German maw as an hors-d’oeuvre for the gobbling of France. Similarly, Britain and Canada declared war in September 1939 as an expression of a refusal to wait their turns as the final dominoes to fall in a pattern of Nazi conquest. ‘Austria yesterday, Czechoslovakia today; what of tomorrow and the day after?’ the Winnipeg Free Press had asked at the time of Munich. ‘Hitler’s behaviour ...
shows that he meant to go to war with us sooner or later,’ wrote a Cambridge don, adding ‘one might as well swallow one’s pill now rather than sit looking at it.’ It was in a similar spirit that Lincoln in 1858 approached the apparently technical question of the opening of the territories to slavery, which he interpreted as another forward step by an aggressive slave power: ‘Either the opponents of slavery, will arrest the further spread of it, and place it where the public mind shall rest in the belief that it is in the course of ultimate extinction; or its
advocates will push it forward in all the States, old as well as new, North as well as South.’ If the Southern-dominated judiciary could confer upon slave-owners the right to take their property into the territories, then Lincoln predicted that ‘ere long’ there would be ‘another Supreme Court decision, declaring that the Constitution of the United States does not permit a state to exclude slavery from its limits.’40 It is not surprising that Southerners refused to accept Lincoln’s assurances on taking office in 1861 that he did not intend to interfere with their ‘peculiar institution.’ It was enough for them that his policies in the present would be guided by the aim of ‘ultimate extinction.’ It is not always appreciated that the domino theory can be applied in a constructive as well as a catastrophic context. An example can be found in the Duke of Newcastle’s memorandum on the Intercolonial Railway, the document that so randomly survived the fire that destroyed his home. A forward-looking Colonial Secretary, the duke sought to combat the reluctance of his cabinet colleagues to pledge British financial support to the construction of a railway between Halifax and Quebec in the here-and-now of 1862. Critics argued that the project was commercially useless and providing the money would merely foster political corruption in the Canadian present. In response, Newcastle embraced an upbeat version of the domino theory to invoke a more attractive Canadian future. ‘Canada cannot for ever – perhaps not for long – remain a British colony.’ Without the railway to the Maritimes ‘she cannot separate from Britain as an independent State’ and must inevitably collapse into ‘that powerful republic which will be formed in some shape or other on her southern frontier’ as a result of the American Civil War. The railway would lead to a union of the provinces as ‘a strong & self-reliant Colony ...
and when she separates from us she would be a powerful and independent Ally and a most valuable, I believe an essential, makeweight in the balance of power on the American continent.’ As if that were not enough, Newcastle took ‘a still deeper dive into futurity.’ Those who had not studied ‘the vast Continent between VanCouvers Island and Canada will hardly be prepared to believe how soon, if time is measured by the life of Nations, there will be a direct communication between the Atlantic and the Pacific.’ The duke’s strategy was to lift a controversial railway project out of its mildly sordid past present by identifying it as a first step
towards a glorious end. ‘I cannot imagine an object more clearly marked out for a British statesman to aim at than to secure the continued separation of Canada and the United States and the eventual foundation of a powerful State out of the disjointed and feeble British North American Provinces.’ Lord Palmerston’s cabinet decided by a single vote to cast the first domino in the pattern that would lead to a transcontinental Canada.41 Forward projection of the recent past may seem a sensible way of conceptualizing the future, so long as it is possible to identify the dominant contemporary trends and tendencies and stack the right combination of dominoes. ‘Now if we could only have history beforehand, it would be much more instructive,’ proclaimed a reviewer of 1857. It was futile to be wise after the event. All that was necessary was ‘“to produce” the lines already laid down, and you have the unexecuted part of the line; the prophecy being quite as near the truth as the retrospect.’ Ironically, anyone attempting that exercise in 1857 would have been ill equipped to cope with the decade that followed, in which Britain faced shock after shock: an uprising in India, a war scare with France, and the overshadowing threat of the American Civil War. For the English-speaking world, these international crises coincided with the intellectual challenges of Darwin’s theory of evolution and the arrival from Germany of the critical Bible study heralded by Essays and Reviews. In Britain, ten bewildering years culminated in the reemergence of demands for further political change, achieved in 1867 in a Reform Act that Lord Derby branded ‘a leap in the dark.’42 Nor could Confederation in Canada that same year be said to be the result of the forward projection of the ‘unexecuted lines’ obvious in 1857, although it did have a good deal to do with the optimistic agenda of still-unconstructed railways inherited from that year.
Moreover, even when forward projection of the current scene did produce reasonably accurate forecasts of the near future, it was a methodology that proved less and less reliable over the longer sweep of time, as an ever-widening range of variables had to be second-guessed. An engaging example of the shortcomings of a fantastical future can be found in the political tract published in 1947 by the South African historian Arthur Keppel-Jones. When Smuts Goes was an imaginary history of South Africa heading for disaster under Afrikaner

white-supremacist rule in the second half of the twentieth century. A sense of the bizarre enabled Keppel-Jones to predict with some accuracy the segregationist legislation that would be imposed in the early years of a Nationalist government. Nor was it difficult to foretell other medium-term developments, such as the political marginalization of English-speaking South Africans, white emigration, and black unrest. Yet the further Keppel-Jones moved into his fantasy future, the more it became clear that he took for granted the continuation of landmarks that hindsight rushes to identify as highly provisional. He located South Africa's great crisis in 1977 when the Afrikaner regime tried to stave off internal challenges by declaring war against Britain. In the climax to his narrative, the Royal Air Force bombed Pretoria from bases in Southern Rhodesia, which also provided the springboard from which tanks of the Coldstream Guards ripped into the Transvaal.43 The real-life world of 1977 was very different: in 1948, it was hard to foresee that Britain would abdicate its imperial role in tropical Africa within twenty years, and that its tanks and bombers would be impotent against a small but defiant community of Rhodesian settlers.

Actuarial and institutional futures are simple variations upon the forward projection of the recent past. What will happen when the king dies? or the president retires? or the election comes? Sometime around the year 1500, a spy in the English garrison at Calais reported a seditious conversation in which senior officers discussed the poor health of Henry VII, 'and of the world that should be after him' if the king did not recover. The plotters assumed that Henry's teenage son, Prince Arthur, would be passed over in favour of some powerful nobleman with a distant claim to the throne, such as the Duke of Buckingham or Edmund de la Pole.
Despite the dynastic uncertainty, those present seemed confident that they ‘should be sure to make their peace, howsoever the world turn.’44 By modern standards, Henry VII was still a young man when he died in 1509, but he had outlived the sickly Arthur. The throne passed to his second son, Henry VIII, a mere boy who had not figured in the Calais speculations at all. Henry VIII had de la Pole beheaded in 1513, and Buckingham followed him to the block in 1521. It was probably no coincidence that Buckingham’s luck ran out when Bluff King Harry was becoming increasingly troubled by his own failure to produce a male heir.

Politicians, like generals, need an element of luck in identifying who is destined to make an early exit from the scene. When Quebec politics were disturbed by episcopal intervention in secular affairs during the eighteen-seventies, John A. Macdonald as usual declined to join in an anticlerical alliance. The ultramontane campaign, he argued in 1875, 'depends on the life of two old men,' the increasingly isolated Pope Pius IX in Rome and the controversial Bishop Bourget of Montreal.45 It was a shrewd actuarial assessment: Bourget was forced to retire within a year, and Vatican policy changed when the pope died in 1878. By contrast, the two greatest miscalculations in the political career of the British radical-turned-imperialist Joseph Chamberlain involved underestimates of the durability of two very tough old men. In 1877, Chamberlain supported the sixty-eight-year-old Gladstone in his bid to recover the Liberal leadership from which he had recently officially retired. Gladstone's comeback seemed likely to be brief, but his triumph would be sufficient to weaken the aristocratic wing of the party, thus paving the way for Chamberlain and the radicals to take eventual control. Who could have foreseen that the Grand Old Man would remain at the helm for a further seventeen years, long enough to drive Chamberlain himself out of the party altogether? Almost two decades later, in 1895, Chamberlain became Colonial Secretary in the coalition that had been formed to block Gladstone, and repeated his miscalculation with the elderly Transvaal president Paul Kruger. Although now remembered largely for his part in bringing about the Boer War in 1899, Chamberlain in fact pursued a policy of patient pressure during his first three years in office.
Until March 1898, he had believed that 'time must be on our side,' that there had to be an 'improvement' in the British position 'when the present rule of President Kruger comes to an end, as it must do before many years are over.'46 Unfortunately, although this towering obstacle to a favourable South African settlement was well past his seventieth birthday, Kruger managed to get himself re-elected early in 1898, and showed no sign at all of wishing to meet his Maker. The collapse of Chamberlain's comforting actuarial future left a vacuum that could only be filled by clumsy confrontation.

Democratic societies institutionalize changeovers in power by setting regular terms of office. In the early days of the American republic, Alexander Hamilton made a double prediction: the time would come

Past Futures / 129 ‘when every vital question of state will be merged in the question, “Who will be the next President?”’47 The fixed four-year cycle of American politics is matched by a more flexible three-to-five-year electoral rhythm in Britain and Canada, but in each country the calendar of every politician is calibrated according to the likely date of the next election. Arguably, one of the most important landmarks in the decline of the British Liberal party was the general election of 1915. Of course, no psephologist will ever study the general election of 1915 – because it never took place. For Asquith’s government, the price of avoiding an appeal to the voters in wartime was a coalition with the opposition Conservatives, and from this shotgun alliance flowed many of the pressures that split the British Liberal party. Something similar happened in Canada, but there the phantom election of 1916 was merely postponed. Coalition, the price of avoiding an election in Britain, became the precondition for Canada to go to the polls a year later. Written constitutions design long-term structures in response to immediate needs, with the concomitant risk that they become inflexible in the face of changing circumstances. In the Federalist Papers, Alexander Hamilton argued that there should be ‘a capacity to provide for future contingencies as they may happen,’ although it was ‘vain to hope to guard against events too mighty for human foresight or precaution.’48 As an Australian federalist put it a century later, a constitution should be designed ‘on the same lines as the building of a bridge, allowing for its expansion in the hot weather, and contraction during the cold weather.’ Indeed, Australia’s constitution-makers even attempted to disprove the second half of Hamilton’s argument. In their first draft, written as early as 1891, they evidently sought to give the proposed federal government control over wireless communications. 
This was a remarkable exercise in futurology, since wireless had not yet been invented. The key to the mystery probably lies in the wide intellectual interests of the premier of Queensland, Samuel Griffith, who was briefed on the potential of Hertzian waves by the professor of physics at Sydney University. The drafting committee at the Federal Convention of 1891 had already decided that posts, telegraphs, and a new gadget called the telephone should all be controlled by the central government. It was a simple matter to expand the clause to include 'the transmission of information by any natural power.'49

Critics of Canadian Confederation in the eighteen-sixties noted that the blueprint lacked the Hamiltonian capacity to adjust to future contingencies.50 Products of a world in which colonies were ultimately regulated by British acts of parliament, and unduly hopeful of creating a system that would evolve by consensus, the Fathers of Confederation apparently gave no thought to the question of an amending formula. This omission, however, did not necessarily rule out changes in the internal balance of power. In December 1864, John A. Macdonald privately assured his doubting political ally, M.C. Cameron, that 'you, if spared the ordinary age of men, will see both the Local Parliaments & Governments absorbed in the General power.' The evidence requires interrogation. Macdonald, as always, was slanting his comments to their target. Cameron was not only a personal friend who might be trusted with a confidence but also a prominent Conservative critic of the Quebec scheme whom it was prudent to conciliate with such a reassurance. Moreover, Cameron was in his early forties, and so presumably some years would elapse before he reached 'the ordinary age of men,' by which time it was possible that neither he nor Macdonald would be active in public life. Nonetheless, Macdonald's request that his prophecy should not be made known in French Canada suggests that his expectation may have been genuine. Even so, it may have been no more than one of a range of possibilities in his mind.51 'Except Macdonald, I know of none of the Delegates who really think enough of the future that is before us,' wrote Alexander Galt from the London conference of 1866–7 that helped draw up the British North America Act.
Galt added that 'even he considers that our present immediate task is to complete the Union, leaving the rest to be solved by time.' Twenty years later, Macdonald confided to another close associate that he was troubled by the implications of a powerful Quebec, not because of its language or religion, but because its distinct legal code would block the integration of a transcontinental Canada. 'It is our duty as founders of a nation to look far into the future.' Macdonald was looking ahead, 'farther ahead perhaps than I should,' and he disliked what he saw, as his expected future of a Dominion composed of subservient provinces gradually came unstuck.52

Sometimes, a comfortable future collapses abruptly, creating a past present awed by the sudden prospect of terrifying discontinuity. 'The

lamps are going out all over Europe,' Sir Edward Grey remarked as he looked out of the windows of Whitehall into the darkening night on 3 August 1914; 'we shall not see them lit again in our life-time.'53 Like Kitchener, who began organizing for a war that would last for three years, Grey grasped that the war would not be over by Christmas, and that what Americans termed 'normalcy' was not going to return. Looking backwards, treating the twentieth century as an episodic soap opera, we can easily see how the carnage of the First World War tore apart the world that had gone before. It is only by projecting ourselves into the contemporary imagination that we can appreciate how the war destroyed so many of their familiar landmarks, thus shattering their assumption that life would continue to unfold in the future as it had always done in the past. This is the key to understanding the cross-party political combinations that emerged both at Westminster and in Ottawa. In Britain, the Liberal politician David Lloyd George had found no support for his bizarre schemes for emergency national governments in pre-war days; by December 1916, he was prime minister at the head of a win-the-war coalition. 'Lloyd George lived for the day and the hour,' according to his Canadian crony, Lord Beaverbrook. It was appropriate that he took supreme power at a terrifying moment in time when the future seemed to have disappeared altogether: we have to keep reminding ourselves that in December 1916 nobody knew when, let alone how, the war would end.54

Indeed, sometimes we can only fully appreciate the impact of crisis upon a society by realizing how great had been the optimism which had immediately preceded it. It is a central fact in Australian history that in the eighteen-nineties the continent passed through an interlocking inferno of challenges, economic, social, and even climatic.
Yet it is impossible to appreciate just how destabilizing was the experience without understanding that it came immediately after several decades of heady optimism, culminating in 1888 with the celebration of one hundred years of European settlement, a story of progress based upon the unlikely foundations of a convict prison. ‘Australia can march forward to the second anniversary [sic] without a cloud to dim the horizon of the future,’ a New South Wales newspaper had proclaimed in January 1888.55 A more pessimistic Australia might have been better equipped to respond as its banks began to collapse,

its rainfall dried up, and its social order was threatened by violent labour disputes.

Where an expected future is an article of national faith, its repeated failure to arrive can influence successive past presents in a more subtle way. In fact, dogmatic certainty about the future may actually contribute to the blocking of its own realization. One of the most fundamental facts in the history of Canada is that the country has not become part of the United States. Yet until well into the twentieth century, American national dogma asserted that it was inevitable that the Stars and Stripes would wave over the whole of North America, from the North Pole to the Caribbean, and beyond. So self-evident was this ultimate 'Manifest Destiny' in the eyes of Americans that, in two hundred years, they have made only two practical attempts to design machinery through which Canada might join the American Union. The first, a clause in the Articles of Confederation of 1777, was not carried over into the Constitution a decade later. The second took the form of a bill introduced into Congress in 1866 which quickly dropped out of sight in committee.56 At no time did the United States seriously consider modifying its own institutions to accommodate Canada's parliamentary system or recognize Quebec's distinct language. In short, Americans made virtually no practical attempts to pave the way for the incorporation of Canada. Was this because their own national mythology assured them that annexation was bound to take place some day? The corollary of this absence of practical interest on the American side was that no Canadian politician was ever tempted to engage in a sustained campaign for continental union. It seems that Manifest Destiny may be classified as a self-defeating future. The same mental straitjacket encased Ireland for at least half a century after its partition in 1920–2.
Nationalists believed so passionately in the ultimate reunion of Ireland that they persuaded themselves that wayward Ulster must inevitably accept the folly of its ways. Consequently, they saw no reason to reshape either society or institutions in the Catholic south of Ireland to make them more accommodating to the Protestant Irish of the North. In 1971 the powerful Archbishop of Dublin not only dismissed the argument that the legalization of contraception in the South would advance the cause of Irish unity, but referred knowingly to 'the indignant ridicule with which good Northern people would treat such an argument.' Why make concessions to achieve an outcome you believe will happen anyway? An extreme form of this mindset drove IRA terrorism. As a Sinn Fein apologist put it in 1992, republicans possessed 'an absolute certainty that they will win, even when objective reality tells them it's not going to be in their lifetime.' Thus, dogmatic belief in the inevitability of the unity of Ireland contributed year by year to entrenching its division into two states.57

The sudden acceleration of a distantly perceived future lies behind both the creation of the Dominion of Canada in 1867 and the collapse of the British empire after 1945. By the end of the eighteen-fifties, the union of British North America was vaguely seen as part of 'a matter in the distance.' Lifting its eyes briefly above the malice of daily journalism, the Toronto Globe in 1856 suggested that if the link with Britain were ever broken, 'it is not only probable, but morally certain, that instead of seeking annexation to the States ... a confederacy of all the Provinces would be cemented, and then a great northern power would be formed.'58 The Nova Scotian P.S. Hamilton put it more bluntly: the provinces had no history and only through union could they gain 'an assurance that the future would not present the same dreary void.'59 However, there was little pressure to bring that future closer. When the Canadian ministry raised the issue with the other colonies in 1858, it sought only to examine 'the principles on which a bond of a federal character ...
may perhaps hereafter be practicable.' 'In a dozen years,' remarked a New Brunswick newspaper in 1861, 'it will be time enough to think of federation.'60 When the Duke of Newcastle tried to prod colonial politicians into considering Confederation along with the Intercolonial Railway in 1862, they preferred to postpone the question to 'a more convenient season.'61 How then can we answer the 'why when?' question that put Confederation on to the agenda for immediate action in 1864? Evidently, the future had suddenly speeded up, leading another New Brunswick journalist to write: '[W]e have all at different times had our dreams of a future when the British possessions in America should become one great nation. For the first time we are being brought face to face with the reality.'62 As W.H. Russell pointed out, there were many possible destinies for British North America. The historian of past futures cannot simply thread together the prophecies that turned out to be true and ignore those which never came to fruition. As early as 1856, the Toronto Globe had been confident of 'the ultimate accomplishment of a railway from the head of Lake Superior to the Pacific,' and it is tempting to slot this far-sighted prophecy into the deep background that seems to colour the inevitability of the Confederation story. Yet in that same article, the Globe added that it did not 'presume to question the possibility of constructing a canal from the harbour of Toronto to the Georgian Bay.'63 (John A. Macdonald, by contrast, jocularly commented that the Georgian Bay canal would be 'made when our grandchildren are greybeards.')64

Why, then, did the Confederation future arrive so abruptly in 1864, when so many other mirages dissolved on the distant horizon? Chapter 4 argues that it was the ministerial crisis of June 1864 that forced politicians in the province of Canada to face the crucial first half of their own decision, the decision to decide. This, in turn, ensured that a vaguely foreseen future of British North American union made a sudden arrival on Maritime doorsteps, forcing sometime reluctant politicians in the smaller provinces to a here-and-now response. Only a few months previously, a Halifax newspaper had concluded, with apparent regret, that Confederation was 'no longer a project which men of the present generation can hope to see accomplished.'65 To maintain its fragile unity, Canada's Great Coalition adopted a dual policy: it would devote twelve months to the pursuit of a union of all the provinces. If unsuccessful in this larger aim, it would then turn to the creation of a smaller-scale, Ontario-Quebec federation. Hence, in 1864, Confederation was not an inevitable destiny but a highly transient opportunity. 'Everybody admits that Union must take place sometime,' John A. Macdonald told a Nova Scotian audience.
‘I say now is the time.’ The more Maritime resistance to Confederation is examined, the more it appears that doubts were directed not against the end but at the terms. As George Brown told that Halifax gathering in 1864, ‘at no time has any Provincial Statesman ever expressed a doubt that the fitting future of these Colonies was to be united under one Government.’66 The historian William M. Baker argues that if Confederation had been ‘postponed in the mid-1860s, it would have been postponed indefinitely.’67 While there is no way of proving Baker’s

assertion, it was certainly true in the case of Newfoundland, where each failed courtship strengthened the barriers against Canadian wooing. What is clear is that there were enough Maritime politicians who preferred not to take the risk of hoping for a better offer. In 1867, it seemed better for Nova Scotians and New Brunswickers to swallow their reservations about the immediate conditions of union in order to seize the chance of securing the long-term end. Identifying the union of the provinces as an eventual target, Sir Edmund Head in 1851 thought it immaterial 'whether that future be arrived at gradually or suddenly.'68 In the event, the disorienting suddenness with which Confederation arrived as a practical proposition left a legacy of resentment among many Maritimers.

In the case of Canadian Confederation, acceleration of a widely accepted but previously distant future resulted in the creation of a new and positive structure. For the post-war British empire, the sudden approach of a remote future brought about a downward slide masked by talk of 'creative abdication.' Once the process began, it could only gather speed. In 1948, British policy was based on the assumption that it would take twenty-five years for Malaya to achieve local self-government. Ten years later, Malaya was fully independent. Cyprus made the journey from 'never' to full independence in just five years. 'Why am I here?' asked a senior colonial administrator in the Gold Coast in 1953: 'to preside over my own liquidation.'69 One casualty of rapid decolonization was the philosophy of 'partnership' in the Central African Federation. In the medium term, 'partnership' looked remarkably like white supremacy, since it justified the disproportionate allocation of resources to the settler community in order to create a skilled (or 'civilised') nucleus which could lead the African majority towards higher standards.
The Federation’s white prime minister announced in 1956 that ‘in the distant future Africans may earn the right to become equal partners,’ but even this misty prospect meant only ‘that they could have a half share in the partnership but never more than that.’70 His chief lieutenant expected this process to take up to two hundred years, while the leader of the Southern Rhodesian settlers accompanied the opening of civil-service posts to all races in 1959 with the hope that it would take twenty years before an African might rise to the rank of deputy minister.71 Thus, as an exercise

in political futurology, 'partnership' in the Rhodesias had something of the air of mañana or the Greek Kalends, a destiny to be postponed indefinitely. It is possible, just possible, that white attitudes might have modified had the Federation been given a couple of decades during which the settler minority might have adjusted to the emergence of a qualified African professional class. In the event, the ostensible aim of partnership was vitiated by the indefinitely extendable temporal concept of the 'foreseeable future,' during which white settlers would retain control. What is certain is that time ran out fast: within ten years of its creation, the Federation collapsed. In 1960, partnership was condemned as a sham by a British royal commission (whose membership included the unlikely figure of the Canadian historian Donald Creighton, biographer of Canada's great nation-builder, John A. Macdonald). More to the point, Africans wanted political and social advance in the here and now.72

The failure of the Rhodesian federation might suggest that the best policy for the control of the present is to seek an immediate realization of the hoped-for long-term future. As D'Arcy McGee put it, quoting lines from Macbeth in an interjection into Christopher Dunkin's dissection of the Confederation scheme, 'if 'twere done, 'twere well 'twere done quickly.' (Dunkin shot back that the lines were ill chosen as they referred to a murder plot.) However, this strategy came unstuck when it was attempted a decade later by Lord Carnarvon, the Colonial Secretary who had steered Canadian Confederation through Westminster in 1867. When he returned to office in 1874, Carnarvon was determined to hurry on the same solution for South Africa.
Colonial federation seemed to offer a framework that would provide for both the security of imperial strategic interests and the establishment of a settler government strong and confident enough to act humanely towards Africans. The South African historian C.F. Goodfellow argued that Carnarvon was ‘unique’ among contemporary British policy-makers in his intention ‘to increase not the actually existing but the future security of Britain and her Empire ... Carnarvon took one of the major intentions, so to speak, and translated it into the future tense.’73 The theory seems praiseworthy: of course politicians should look to the long term. Sad to relate, the outcome was disastrous. As necessary preconditions to Confederation, the independent Boer republic in the Transvaal was

absorbed into the Empire, while imperial control was imposed upon the powerful Zulu by a bloody war that devastated a proud people at the cost of a mauling to British prestige. Freed from fear of Zulu aggression, the resentful Transvaalers promptly drove out the British and left the confederation policy in ruins. Perhaps Goodfellow's insight should be turned inside out: Carnarvon had taken a future conditional and imposed it as a present imperative. It was an example of what the Viceroy of India, Lord Elgin, in 1897 called the 'danger in looking too far ahead and not observing the rocks that may be under the bow.'74

Elgin's comment is a reminder that futures may be divided into the short and the long term. It is often the case that those who think in the longer term will out-manoeuvre those who have no defined idea of the future at all. Britain's first prime minister, Sir Robert Walpole, believed that this was the secret of his manipulation of George II and Queen Caroline: '[S]hould I tell either the King or the Queen what I propose to bring them to six months hence, I could never succeed. Step by step I can carry them perhaps the road I wish; but if I ever show them at a distance to what end that road leads, they stop short, and all my designs are defeated.'75 There was a similar dynamic in the political conflict between John A. Macdonald, the nation-builder, and George Brown, a newspaper proprietor all-too-inclined to take the view that sufficient to the day were the editorial denunciations thereof. 'The great reason why I have always been able to beat Brown,' Macdonald wrote in 1872, 'is that I have been able to look a little ahead, while he could on no occasion forego the temptation of a temporary triumph.'76 Essential to the success of such a strategy was some degree of congruence between short-term and long-term aims.
Where these were in conflict, the past present could prove fraught, as Sir Charles Bagot found when he became governor general of Canada in 1842. Imperial policy required him to work towards two targets, to govern the province in harmony with its elected assembly while at the same time marginalizing its French Canadian population. The problem was that the two objectives were incompatible, since in cultural self-defence the French formed the largest and most solid bloc in the assembly. Before long, Bagot found that he had no alternative to the admission of

LaFontaine and his allies to a share of office, a startling repudiation of the grand imperial rhetoric of anglicization. Bagot found it 'infinitely perplexing' to be caught in this conflict of futures: 'Are we to bide our time, and wait till immigration hems in and overwhelms French population, and French Power? This must happen some day or other – but in the meanwhile I may lose my majority in the Legislature.'77 The British soon solved their dilemma in Canada by abandoning anglicization and accepting that the French would share political power. Such an option did not present itself so readily to British policy-makers elsewhere in the Empire. Gladstone concluded that the fundamental challenge in governing India lay not simply in determining who would eventually rule, but in deciding how to manage the transfer of authority: 'if it lies for the present on the one side but for the future on the other, a problem has to be solved as to preparation for that future.'78 The British never did find a wholly satisfactory solution to the problem of managing the transition of power in India.

Inspirational far futures have sometimes been invoked to add lustre to humdrum past presents. The most comic manifestations take the form of huge exaggerations of predicted population growth. When the mid-nineteenth-century New Zealand settlement of Otago had struggled to a total of 65,000 settlers, one agitator claimed 'that several millions of industrious people might find the means of comfort and independence within our borders.' In 1925, a socialist visionary talked of a total population for New Zealand of forty millions, more than ten times its modern total. The leader of the 1916 uprising in Dublin, Padraig Pearse, announced that Ireland would contain at least twenty million people within a century of independence, a fourfold increase. Remarkably, he expected that this could be achieved without urbanization.
In 1888, Henry Parkes celebrated the centennial of white settlement in New South Wales by predicting that by 1988 Australians would number fifty million people. Parkes did his bit by fathering the last of his many children when he was in his eighties, but even so his estimate proved to be a three-fold exaggeration.79 The same statistic dazzled John Diefenbaker. ‘We’ll build a nation of fifty million people within the lifetime of many of you here,’ he told a cheering audience at Grand Falls, Newfoundland, in 1958, as he invited them to ‘catch a Vision of the greatness and the potential of this nation.’80 The small

print was important. How long is a lifetime? How many is 'many'? Even if we forward-project the recent buoyant growth rate in Canada's population, the youngest Newfoundlander in Diefenbaker's audience will be over ninety years of age before the prediction is likely to be realized. In fact, precisely because inspirational futures operate in the long term, it is rare for them ever to rebound: few people survive long enough to be disillusioned. One exception was Alfred Tennyson, who was in his middle twenties when he 'dipt into the future, far as human eye could see / Saw the Vision of the world, and all the wonder that would be.' Half a century later, Tennyson retreated from the optimism of Locksley Hall, his hopes for 'the Parliament of Man, the Federation of the world' finally buried under the first Irish Home Rule crisis: 'Gone the cry of "Forward, Forward!" lost within a growing gloom / Lost, or only heard in silence from the silence of the tomb.'81

The inspirational future is especially appropriate as a device to dignify and provide stiffening of purpose in a moment of present crisis. 'Let us therefore brace ourselves to our duty,' Churchill urged the people of Britain in 1940, 'and so bear ourselves that if the British Empire and its Commonwealth lasts for a thousand years men will still say, "This was their finest hour."' The appeal was much more than patriotic verbiage. Churchill had an immediate political motive for insisting that the British people think 'of the future and not of the past.' The price of becoming prime minister had been the maintenance in office of many of the old-guard politicians whom he had previously denounced. To distract attention from the petty personalia of high politics, he warned that 'if we open a quarrel between the past and the present, we shall find that we have lost the future.'82 The inspirational future can be used to clothe defeat as well.
In 1902, Smuts urged the routed Afrikaner nation to listen to pleas for peace that came ‘from the womb of the future.’ The Emperor Hirohito similarly spoke of building ‘a grand peace for all the generations to come’ when he called upon the people of Japan to surrender in 1945.83 However, the inspirational future is generally less effective when it slides into the inverse form that threatens catastrophe. Canada’s constitutional debates have shown this to be a high-risk strategy: ‘the sun was never supposed to rise again once the Meech Lake Accord had failed; hell was going to freeze over when Canadians voted to reject the Charlottetown Accord two years after that.’84 Locate the catastrophe too far ahead, and it may fail to arouse much concern. Threaten disaster in the near future, and the prophets of doom risk looking very foolish if it does not materialize. Long-term futures loomed large in the early years of colonies of settlement, where there was little or no recent past for immediate forward projection. This contrasted with the mindset of an old, established country such as Britain, where continuity could be taken for granted, as an early governor of New Zealand, George Grey, was reported to have commented on his return home in 1854: ‘[H]e was much struck in coming to England with the way in which we lived for the present. In the colony whatever you do or plan is calculated with a view to what it will or ought to be in twenty or fifty or a hundred years hence. Here nobody looks a year before them.’85 Grey’s attitude was not universally shared: New Zealanders, remarked a journalist in 1870, ‘are busy about their own affairs, and are perfectly willing to let time and circumstance shape the future.’86 Indeed, Grey’s obsession with the future sometimes made him too impatient to dwell in the present. Finding conditions in New Zealand less rosy than he had been led to expect, his successor diplomatically commented that Grey had reported ‘in too favourable a light, and often spoke of a future rather than an actual state.’87 The Victorian intellectual James Bryce thought the ‘chief charm’ of his journey through South Africa in 1897 was ‘the curiosity which the thought of its future inspires,’ although he had to confess that he felt ‘the gravest anxiety’ when he looked ‘sixty or eighty years forward.’88 Many white South Africans also found the future much too alarming to contemplate.
There was ‘an almost total absence of clear thinking’ about the development of the African majority, the governor of the Cape, Lord Selborne, noted in 1906. Few asked themselves whether they were ‘prepared to lay down a hard and fast line and say that no coloured man or Native is ever to be allowed to rise above a certain line in the scale of civilization, no matter what natural gifts may have been given him.’ Selborne believed that if they were forced to confront the question in that form, most settlers would ‘recoil in horror’ at the implications of slamming the door on non-white progress, and would face the fact that the real issue was how best to guide African advancement.89 Selborne’s opinion was shared by a small cohort of white liberals. J.X. Merriman criticized the proposal to exclude Africans from the franchise: ‘What promise of permanence does this plan give? What hope for the future does it hold out?’90 W.P. Schreiner dismissed the notion of ‘building anything permanent in South Africa by the mere goodwill of ourselves, the European descended population.’ A shrewd paternalist, he argued that encouragement of the African majority was the best way of ‘securing in the future our continued healthy dominance.’91 By contrast, Leopold Amery, a British enthusiast for Empire, hoped in 1900 to conjure away the immediate challenge by indulging in a comforting fantasy about the far future. There was no reason why ‘someday’ a distinguished African should not be a leading South African statesman respected and all that, but not invited to meet white ladies and still less to marry them ... In five hundred years’ time I expect the South African white man will contain a strong dark blend, and the end of all things may be a brown South African race ... That doesn’t matter. What does matter is that there should not be too quick a mixture now or for the next few centuries.92

Even five hundred years was too soon for Smuts. In 1906 he confessed that he found himself looking into ‘shadows and darkness’ when he contemplated ‘the political future’ of the country’s African majority. His response was to ‘shift the intolerable burden ... to the ampler shoulders and stronger brains of the future.’93 Smuts was a great Commonwealth statesman, but the major failure in a great career was his utter inability to chart any path of development towards a multiracial South Africa. By default, white leadership passed to apartheid ideologues, such as Hendrik Verwoerd, who inverted the pattern of denial. Where Smuts had despairingly abdicated responsibility for solving current problems to generations yet unborn, Verwoerd was determined to imprison the distant future within a matrix determined by the present – his present. In 1959, he brushed aside those who favoured ‘gradual concessions’ to the African majority. It might be true that relaxation of white supremacy would make it ‘easy during the course of the next ten or fifteen years ... to continue living as always in the past, making money and being prosperous and avoiding unrest. But
what then? Are not the children who come after us worth more than ourselves? The question we must ask is, what will happen to South Africa afterwards?’94 Each of these mental devices was designed to avoid the open acceptance of the basic fact so bleakly stated by Anthony Trollope after a visit in 1878. ‘South Africa is a country of black men, and not of white men,’ he wrote. ‘It has been so; and it will continue to be so.’95 By attempting to deny that future, white South Africans eventually lost their ability to control the present. It was the Zulu leader M.G. Buthelezi who defined an ‘important difference between a black and white perspective of South African history.’ The white perspective, he argued, ‘looks at yesterday as it leads to today,’ but ‘we see yesterday and today leading to tomorrow.’96 ‘There is always something better to be done, something greater to be attained,’ proclaimed George Cartier in 1864.97 Canada’s history has been overshadowed by Canada’s futures, in a way that has been sometimes inspiring, but perhaps more often intimidating. As the year 2000 approached, some Canadians were uneasily aware that they had failed to live up to Laurier’s prediction in 1900 that they were embarking upon ‘Canada’s century.’98 In the nineteen-fifties, Bruce Hutchison had caught the mood of the time with a best-seller called Canada: Tomorrow’s Giant – but did his title imply that the country was confined in a pigmy present?99 The dominance of an unknown destiny has too often made successive Canadian presents appear inadequate, conveying a sense of bit-part players incapable of acting out a drama worthy of their gigantic stage.
If we could adjust our thinking to see the story of Canada not as a cartoon strip of humdrum past presents but rather as a continuous engagement with alternately dazzling and intimidating past futures, we might more successfully combat the entrenched cynicism that so unfairly condemns Canadian history as merely dull. The optimistic notion of a glorious future for Canada formed part of the context for Confederation in the eighteen-sixties. Textbooks are misleading when they portray westward expansion as part of a formal explanatory package, an element in a ‘deal’ struck within the political elite. It was rather that the West was seen as something out there for the taking, ‘a heritage of priceless value,’ held ‘in trust for the great nation that must yet sit enthroned on the Lakes and the St. Lawrence,
and rule from Labrador to [British] Columbia.’100 Yet the extent of that greatness remained shrouded. In 1872, Lord Dufferin ‘doubted whether the inhabitants of the Dominion themselves are as yet fully awake to the magnificent destiny in store for them.’ Dufferin’s claim to solve the mystery was tenuous, since he was addressing an audience in Belfast to celebrate his recent appointment as governor general of a country that he had not yet visited. None the less, if deficient in practical observation, Dufferin was never short of hyperbole. ‘Like a virgin goddess in a primaeval world,’ he told his Ulster audience, ‘Canada still walks in unconscious beauty among her golden woods ... catching but broken glances of her radiant majesty ... and scarcely recks as yet of the glories awaiting her.’101 Unfortunately, there existed another version of the Canadian future that did not expect the ‘virgin goddess’ to develop into what Dufferin confusingly called a ‘young and virile nationality.’ This was the half-expectation, half-nightmare that the scattered provinces would subside into the strangling grip of the United States. In 1824, the Nova Scotian T.C. Haliburton described annexation as ‘an event (if a politician I could calculate its approach with as much exactness as an astronomer fixes the period of an eclipse) which we all know must happen.’102 Overt and organized annexationist sentiment probably peaked in 1849, but Goldwin Smith revived the prediction in 1877: ‘Canadian nationality being a lost cause, the ultimate union of Canada with the United States appears now to be morally certain.’ That fine Gladstonian adverb, ‘morally,’ signalled both Smith’s claim to omniscience and his refusal to commit himself to a date. The following year, Goldwin Smith crash-changed into reverse gear and backed John A. Macdonald’s plans to create an east-west Canadian economy surrounded by a protective tariff.
In 1891, in his sour but celebrated Canada and the Canadian Question, he finally declared himself a continentalist.103 Ambiguity towards political destiny has formed only one aspect of Canada’s uncertain engagement with its futures. Language lies at the heart of another. The second British governor of Quebec, Governor Carleton, predicted in 1767 that, ‘barring a Catastrophe too shocking to think of, this Country must, to the end of Time, be peopled by the Canadian Race, who have already taken so firm a Root, and got to so great a Height, that any new Stock transplanted will be totally hid ...
except in the Towns of Quebec and Montreal.’ Carleton was convinced that the lower St Lawrence valley was going to stay French for ever: ‘there is not the least Probability, this present Superiority should ever diminish, on the Contrary ’tis more than probable it will increase and strengthen daily.’104 Eighty years later, the linguistic future appeared very different to the Earl of Durham. Arrogantly misguided Durham may have been, but his Report did at least represent an attempt to read the future from the standpoint of 1839: ‘[B]efore deciding which of the two races is now to be placed in the ascendant, it is but prudent to inquire which of them must ultimately prevail ... [w]e must not look to the present alone.’ The provinces ‘must ere long, be filled with an English population,’ leaving the French ‘isolated in the midst of an Anglo-Saxon world.’ Even a successful national revolt would give French Canadians no more than ‘a wretched semblance of feeble independence’ which would soon be engulfed by ‘the intrusion of the surrounding population.’ Durham owes his enduring position in Quebec demonology not so much to his undoubted contempt for the French language as to his cold-blooded certainty that its demise was inevitable. ‘It is but a question of time and mode.’105 The retreat from forced anglicization was rapid. By 1848, Lord Elgin was ‘deeply convinced of the impolicy of ... attempts to denationalize the French.’ Even if the strategy could be made to work, it would not convert French Canadians into cricket-playing Anglicans. ‘You may perhaps americanise, but ... you will never anglicise the French inhabitants of the Province,’ warned this far-sighted governor general.
It was better to leave them as they were: ‘who will venture to say that the last hand which waves the British flag on American ground may not be that of a French Canadian?’106 It was by grasping (and, indeed, manipulating) this ineluctable truth that John A. Macdonald made himself Canada’s key political figure. ‘No man in his senses can suppose that this country can for a century to come be governed by a totally unfrenchified Government,’ he told an Anglo-Montreal supporter in 1856. In fact, if they were to decline, ‘the French would give more trouble than they are said now to do,’ simply because they would unite against the threat of cultural extinction. ‘I doubt much however if the French will lose their numerical majority in Lower Canada in a hurry ... [T]hey will hold their own for many a day yet.’107
With hindsight, subsequent anglophone attacks on French language rights look like outbursts of Anglo-Saxon intolerance, bullying attacks upon already weak minorities battling for simple cultural survival. Distasteful they may seem, not least because linguistic supremacism was coupled with sectarianism, but we should remember the rebuke of Ron Norris: it is not for us to judge the past in the light of our retrospective knowledge that the French fact would continue to be threatened in Canada. It did not necessarily seem so a century ago. ‘The notion ... that the French element is dying out, is the very reverse of the fact,’ Goldwin Smith announced in 1886. ‘It is difficult indeed, if Canada remains separate from the United States, to see what the limits of French extension will be.’108 Robert Sellar issued a similar warning in 1916 against the French in Ontario. ‘Who dare predict a limit to the increase in numbers or extension of this priest-managed colony?’ Collective anglophone guilt may hasten to discount Smith and Sellar as notorious extremists, but the demon of expanding clerical power also alarmed J.W. Dafoe, usually regarded as one of the sounder and more progressive commentators of his era.109 Some in Quebec, notably the American-born J.-P. Tardivel, eagerly looked forward to a French-language state that would absorb parts of New England and Ontario.110 Some of the ferocity behind the conscription crisis of 1917 may be traced to fears, voiced for instance by the leader of Canada’s Methodists, that English-speaking Protestants were being killed while French-speaking Catholics were left to breed unchecked. As late as 1937, the historian Arthur Lower argued against Canadian involvement in another European war for much the same reason.111 Such sentiments may strike posterity as ignoble and illiberal. Even so, they should still be taken seriously as influential perceptions of the future, even if imaginings of a future that never came about.
Although a paranoid few mistrusted the ultimate aims behind Canada’s policy of official bilingualism, by the nineteen-seventies demographic trends seemed to support Durham’s analysis rather than Sellar’s. Outside Quebec, the French language seemed to be facing extinction. Within the province, the forward projections made by the demographer Jacques Henripin sharpened a similar sense of threat. By the year 2001, the francophone proportion of the population of Quebec as a whole seemed set to fall perhaps as low as 71 per cent. Governor
Carleton had been wrong in his expectation that ‘new stock’ would secure a long-term position in Quebec City, but his prediction for Montreal was proving all too accurate. The provincial metropolis was on track to becoming a city two-fifths of whose people came from non-French backgrounds.112 The scene was set for Bill 101 of 1977 and Bill 178 of 1988. Most people in Quebec saw language legislation as a defensive strategy, while elsewhere in a reluctantly bilingual Canada many regarded it as hypocritical and aggressive. Some critics felt that Quebec was fighting not simply the English language but fate itself. In 1979, the historian Donald Creighton predicted that French would be extinct in Canada within fifty years.113 At issue between Creighton and Henripin was a disagreement over the optimum target date for current policy. Did it make more sense to focus on 2001 and aim at shoring up French in the medium term, or to look ahead towards the perceived inevitability of 2029 and smooth its dying pillow? By the close of the twentieth century, most English-speaking Canadians had come to subscribe to the Governor Carleton–John A. Macdonald view that the French language is likely to dominate Quebec as far ahead as anyone can foresee. Yet among sovereigntist intellectuals, the continued confrontation with the future is more subtle. As Pierre Fournier explained in 1991, it is misleading to think of francophone Quebec as divided into two opposed camps committed to rival visions of its future – one confident and business-oriented, the other culturally insecure and fearful for the future of the language.
Rather – so Fournier argues – these attitudes coexist in most individual Quebecers, and it is the interrelationship between the two that produces apparently paradoxical language policies.114 ‘Demain nous appartient’ (‘tomorrow belongs to us’), sang the separatists as they cruised to power in the traumatic Quebec provincial election of November 1976, but tomorrow would only be theirs if they took draconian action to control today. North American free trade in particular provides Quebec with both the opportunity to benefit from a continental market and the threat that linguistic particularism will be attacked as a form of non-tariff barrier. In this tortured dichotomy, Quebecers at least have the advantage over their fellow Canadians of being able to define the challenge that they face. The architect of the policy, Donald Macdonald, described free trade as ‘a leap of faith’ for Canada.115 A leap of faith does not merely
imply confidence in the future. It assumes an awareness of the past as well. Far too often, historical analysis begins by pressing the pause button on a videotape of the past. The historian traps the characters to be studied in the awkward attitudes of the moment, and proceeds to explain how they had come to relate one to another in that frozen past present. We need rather to ask which of the characters captured on the screen contemplated some form of future, in the hope of recovering how they shaped their actions in the moment of their past present to meet those futures. Were their perceived futures short-term or long-term in scope, optimistic or pessimistic in character? Were they adjusting to the collapse of previous assumptions about the road ahead or coping with the sudden arrival of a destiny that had previously been perceived only as a distant possibility? By no means all of our characters operated within a constructed vision of past futures. Some were philosophically content to jump off the Pearsonian bridge. Some gambled upon Churchillian uncertainty. A few took refuge in fantasy. We cannot fully comprehend any past present until we begin to distinguish between those who thought they owned a map of the future and those who shrugged it off to mañana. Such an approach means that historians must place less emphasis on explaining static moments in the past, an exercise that is in any case impossible, and more upon locating those events in a longer sweep of time. In other words, we must approach the study of history by seeing past, present, and future as a single continuum. Only thus can we enter into the world view of the people we study, to recapture how they reached their yes/no decisions in response to the propositions that confronted them at that ephemeral instant when they too stood on the moving frontier between their own past and a future that they could rarely foresee but never ignore.
If we can apply this type of analysis to the past, we may perhaps begin to locate our own fleeting present within the sweep of time. To accept that the past can only be understood in relation to its own lost futures is one aspect of opening up a new dimension for history and historians. Its essential counterpart is a re-examination of our own relationship with the past. If the future was often distant, does it necessarily follow that the past is always remote?
Chapter 6 explores the nature of our temporal relationship with the past, and suggests that the subjective dimension of length of time intrudes more often than we care to admit upon the ethical basis of historical assessments.

6 A Long Time in History


The twentieth century still had a third of its course to run when the American writer Alvin Toffler coined the term ‘future shock’ to describe the constantly accelerating impact of the unknown. Writing before the full impact of the computer had made itself felt, Toffler predicted that in the last decades of the twentieth century, ‘millions of ordinary, psychologically normal people will face an abrupt collision with the future.’1 So it proved to be. Furthermore, it is abundantly clear that change is continuing with rapid, multiple, and bewildering force into the new century. Indeed, change is so unstoppable on all fronts that in many aspects of modern life it is difficult to predict with confidence any form of medium-term future. Few existing values and even fewer inherited institutions can be projected forward with any certainty. In contrast with earlier periods, when most invocations of the future took for granted substantial elements of continuity, even so apparently fundamental an institution as life-long and heterosexual marriage is ceasing to be the social norm in the Western world. Yet in personal and actuarial futures, people in advanced societies are living longer and facing the need to provide for their old age. The same constant technological change creates longer life expectancy against a background of ever less predictable futures. Young people must buy education to train them for careers which technology may render irrelevant. The middle-aged must invest their savings in economic activities that cannot be guaranteed to return the profits needed to support them in retirement. Meanwhile, in the Third World, life is nasty, brutish, and focused entirely upon survival in the present. If the mission of history is to locate events in a continuum of past, present, and future, it might seem that there can be no role for historians in a world whose future is as shockingly obscure as its past is so often frustratingly irrecoverable.
Perhaps we should devote our efforts wholly to heritage tourism so that those who are frightened by the alarming conjunction of today and tomorrow may take refuge in the imagined womb of yesterday. Such a defeatist strategy recalls one of St John Gogarty’s Dublin characters, ‘whose memory goes back so far that he has forgotten his survival in the present.’ It would condemn the historian, in the words of Hayden White, to become ‘a kind of cultural necrophile.’2 If historians are to help our own societies to cope with the overwhelming onrush of multiple futures, we must begin by posing
questions designed to locate the fleeting, frightening present in relation to the past. The more uncertain the relationship between the transient present and the awesome future, the more vital it becomes to anchor our own world to the framework of what has gone before. If we are to begin to identify which of our institutions and ideas have the capacity and value to survive, we must equip ourselves with a firmer idea of the relationship between present and past. Have we arrived at some definite and logical point in our trajectory through time, or are we still travelling? Are we entitled to assume that our advanced Western world represents some form of universal and eternal normality that may be regarded as a logical product of our past, and thereby sufficiently durable and virtuous to survive into the future? To pursue such an enquiry, we must settle two qualitative issues about our relationship with the past that are latent but imprecise in much historical writing. This chapter first asks whether we are close to the past or distant from it: what is a ‘long time’ in history? The second question follows from the first and raises an even wider issue: to what extent can we conduct a dialogue with that past without entering into the realm of ethical values? Jointly, these two discussions pave the way for the core question of chapter 7: what is a ‘significant’ fact or event in history? There are neither easy nor obvious answers to these questions, but if they can be taken aboard as a legitimate part of the agenda of historical discussion, then history itself may resume its rightful place at the centre of intellectual debate on the world around us. The difficulties of adjusting to sudden seismic change are compounded by the tacit assumption that the section of the world in which we happen to live represents civilized, decent, and hence eternal, normality.
Western societies have become trapped by subscribing simultaneously to two beliefs that are in fundamental conflict. We implicitly assume that our world represents normality, that our values are inherently right, and by implication constitute the standard against which the activities of other eras should be judged. In effect, we have arrived at the culmination of a long march through human history. We do not simply believe in progress; we have arrived. At the same time, we recognize that ever more rapid and bewildering change makes it impossible to envisage the future simply as a comfortable forward projection of the recent past. We do not find it pleasant to face the unpalatable fact that the second assumption entirely undermines the first. It is not that the story of human history has come to an end, but rather that the plot line taking us into the next episode has utterly disappeared. A century ago railways and telegraphs seemed to present almost as shocking a challenge as computers and satellites today, but there were still those who clung to a cosmic framework within which change might be located. A future British prime minister warned in 1864 against ‘the arrogant idea that ... our own generation, our corner of the earth, is the culmination of the Creator’s work.’ The twentieth century only made the Western corner of the planet feel itself all the more superior. Many have interpreted George Orwell’s novel Nineteen Eighty-four as a warning against a totalitarian future, even though the onward march of the calendar has long since dated its fictional location in time. In fact, Orwell was also warning against dangers already threatening the world in which he lived: Nineteen Eighty-four was a play upon the year in which it was written, 1948. In that year, Orwell protested against the abandonment of the past. ‘One ought apparently to live in a continuous present, a minute-to-minute cancellation of memory, and if one thinks of the past at all it should merely be to thank God that we are so much better than we used to be.’3 Indeed, one of the few themes common to the dominant ideologies of the twentieth century, whether democracy, fascism, or Communism, has been an emphasis upon untrammelled human control of the world around us.
While a defensive return to religious fundamentalism offers the blindest of alleys, it can hardly be denied that our enlightened belief in unfettered human autonomy is particularly unhelpful in an era when even the most advanced of societies cannot regulate their own destinies. As a result, we find ourselves rudderless in a series of transient and mystifying presents, unable to comprehend catastrophes and changes that hit us without warning. Of all the features of the second half of the twentieth century, the one that seemed most enduring was the confrontation of the Cold War. When John Darwin closed his magisterial study of British decolonization in 1988 with the thought that in ‘the Soviet empire perhaps a hundred nations wait to be born,’ he seemed speculative and impractical.4 Yet suddenly the Soviet Union crumbled.
Most commentators concluded that it was the motive force of Communism that had failed; one even proclaimed that humanity had arrived at the end of its journey with the triumph of democracy and free enterprise. Humility requires us to take account of the possibility that, one hundred years hence, historians may incline to the view that every twentieth-century megastate was heading for collapse under the weight of its own complexity, and that the Muscovite empire was the first to disintegrate simply because it was the least efficient. In other words, our only chance of coping with the onrushing future lies in a humbling recognition that we are in fact still travelling. Can historians help, if not to identify specific elements in our world that may prove to be provisional, at least to prepare advanced societies more generally for the unpredictability of change? In 1995 a British journalist suggested to Janet Bloomfield, chairperson of the Campaign for Nuclear Disarmament, that the fifty years that atomic weapons had existed without a nuclear war was ‘long enough to prove both that the nuclear deterrent works and that it is capable of being managed safely.’ Bloomfield replied with the tale of a man who threw himself from the top of a very tall building. Half way down, he was heard to shout, ‘So far, so good.’5 Merely because we are here today, are we entitled to assume that we have arrived at some logical end point in a historical journey? In the nineteen-fifties, the province of Quebec was generally regarded as conservative, priest-ridden, and mildly corrupt. This was no surprise: Quebec under Duplessis was the logical product of its history, exactly what could be expected of an embattled fragment of seventeenth-century France that had succumbed to conquest and resisted modernization. Yet by 1970, that same Quebec society had been turned inside out by the Quiet Revolution.
Women exercised rights rather than bore children; the education system was redesigned to create wealth rather than save souls. Presumably Quebec was still the product of the same cumulative history that had apparently imprisoned it a dozen years earlier. In retrospect, it appeared that the Quebec of the nineteen-fifties had not reached the necessary culmination of its history. However fixated they seemed to be with their rearview mirrors, Quebecers had still been travelling. The often unspoken assumption of our own normality not only causes us problems in responding to the sudden irruption of change,
but also makes it puzzling to respond to those situations in the present where Western values do not hold sway. ‘We in Europe, accustomed to see the map of Europe divided into countries each of which is assigned to a particular nationality ... fall into a profound misconception. We assume that wherever, inside or outside of Europe, there is a country which has a name, there must be a nationality answering to it.’ It is startling to realize that these remarks, by J.R. Seeley, were published as far back as 1881. As Seeley added, the misconception is compounded by the fact that ‘we take no pains to conceive clearly ... what we call a nationality.’6 In the century that has followed we have not only made little progress in defining nationality, but we have imposed a further requirement, that all nations should be democracies. The reorganization of eastern Europe after the First World War was a landmark in this process: the impish Professor Joad, philosopher and universal pundit, once remarked that a policy designed to make the world safe for democracy had simply made it hard for stamp collectors.7 Unfortunately, the fervour of our secular religion is matched by the vagueness of its theological content. We assume that democracy can simultaneously ensure the rule of the majority and the protection of minority rights. Consequently, we regard the horrors that erupt in countries such as Burundi or Bosnia as aberrations because we cannot bring ourselves to accept the possibility that they conform more closely to the overall patterns of human history than do our own benign, confused, and possibly transient Western values.
It was the Californian theorist Francis Fukuyama who announced that the nineteen-eighties marked the 'end of history.' Some historians in Britain perhaps found it difficult to treat this concept as seriously as it merited, since they had been raised on Sellar and Yeatman's bitterly comic history of England, 1066 and All That. Published in 1930, it had concluded (in a chapter mock-moralistically entitled 'A Bad Thing') that the replacement of Great Britain by the United States as 'top nation' had brought History to a full stop. (In this, they echoed Seeley's earlier view that Dutch and Swedish history could be regarded 'as in a manner wound up,' since these countries had ceased to be major powers.)8 Fukuyama, it seemed, was similarly arguing that the collapse of Soviet Communism showed that capitalist democracy was clearly top doctrine (although this, it seemed, was 'A Good Thing'), and accordingly History was at an end. As he explained, Fukuyama did not mean that 'important events would no longer happen,' but that 'there would be no further progress in the development of underlying principles and institutions, because all of the really big questions had been settled.'

Three main weaknesses may be suggested in Fukuyama's thesis. Perhaps the most crucial was the possibility that, as he recognized himself, 'the present trend toward democracy is a cyclical phenomenon,' in other words that we had not necessarily arrived at the end of the journey in the nineteen-eighties at all. After all, only a decade earlier, many people (and by no means all of them on the left) shared Khrushchev's belief that it was Communism that was destined to triumph, a view much encouraged by the American defeat in Vietnam. Fukuyama himself quoted the warning of a French theorist, Jean-François Revel, in 1983, that 'democracy may, after all, turn out to have been a historical accident, a brief parenthesis that is closing before our eyes.' Three years later, the historian Hugh Thomas pointed out that in the first decade of the twentieth century, democracy's little brother, parliamentary government, had briefly seemed to be winning the intellectual battle worldwide, with elective institutions apparently taking root in countries as unlikely as Russia, Persia, and Japan.9 Unhappily, the history of the twentieth century as a whole turned out to be something more than a simple forward projection of this promising trend.

Second, Fukuyama appeared to sidestep the issue of definition: what is a liberal democracy? To most people, it would be axiomatic that Canada is a prime example of the species. Yet in the nineteen-eighties, the very decade of the 'end of history,' Canada took two fundamental steps that placed considerable limits on untrammelled majority rule.
First of all, the Charter of Rights, adopted in 1982, not only extended the constitutional limits upon parliamentary government in Canada, but considerably increased the power of the judiciary to make decisions that previously had been regarded as essentially political. Next, the Free Trade Agreement of 1988 transferred control over a broad range of trade-related issues to (supposedly) binding arbitration with the United States. It may be said that the ability of a community to impose upon itself the restrictions of the Charter is the quality that distinguishes liberal democracy from mob rule. Equally, the partial abrogation of democratic sovereignty in the Free Trade Agreement could be defended as the surrender of a theoretical power that had proved of little use in the face of protectionist policies south of the border. However, neither innovation was formally endorsed by a majority of Canadian voters: the 1982 constitution was not put to a referendum, while parties opposed to free trade won a majority of the popular vote at the general election of 1988.

The Charter of Rights sought to resolve the grey area between majority rule and minority rights, but as its implications have been explored, it has become increasingly clear that liberal democracy in Canada is still feeling its way towards answering two fundamental questions: what majorities? and whose rights? The political scientist Jean-Pierre Derrienic has pointed out that Article 356 of the Quebec Civil Code requires a two-thirds majority vote to dissolve any legally constituted association within the province. Thus, while a simple majority at the 1995 referendum on sovereignty would have been deemed sufficient to take Quebec out of Canada, the agreement of two-thirds of voting members was required to dissolve a fishing club in Tadoussac. The inconsistency adds a new dimension to Goldwin Smith's flippant definition of Canada as a series of fishing rods tied end to end.10

Most alarming of all is the reflection that Canada has done a great deal more to confront the inherent tensions between popular majorities and individual rights than most other liberal democracies. Around the world, popular perceptions of democracy remain profoundly muddled. If being in love means you never have to say you are sorry, living in a democracy has come to mean that your interests are your rights, and those rights can never be overridden. This is not a healthy theoretical basis for a philosophy of governance under attack from global terrorism.
The third weakness in Fukuyama's analysis is that he appeals not so much to the triumph of an intellectual argument about the merits of democracy as to a practical trend in its favour. He calculated, for instance, that in 1960 there were thirty-six democracies around the world, a figure that had grown to sixty-one by the late nineteen-eighties. This statistic benefits from the simple fact that decolonization had created many more states in the interim. Moreover, democracy had made headway in many countries not simply for its moral and intellectual value but because its adoption was a precondition for gaining economic benefits from the United States and the European Community. Fukuyama remarks that his readers 'live in stable, long-lasting liberal democracies,' but by his own classification, only three states – the United States, France, and Switzerland – can trace their existence as liberal democracies as far back as 1790.11 Women and black people might challenge the liberal-democratic credentials of the first example, while stability was not the most obvious quality of French political life in 1790 and indeed at half a dozen subsequent dates. A theory that pivots on Switzerland is hardly sufficient to establish the deep-rooted normality of Fukuyama's global trend towards democracy.

We can only begin to answer questions of this kind if we confront the subjective meaning of adjectives such as 'long-lasting.' What constitutes a 'long time' in history? Despite the warnings of Seeley, we continue to regard nation-states as the norm. Like Fukuyama, we take for granted that it is no accident that the most successful of them are democracies. We regard it as normal that most people live in large cities, that Western firepower and Western values will dominate the planet. In Britain and the United States and across much of Canada, it seems natural, even morally obligatory, that everyone should speak the international language, English. If asked to identify the defining characteristics of their countries, most Americans would probably refer to world leadership while many in Britain and Canada would appeal to the decent and caring qualities of a welfare state. Perhaps confusingly, each of these defining characteristics is simultaneously regarded as a modern point of arrival, evidence of the superiority of our own times, and as an inheritance from a successful, civilized past. Here we are now and hither we have been heading for a long time.
If we go back two hundred and fifty years from the start of the twenty-first century, we find ourselves in a very different world, one that challenges most of our assumptions of normality. In 1750, most European countries were absolutist monarchies, nation-states only doubtfully, and if so largely by accident. Germany 'consisted of nearly as many separate states as there are days in the year.'12 The United States did not exist, and the American colonists showed no interest in its creation: it was left to the British government to convene a pioneer but fruitless intercolonial congress at Albany in 1754. The first major British town in what is now mainland Canada, Halifax, was founded in 1749. Europeans knew almost nothing of Australia and New Zealand. In South Africa, it would be twenty years before roving Dutch farmers collided with their first powerful African neighbours, the Xhosa, on the eastern frontier of the Cape colony. The British occupied only a few trading posts in India, and those by courtesy of indigenous rulers. The Chinese regarded all outsiders as barbarian inferiors, as the first British trade mission was to learn when it got to Beijing forty years later in 1793. The Japanese walled themselves off so effectively from the outside world that it was not until 1808 that their government learned of the independence of the United States.13

In 1750 the Union of England with Scotland was barely half a century old, although it had successfully weathered the serious shock of the Jacobite uprising in 1745–6. Sir Robert Walpole, Britain's first prime minister, had been ousted in 1742, and many of his critics were disappointed that his fall from power had not been accompanied by the traditional ritual of beheading. (Walpole's long tenure of power remained a record for any parliamentary system until Mackenzie King overtook him in 1948.) The British were brash rather than confident. It was the decade of 'Rule Britannia,' with words by a graduate of Edinburgh University that proclaimed their most positive aim as a refusal to become slaves. Thirty years later, the first edition of the Encyclopaedia Britannica (another Scots project) had to admit that as a national tongue, English was 'less known in every foreign country, than any other language in Europe.'14

Above all, it was a rural world. With around half a million people, London was a gigantic anomaly, three times the size of Dublin and ten times larger than any other town in Britain. Throughout the Atlantic world, cities were minuscule.
The Boston of 1773, the year of the famous Tea Party, had about the same population as Truro, Nova Scotia, or Midland, Ontario, two centuries later. Several of the major urban concentrations that dominate Britain and Canada today, Birmingham and Toronto, Manchester and Vancouver, were still little more than villages or Amerindian encampments. Indeed, if we reorient our world picture to think of large-scale urbanization as a relatively recent phenomenon, we might begin to ask whether every single manifestation of urban growth is necessarily a one-way journey.

'Belfast is a big city,' wrote the surrealist comedian Spike Milligan in Ulster's quieter times of 1963. 'At one time it was quite small, even worse, there has been an occasion when there was no Belfast City at all. Thank heaven, those days are gone and there is now a plentiful supply of Belfast.' Milligan's gentle invocation of the absurd contains more than a germ of historical truth. Some sort of Belfast has existed for hundreds of years, but the town hardly registers in history textbooks before the eighteenth century: a local census in 1659 counted just 589 people. In 1801 the population was 19,000, a figure that had more than trebled by the time a fifty-two-year-old Presbyterian minister celebrated Belfast as a symbol of Protestant success forty years later: 'I remember it almost a village. But what a glorious sight does it now present.'15 Even then, Belfast had barely begun to show its potential. As shipbuilding ousted the supremacy of the linen trade in the half-century after its introduction in 1857, the city's population rocketed above a third of a million. In complex ways, the changing economic base of the city brought about a shift in its internal demography. Not only did Belfast become more Protestant, but political leadership shifted away from the Episcopalians of the Church of Ireland, a minority and mainly landowning community spread across the whole island, towards the more commercial Presbyterians, Irish of Scots descent who were concentrated in the northern province of Ulster. It was those Presbyterians who provided the driving force that enables us to locate in time the Partition of Ireland, an episode that belongs to the decade of the First World War, essentially the product of half a century of rapid growth of wealth and confidence in one industrial metropolis.
The political rupture of Ireland between 1920 and 1922 was largely a Belfast-driven process, an objective that the city did not seek earlier and one that it would probably have been unable to impose at a later stage in the twentieth century as its economic dynamism appeared to sag.16 With or without the endemic hatreds of the city's sectarian ghettos, it must be open to doubt whether there will always be such a plentiful supply of Belfast in the future.

Once we begin to think of the processes and institutions of our own time as creations of particular moments in the past, and maybe of a recent past at that, we can confront our own assumptions that the mere fact of their survival today confers upon them some form of historical squatter's right to continuity. In the nineteen-eighties, one of the most painful aspects of deindustrialization in Britain was the virtual closure of the coal-mining industry. Predictably enough, the running down of the industry was resisted, and for some the process seemed to offend the very laws of nature. Coal-mining had indeed been carried on in Britain for many centuries – so ancient were its origins that in Scotland miners remained in legal serfdom until 1799 – but not until the nineteenth century did it become a staple industry: the 1.1 million workforce in coal-mining in 1913 was twenty-five times larger than its predecessor of 1800. The closure of pits stimulated much sentimental rhetoric about the destruction of communities, implicitly claiming that length of past history had conferred upon them a right to indefinite future survival. Was this defensible? The social historian John Benson regards as 'typical' the Durham mining village of Hetton-le-Hole. Hetton dates from 1819, yet 1819 is hardly an eternity ago.17

As with Hetton and the British coal industry, so with democracy and the welfare state. Determining exactly when Britain or Canada became full democracies might make for an agreeable scholarly party game. In Britain, the Reform Act of 1884 marked the point at which the majority of the adult male population could exercise the right to vote, but it was not until 1928 that women acquired the franchise on the same basis as men. Second votes, in respect of business premises and some university degrees, remained until 1949. In Canadian history, parliamentary government was established well in advance of a mass electorate: in the province of Canada in the eighteen-sixties, only about half the adult males could vote (and by no means all did). Canada may have been a 'new' society, but there was no onward and upward movement towards democracy.
Indeed, during the preceding decade, several colonies had formally closed legal loopholes that permitted the occasional female to qualify for the franchise. Canadian women acquired the vote at Dominion elections in 1918, but they had to wait until 1940 for the same rights in the province of Quebec. Status Indians were excluded from the political process until 1960. In short, full democracy as we know it in Britain and in Canada is an achievement of the twentieth century – and, arguably, relatively far into that century. The flowering of the welfare state in both countries is an even more recent development: Britain's National Health Service was inaugurated in 1948, while Canada's Medicare dates from 1965. Of course as citizens we are entitled to regard both democracy and the welfare state as among the greatest achievements of the twentieth century. Still, it does not follow that as historians we have any right to insist that institutions so recently crafted in our own countries must be regarded as eternal norms applicable to the whole world. To note that democracy is a recent, although we may hope enduring, development may render us more humble in assessing its potential for export.

Unfortunately, it does not follow that the corollary is true: mere antiquity is not enough to ensure indefinite survival into the future. The first major Viking raid on Britain was the unexpected and devastating sack of the great monastery on the offshore island of Lindisfarne in 793. The scholar Alcuin of York was not merely horrified. He could hardly bring himself to believe that such a disruption of continuity could occur. 'It has been nearly three hundred and fifty years that we and our fathers have lived in this most beautiful land and never before has such a terror appeared in Britain and never was such a landing from the sea thought possible.'18 Canada's Native peoples also rudely experienced the transience of eternity. The Micmac were promised in 1761 that the British would protect them for 'as long as the sun and moon shall endure.' In the case of Manitoulin Island, where a similar promise was made to the Ojibwa in 1836, this translated into a timespan of just twenty-six years. For Canada as a whole, the nineteen-nineties saw the suspension of one of the country's most basic and time-honoured economic activities, the Atlantic cod fishery. The east-coast fishery represented the longest and oldest activity carried on by people of European descent in Canadian waters, dating back at least to Cabot's voyage in 1497.
It was an article of faith that however else Canada might disappoint those who dreamed of instant riches, the fishery would be a guaranteed source of eternal wealth. 'The fecundity of the ocean may be estimated,' said Joseph Howe in 1865, 'by the fact that the roes of thirty codfish annually replace all the fish that are taken ... on the banks of Newfoundland.'19 It was an almost unbelievable act of discontinuity when, in 1992, the commercial fishery was closed. By the perspectives of Canadian history, an activity pursued since 1497 would seem to have acquired the quality of permanent security that Alcuin of York evidently assumed was owed to the monks on the island of Lindisfarne. Yet five hundred years may seem an impressive span of time only because the study of Canada's past is insulated from the much longer-lasting histories of countries such as Greece or China, in much the same way that the people of Lilliput never thought of themselves as small until they met Gulliver. It can be a sobering reminder of transience to appreciate that the time span of Canada's recorded history has only recently exceeded that of long-vanished Roman Britain.20

So what do we mean by a 'long time' in history? 'Two centuries is a very long time in the history of any country,' writes the Canadian intellectual John Ralston Saul. 'In human terms a century and a half is a long time,' says Golo Mann of German history between 1648 and 1789. It is not necessarily so in all cultures. In the late nineteen-seventies, there was a lively debate in Ireland over a remarkable Viking settlement at Wood Quay in Dublin, discovered when the site was cleared for the construction of a new municipal building. Campaigners demanded that Wood Quay be preserved, arguing that because Vikings had occupied the place for three hundred years, it was part of Ireland's heritage. Unfortunately, three hundred years was exactly the span of time that Protestants had been settled in Ulster, and the nationalist mentality in the south of Ireland was only slowly escaping from massive denial of their existence. In the circumstances, three hundred years was perhaps the worst possible definition of a 'long time': honouring Dublin's Vikings might imply acceptance of Belfast's Protestants.21

In 1831, that unusual combination of clergyman and wit Sydney Smith pronounced thirty years to be 'an eternity in politics.' The writer Peter C. Newman believes that Canada was fundamentally changed by a revolution of an unpleasant character during the decade that ended in 1995.
Looking back from that year, he remarked that '1985 now seems as distant as the Boer War.' 'Ten years is a long time,' remarks the medieval historian Maurice Keen of Richard II's banishment of Henry Bolingbroke in 1398. 'Five years is a long time in human life,' said Winston Churchill as he marked the anniversary of his premiership in the week that saw victory in Europe in 1945. Harold Wilson memorably remarked that a week was a long time in politics.22 In August 1914, time became even more compacted. On the fourth of August, the pacifist philosopher Bertrand Russell tried to mobilize intellectual opinion in favour of neutrality. He believed he had enlisted the support of the newspaper editor H.W. Massingham. Twenty-four hours later, with Britain actually at war, Massingham wrote to Russell, reversing his position. His letter began: 'Today is not yesterday ...'23

Evidently, the notion of a 'long time' is highly subjective, but getting to grips with the concept may be highly important to our own prospects for survival. The belief that fifty years without global war proves that nuclear weapons are safe places a dangerous emphasis on the assumption that those fifty years represented historical normality. In many respects, an epoch in which the whole world was dominated by a Manichean confrontation between two superpowers was highly atypical.

Nor are historians alone in the need to determine just how long is a long time. Meteorological observation has established that each year a 'hole' appears in the ozone layer above the Antarctic. There is also evidence that world temperature levels are on the increase. Yet when it comes to forging causal links that might account for these phenomena, the greatest scientific brains are aware that their explanations may be no more than post hoc, ergo propter hoc. For all we know, the ozone layer may wax and wane over many centuries, while temperature ranges have probably always been subject to cyclical fluctuation. Observation over a much longer period is needed to establish conclusively whether it is our pollution of the planet that causes these changes – and even then it will probably still be a matter of value judgment as to just how many decades are needed to settle the issues one way or the other. The truth is that all disciplines, whether scientific or humanistic, would gain from a serious historical appraisal of time.
It is surely curious that historians have so little to say about the nature and interpretation of something so basic to their studies. Literature departments in universities are full of people obsessed with the nature of text, while the concept of space is central to the work of geographers. Yet most historians are content to retreat into their specialized 'period' of research, seeming not to care whether they are swimming in an ocean or a rock pool.

One of the few exceptions was Fernand Braudel, who offered a threefold classification of time. First, there is the underlying concept of geographical time, in which continents drift and mountains rise and wane at a pace imperceptible to human observation. Then there is a social time, the 'gentle rhythms' and 'deep-running currents' in which change occurs over generations. Lastly, there are the 'short, sharp, nervous vibrations' that make up individual time. Thus, time resembles Ol' Man River: the ripples of individual time are of little importance, for below the surface social time keeps rolling along, while deeper down the riverbed may be carving a wholly new course which none of us will live to see.24

Unfortunately, technological change has compacted Braudel's three categories into one worrisome amalgam. Continents may not be drifting any faster across the earth's crust, but jet travel, satellite communications, and e-mail have virtually imploded the process by which we experience the relationship between time and space. Scientific advance has similarly speeded up social change: the Pill revolutionized family life, the microchip is transforming the economy. The shattering of Braudelian categories of time should open up new vistas of discourse for historians. This does not mean that we should abandon our archives and our footnotes. Rather, we must harness our scholarly methodology to understanding the new fluidity of time both as quantity and as concept in order to cope with the turmoil of constant confrontation with the future.

Because the passage of time is a subjective experience, what seems a long time at one phase may be very short at another. The Second World War was indeed a long and gruelling experience for those who lived through it, the more so as they lacked our retrospective knowledge that it would last for six years and no more: George Orwell's Nineteen Eighty-Four catches the despair of a people who expected war to drag on for decades. Yet a case can be made that the identical period of six years that immediately preceded the outbreak of war in 1939 should be seen as a remarkably short historical preface.
Hitler burst upon the scene in 1933 and set in motion his European war almost immediately after he had come to power.

The losers of history usually have longer memories than the winners. Patrick O'Farrell remarks that 'there is a sense in which Ireland has no history ... [A]ll history has remained current affairs.' In 1865, a nationalist politician, John Blake Dillon, told a parliamentary enquiry that relations between landlord and tenant had been affected by 'very recent ... and very extensive confiscations' of Irish property. The confiscations, it transpired, had occurred in the time of Cromwell, two hundred years earlier. 'Any act that has an important bearing upon the present condition of the country,' Dillon explained, 'is sufficiently recent to justify me in describing it as very recent.'25 On a visit to Scotland in 1936, the Australian politician Robert Menzies encountered a guide at Jedburgh Abbey 'who spoke of attacks made on the Abbey 500 years ago with an indignant glint in her eye, just as if she was describing something that had happened to her own family the day before.'26

Contrasting attitudes to the subjective passage of time lie at the heart of the dispute between Argentina and Britain over the Falkland Islands. There is in fact a respectable British claim to sovereignty over the islands that antedates the slight Argentinian pretensions to occupation between 1820 and 1832. Essentially, however, the British case rests upon the fact that the islands have been under British rule since 1833, and by implication that 1833 is a long time ago. 'The Administration of the Falkland Islands by Britain has now continued for 164 years,' as an official document put it in 1997. By contrast, it suits some Argentine politicians to treat the reappearance of the British in 1833 as a recent event. In 1998, the Buenos Aires government submitted to a United Nations committee a petition from a citizen who claimed special standing in the dispute because she was the great-great-granddaughter of the only Argentinian official ever to set foot on the Falklands.27 One can only wonder how many Mexican citizens regard the loss of Texas as a near-contemporary injustice.

Subjectivity is also evident in the selection of arguments based on the march of time. Goldwin Smith combined a dogmatic sense of his own infallibility with an astonishing ability to change his mind, notably about the destiny of Canada.
In 1877 he was on one of his swings towards continentalism, arguing for the inevitability of the annexation of Canada to the United States. Inconveniently, there was no sign that Canadians actually wished for such an outcome. To override this omission, he invoked the way in which the 'great forces' of history could be diverted 'by the action of secondary forces.' One 'remarkable instance' was 'the long postponement of the union of Scotland with England.' Smith attributed this to 'the antipathies resulting from the abortive attempt of Edward I' to conquer the northern kingdom, followed by 'the subsequent train of historical accidents, such as the absorption of England in continental or civil wars.' This was indeed a neat way of dismissing four whole centuries that followed the battle of Bannockburn in 1314. However, 'the union came at last, and having the great forces on its side, it came for ever.' Thus, to Goldwin Smith, normality was to be found in the 170 years that followed 1707 ('for ever'), not in the hundreds of years during which historical inevitability had consistently suffered 'postponement.'

More recently, three Edinburgh scholars also set the demand for a Scottish parliament in the context of the Anglo-Scots Union, but in a way that evidently did not occur to Goldwin Smith. They too discerned a long historical symmetry, arguing that pressure for devolution represents continuity of 'a belief that Scottish interests have been best served when the country ties its destiny to its more powerful neighbours. The main novelty is that these neighbours are now seen as lying beyond England.'28 With the establishment of a Scottish parliament in 1999, it may be necessary to turn Goldwin Smith on his head, and conclude that it was the Anglo-Scots Union that constituted the transient diversion from a larger process of European integration. As an intellectual somersault, it will be no more athletic than some of the contortions that Smith performed himself.

Goldwin Smith was not alone in his convolutions. It is tempting to swivel the telescope around when we look at the past. Are we, for instance, close to the world of 1500 or distant from it? Conventional phraseology suggests that we distance ourselves from the past: 'we are not living in the Middle Ages' is our coded cliché for repudiating barbarism.
One of the boards responsible for administering school examinations in England reported in 1996 that 52,000 students had entered for its paper on world history, which dealt exclusively with the twentieth century, but only 97 for the one-time staple period of 'Tudors and Stuarts.' Even the relatively near past of three hundred years ago was vanishing into the mist.29 In 1999, there were further proposals to modify Britain's National Curriculum in order to spare pupils the burden of dates and facts altogether. Children would be permitted to study Elizabeth I's costume but be denied any reason to suspect that she might have helped to shape the English national experience. Instead, a past unscaffolded by dates or facts is turned into a sanitized tourist pantomime.

In Britain, commendable enthusiasts of the Fellowship of the Sealed Knot don Roundhead and Cavalier costumes and re-enact the battles between Charles I and Cromwell, complete with musket fire and gunsmoke. No blood is shed, there are no massacres of the vanquished, and no prisoners are sold into slavery in the Caribbean. Spectators are entitled to feel that they have witnessed something of the reality of the seventeenth-century Civil War, but they have certainly not experienced all of it. Unless we grasp the horror of those conflicts, we cannot begin to appreciate why, in the century after 1660, the English and most of the Scots sought above all to create strong and (in elite terms) consensual political structures that would preclude recourse to internal warfare. In short, by sanitizing the past, we rob it of its own reality. At Louisbourg in Nova Scotia, Parks Canada serves tourists eighteenth-century food on pewter dishes. It does, however, provide modern washrooms, and it has yet to promote the Cape Breton economy by re-enacting on the Louisbourg quayside the gruesomely protracted public executions that were once carried out there during the French colonial regime.30

An academic minority has no right to condemn these superbly professional operations. The costumed attendants who send us on our way by urging us to 'have a nice day' are by no means the most objectionable part of the leisure industry, and heritage commemoration is at least a start in making people aware of the past. The problem is that it is an incomplete past, one that has been forced into dialogue on our terms, a past laundered to leave out the nasty bits.

Historians resemble astronomers: they provide glimpses of distant worlds and, like astronomers, their claim to public respect lies in their ability to penetrate extreme remoteness. Yet the most appropriate role for historians today perhaps lies in explaining not that the past is far away, but rather that in so many ways much of it remains frighteningly close.
The romantic distancing of a past re-created to fit our own requirements protects us from the need to question the origins of our own attitudes and values. No such self-examination is required, we implicitly tell ourselves, for we live at the culmination of history. Our values are the appropriate values, we are civilized people who have arrived at the end of the journey and have no need for either introspection or retrospection. This is a dangerous complacency. We may not be living in the Middle Ages, but – assuming for one moment that the Middle Ages were indeed more barbaric than the century of
Hiroshima and Kosovo – can we be so sure that the Middle Ages are not living in us? A plaque at Edinburgh Castle commemorates unknown hundreds of women who were burned to death nearby for the alleged crime of witchcraft. Although much smaller than the adjacent statue of that symbol of twentieth-century rationality, Field Marshal Sir Douglas Haig, it is an honourable tribute to the victims of persecution. Yet that plaque implicitly conveys a double message: the present is tolerant and decent; the past was a remote and different world altogether. Can we be sure that the impulse behind the witch hunt has been eradicated from our decent societies? During the second half of the twentieth century, most Western countries have seen a struggle by enlightened liberal opinion against attempts by the Catholic Church to obstruct access to contraception and divorce, campaigns which as a citizen I have strongly supported and do not regret. We liberals have believed that our values reflect our point of arrival, that there was no place for clerical domination of twentieth-century bedrooms. Yet I cannot be sure that my enlightened political position was not in some way coloured by the undercurrent of anti-Catholicism that has characterized much of English society since the Reformation which gave it birth. It was a sentiment that was certainly current in my childhood in London, and has still not entirely vanished from some parts of Scotland. The most lavish sequence in the hilarious and wacky film Monty Python’s The Meaning of Life, the song-and-dance routine ‘Every Sperm Is Sacred,’ was a bitter satire on the whole Catholic Church, from prancing cardinals to high-kicking nuns. The sequence demonstrated a clear link between modern liberalism and long-standing sectarian prejudice, and it is far from clear which sentiment was validating the other.
In their defence, it can be said that critics of Catholic sexual morality were irked by the Church’s claim to eternal verity backed by long institutional survival. However, by assuming that the past was too remote to matter, those critics may have robbed themselves of the self-knowledge to analyse the extent to which they too had inherited their antagonism. ‘The past is a foreign country,’ remarked the novelist L.P. Hartley. ‘They do things differently there.’31 More to the point, they think differently. We are not particularly surprised to encounter James I claiming
that ‘kings are not only God’s lieutenants upon earth ... but even by God himself they are called gods,’ though we suspect that, even in 1610, that was a little bit over the top.32 Still, we can shrug and reflect that people thought that way in an era that was, so we assume, very long ago. It comes as rather more of a shock to find George VI, father of Elizabeth II, thanking God for a good people, suggesting attitudes towards divine intervention and social cohesion that we find unexpected and bizarre in a year apparently so recent as 1940. In a hackneyed barb, George Bernard Shaw described Britain and the United States as two countries divided by a common language. Likewise, we are alive to the possibilities of misunderstanding across different cultures, but blithely ignore the dangers of hearing the wrong messages from our own past. As an undergraduate at Oxford in the eighteen-forties, the young Lord Dufferin decided to befriend a student from the Middle East and, in a revealing piece of cultural stereotyping, thought that he might wish to learn about Henry VIII and ‘the history of his seven wives.’ (Dufferin never could resist exaggeration.) The overseas student was indeed interested ‘and was particularly inquisitive into the number which his majesty had at once.’33 We may experience just as much incomprehension dealing with our own past. For instance, a mass public education system was one of the achievements that modern Canada has inherited from the mid-nineteenth-century British North American colonies, and here it would seem that their aims and ours must have been the same.
But in Nova Scotia in the eighteen-sixties, one argument for universal free schooling was that ‘it wars with that greatest, meanest foe of all social advancement – the isolation of selfish individuality.’ The function of education was to make ‘each man feel that the welfare of the whole society is his welfare – that collective interests are first in order of importance and duty, and that separate interests are second.’34 Of course, this stress upon society over the individual (irrespective of gender) is diametrically opposed to most modern educational theory, which places much greater emphasis upon personal development and self-expression. Nor is this the only example where the Canadian past may seem to resemble the Canadian present that has inherited its institutions while in reality being animated by very different values of modern times.
When tuning into the messages from the Canadian past, it is easy to blot out areas that we find incomprehensible or embarrassing. One of these is loyalty to the British connection and especially to the monarchy. Another is religion, particularly when manifested as sectarianism. It is tempting to write off attachment to Britain as a coded expression of that essential quality of being not-American that forms the core of the Canadian identity. In the Maritimes, where the political culture was set by Loyalists who were several generations from the old homeland, there may be something in this interpretation. However, the danger of explaining away Canadian enthusiasm for Britain in this comfortable way is that it fails to take account of the fact that, until the explosion of a more multicultural Canada after the Second World War, large numbers of English-speaking Canadians had either been born in Britain or were the children of immigrants who reared them to believe that the centre of Canada’s universe was located somewhere external to the country. Bizarre it may seem to some among us today, but devotion to the British monarchy was indeed a genuine sentiment. Joseph Howe began a two-hour denunciation of Confederation at Yarmouth in Nova Scotia in 1866 with ‘an eloquent eulogy on the character of Her Majesty Queen Victoria, as a child, a wife and mother, a queen and a widow.’ The Queen herself praised just this quality in her colonial subjects: ‘You have all exhibited so much loyalty,’ she told Macdonald when Confederation was achieved in 1867. Canada’s first prime minister replied: ‘We have desired in this measure to declare in the most solemn & emphatic manner our resolve to live under the sovereignty of Your Majesty and your family forever.’ ‘I could die for her!’ exclaimed a Canadian officer after meeting his sovereign in 1900 on his way to fight in the Boer War, where several hundred Canadians did indeed make the ultimate sacrifice.
When Queen Victoria’s long life came to its end soon afterwards, some of her colonial subjects were surprised to realize that she was not immortal. ‘Who ever thought that Queen Victoria could die?’ Maud Montgomery confided to her diary in January 1901. ‘The sense of loss seems almost personal.’35 Attachment to the monarchy transferred itself to Victoria’s descendants. The future King George V and his wife toured Canada in 1902. As the royal train passed through Aurora, a small boy named Lester Pearson was hoisted on to his father’s shoulders to get a better view.
Fifty years later, as Canada’s external affairs minister, he recounted the episode to Queen Mary. Gravely she assured him that she recalled the greetings of the good people of Aurora, but courteously apologized that she could not recall the small boy sitting on his father’s shoulders. In 1906, the Canadian parliament invited Edward VII to make a return visit (he had toured British North America as a youth in 1860) in order that Canadians might express their ‘profound admiration’ at first hand for his ‘kingly virtues and truly humanitarian deeds,’ a highly misguided but undoubtedly sincere sketch of King Edward’s character.36 Canadians were not in fact to set eyes on a reigning monarch until George VI crossed the country in an astonishing torrent of emotion in 1939. Some of the peaks of that royal tour were celebrations of nostalgia by exiles: it was a Scotsman who was heard regretting that Hitler was not on hand to witness the imperial solidarity of the ecstatic crowd at the dedication of the national war memorial in Ottawa. However, Canada’s affair with the monarchy was something more than a device to maintain immigrant roots. ‘French Canada has gone slightly mad,’ commented the governor general, Lord Tweedsmuir. Mackenzie King’s Quebec ally, Raoul Dandurand, took pride in addressing his sovereign in the language of Champlain and, in a backhanded allusion to the king’s embarrassing stutter, one Quebec commentator thought that George VI expressed himself in French more effectively than in English.37 Canada’s affair with the monarchy may perplex those who feel that we have arrived at a logical end to our travels through time, and so no longer need such flummeries. Unfortunately, we cannot tune into the wavelength of the Canadian past unless we recognize that until recent times the British monarchy formed a real element in a Canadian world picture.
Religion, too, was something that mattered to earlier generations: indeed, to many, the precise brand or sect was all-important. ‘The Feud between the protestants and their Catholic fellow subjects grows stronger every hour,’ wrote Susanna Moodie in 1856. She even feared that the confrontation would end in civil war. The Toronto Globe was only too happy to fan the flames. On the fifth of November that year, Guy Fawkes Day, it marked the 251st anniversary of a plot by English Catholics to kill King James I with a vigorous reminder that the Church of Rome was ‘a hungry horse leech,’ clinging to ‘her impudent assumption of sovereignty, temporal and spiritual.’ Historians look back on the Globe’s campaign for representation by population and assimilate it to a modern democratic belief in fair representation through equal electoral districts. This misses the point that the Globe saw redistribution of political power primarily as the means by which ‘the deep schemes of Romish Priestcraft are to be baffled.’38 The paranoid conspiracy theorists of the mid-Victorian Toronto Globe might well not be surprised could they know that, with the fleeting exception of Kim Campbell, every prime minister of Canada since Trudeau has been a member of the Catholic Church. However, they would be astonished to learn that the people of Canada neither knew nor cared about this record of Romish ascendancy. In this respect, we may indeed feel that our values are superior to those of the nineteenth century, but our self-satisfaction does not entitle us to blot out their world picture from our historical re-creation. From a modern perspective, Canadian politics in the era of Confederation sometimes appear as little more than a squalid scramble for railway subsidies. Only by appreciating just how great was the risk of unbridgeable division into sectarian warfare can we appreciate the notable success of the Canadian political system in motivating diverse groups to work together at all. Similarly, those concerned with the survival of Canada’s bicultural partnership will look back on the conscription crisis of 1917 and see it, in the words of two distinguished modern historians, as a ‘clash of nationalisms.’ Yet, as already noted in chapter 5, there was another dimension to the disagreement between French and English over the extent of participation in the First World War, at least in the mind of a leading Methodist, Dr Dwight Chown.
‘Is it fair to leave the province of Quebec to retain its strength in numbers,’ he asked, ‘ready for any political or military aggression in the future, while our Protestants go forth to slaughter and decimation?’39 In a recent study of the process that brought Newfoundland into Confederation, the historian Raymond B. Blake seeks to play down the role of sectarianism in the crucial second referendum of 1948, and probably rightly so. (The second referendum was a run-off between Confederation and a return to self-government, after a third option, continued direct rule from London, had been eliminated in the first vote.) There was an undercurrent of confrontation between Catholics
and Protestants, with the former being counselled by their archbishop to vote against joining Canada. However, while the island’s population was two-thirds Protestant, Confederation scraped home with just 52 per cent of the popular vote. As so often in Newfoundland history, the picture was more complicated than a straightforward sectarian head-to-head would posit. To mobilize Protestants to vote ‘yes’ in the crucial second referendum, campaign leader Joey Smallwood delivered a radio address in which he simply read lists of outports that had voted against Confederation first time around, followed by those that had voted in favour. He did not mention that the first group predominantly consisted of Catholic communities, while the second was overwhelmingly Protestant. In the context of the Newfoundland of 1948, where religion provided an automatic means of labelling both people and places, he did not need to do so. Thus, a later and (so we should insist) more enlightened generation might well miss that crucial message from a past that is very recent, interpreting Smallwood’s lists as evidence of his love of Newfoundland locality, not to mention his attachment to oratorical verbiage.40 We may be closer to the past than we allow ourselves to assume when we isolate and sanitize it behind the thick plate glass of a museum display. Yet that uncomfortably close past was driven by values that we may not share, do not always understand, and may even fail to notice. If we may find our present in much greater temporal proximity to the past than we have chosen to accept, to what extent can we, or should we, even try to conduct a dialogue in ethical terms? Most scholars would instinctively counsel against even making the attempt; in principle, no doubt rightly so, if only because there is not much fun in a dialogue in which one interlocutor cannot answer back.
Writing in 1951, Herbert Butterfield warned that leaving ‘the realm of explanation and description’ and entering into ‘the world of moral judgments’ placed the historian upon a slippery path. ‘The morality comes to be worked into the organisation of the narrative and the structure of the historical scene in a manner that is illegitimate.’ In this, he reflected the assumptions of a conservative Englishman of his day that morality was a worthy but inconvenient sentiment that could be turned on like a tap, and was best prevented from gushing. In an ideal world we
should conform to Butterfield’s demand, but unfortunately few of us manage to be consistent in achieving it. As C. Behan McCullagh has recently observed, ‘it is difficult to avoid moral judgements altogether as so many of the words we use have moral overtones, suggesting at least approval or disapproval.’41 At the very least we should be sensitive to the fact that the past was animated by ethical values different from those prevailing in liberal Western societies today. More fundamentally, historians need to confront the fact that we often interpret the past through the filter of our own values, even when we pretend to be ethically neutral. For a cautionary tale of the difference between history and hell-fire sermonizing, we need only turn to the lectures delivered by Charles Kingsley as Professor of History at Cambridge in the eighteen-sixties. Kingsley lectured on American history, a surprisingly modern subject, but his aim was distinctly Victorian, as he told his audience in his closing lecture: ‘[I]f I can have convinced you that well-doing and ill-doing are rewarded and punished in this world, as well as in the world to come, I shall have done you more good than if I had crammed your minds with many dates and facts from Modern History.’ Disillusioned modern university teachers might well be impressed by Kingsley’s influence on students: wonderful to relate, it was even reported that his lecture class went off to the library and sought out books ‘which undergraduates had never asked for before.’ However, and astonishing though it may seem today, his major impact was not educational but moral. Students were said to have left his lectures proclaiming, ‘Kingsley is right – I’m wrong – my life has been a cowardly life – I’ll turn over a new leaf, so help me God.’42 From the perspective of our fleeting present, the oddest aspect of Kingsley’s moralizing was that he was a fervent supporter of the South in the American Civil War.
Kingsley was untroubled by the detail that Southerners were slave-owners – even though most Victorians were anti-slavery. There are always dangers in back-projecting terminology from later political vocabulary, but it is unlikely that Kingsley himself would have objected to being branded as a racist. Indeed, he would probably have gloried in both the concept and the term. A superficial reading of Darwin persuaded him of ‘the immense importance of Race,’ and he simply dismissed any claim to humanity on the part of
inconvenient non-white peoples, such as the Native American (‘not a natural man’) or the Dyaks of Borneo (‘beast life’). The only interesting twist in his colour-prejudiced world view was the way in which he managed to classify the Irish as ‘human chimpanzees.’ Revealingly he added: ‘[I]f they were black one would not feel it so much, but their skins ... are as white as ours.’ (The rest of his analysis of Irish ethnicity is probably too offensive nowadays even to quote.)43 More recent historians have generally learned the lesson of Kingsley’s inappropriate moralizing and sought to avoid pronouncing ethical verdicts on the past. Humility, not to mention prudence, suggests that the values of each fleeting today may be condemned as pompous or vicious in the tomorrows that lie fifty years ahead, and so it is better to avoid judgments of any kind. For instance, few denunciations of the Nazis have been so comprehensive, so angry, and so devastating as William L. Shirer’s Rise and Fall of the Third Reich, first published in 1960. An American journalist based in Berlin during the nineteen-thirties, Shirer made no effort to hide his contempt for ‘the weird assortment of misfits’ who swarmed around Hitler, and who among us would wish to demur? But Shirer was equally outspoken in proclaiming that many top Nazis were ‘homosexual perverts.’44 A later generation may feel that if their sexuality had been accepted, many of them would have been neither misfits nor Nazis at all. We endorse Shirer’s condemnation of the political philosophy of the Nazi leaders, but we reject his disgust at their private lives – a distinction that Shirer, in his generation, might have found impossible. Perhaps, however, a later generation would have preferred to steer away from the issue altogether.
John Cannon has contrasted the emphasis of recent historians upon ‘immediate, ephemeral, and contingent factors’ in seventeenth-century English history with the sweeping theories of Victorians such as S.R. Gardiner, who termed the English Civil War the ‘Puritan Revolution,’ and Macaulay, who saw the conflict as a foreordained step towards the parliamentary government of his own era. Cannon suggests that the shift away from such grand theories reflects ‘a decline in intellectual confidence among twentieth-century British academics.’45 It is probably misleading to invoke the mental neuroses of a declining Britain, but Cannon is insightful in appreciating that an age that prides itself on being value-free is in reality engaged in back-projecting its ideology every bit as energetically as did Kingsley and Gardiner. Far from suffering from a decline in intellectual confidence, we take for granted that our impeccable values are so uncontroversial that they do not have to be defined. In an ironic twist, this sometimes leaves us unable to cope with violent challenges to those values. There was a time when the mere act of terrorism discredited a cause, however noble its inspiration: it is difficult now to grasp the horror felt across English Canada when the Metis leader Louis Riel ordered the shooting of a defiant opponent, Thomas Scott, after a drumhead court martial at the Red River in 1870. Until September eleventh, 2001, there was a tendency to measure the legitimacy of any terror group by the number of people they killed. After all, we are rational people who eschew violence. If others are driven to kill, they too must be assumed to be as rational as ourselves. Hence the bloodier their resort to violence, the greater must be the injustice against which they protest. Thus, the implicit but undefined claim to the superiority of our modern Western way of life, especially its tolerance and acceptance of diversity, has the dangerous effect of leaving us intellectually unable to defend the values that we do indeed most deeply prize. In one of Disraeli’s novels, a character remarked that ‘sensible men are all of the same religion.’ On being challenged to specify the faith that they shared, he replied that sensible men never tell.46 Historians are sensible men and women who share a set of enlightened beliefs while assuring themselves that their writings are simultaneously value-free. For instance, most historians of Ireland are unsympathetic to the Ulster Protestants unless, like A.T.Q.
Stewart or Paul Bew, they are themselves products of that tradition.47 Experienced through the unflattering medium of television, Ulster Protestants do indeed often come across as grim people, the tormented diphthongs in their vowels giving an unpleasantly sharp meaning to the term ‘sound bite.’ Some of them scorn our liberal-democratic belief in the sanctity of majorities by proclaiming that they will tolerate only the wishes of a majority among themselves. A few even defy our secular dogma that religion has no place in politics by voicing sectarian prejudice of a kind that enlightened opinion associates with the seventeenth century, an epoch that historians implicitly consign to a long time ago. They embarrass our cool cynicism towards inherited institutions by proclaiming their
fervent loyalty to the Crown, a sentiment that we have mentally pigeonholed in the sepia-photograph world of the nineteenth century. Such brazen multiple defiance of our liberal consensus seems to leave us with no other intellectual strategy but to reject them as lunatics or hypocrites. Cumulatively, Ulster’s Protestants are portrayed with subtle hostility that would be denounced as racist if directed against blacks and homophobic if applied to lesbians and gays. Enlightened opinion is simultaneously value-free and non-judgmental – but it has no compunction in celebrating the superiority of the values of our own arrival at the end point of a historical process by rejecting anachronistic hangovers from the bad old days of time past. It is no role of the historian to defend bigotry, but surely we have some responsibility to explore why so many Ulster Protestants apparently continue to hold opinions which enlightened good sense revealingly condemns as ‘outdated.’ As already suggested, a Canadian theory of democracy, at least since the adoption of the Charter of Rights in 1982, would point to majority rule as only one side of the coin; minority rights are equally important. There was nothing morally unworthy about the Protestant Irish wish in 1920–2 to retain the British identity that they had enjoyed since 1801, even if by that time governments in London found Ulster, as the Protestant leader Carson sarcastically remarked, ‘so unreasonable that she values her citizenship in the United Kingdom.’48 It would be a daunting task to draw up a moral balance sheet for the Union between Britain and Ireland, and what would be the point? We should still have to confront the basic fact that in 1921 some people in Ireland wished to be British and others did not. In my capacity as a citizen of Great Britain and a resident of County Waterford, I am glad that most of Ireland was able to assert its independence in 1921.
In my work as a historian, I have no right to assume that such an outcome was either inevitable or morally correct. Similarly, while we may deplore Ulster sectarianism, we might be well advised to hesitate before engaging in one-sided condemnation. In 1908, Rome issued the Ne Temere decree, warning Catholics that a marriage was only valid if celebrated in due form by a priest of their own church. One historian has ridiculed the subsequent outcry that ‘the papal viper could wriggle its way into the nuptial bed’ for the very plausible reason that few Ulster marriages crossed the religious line
anyway, although it was alleged that in 1910 a Belfast Catholic abandoned his Protestant wife after being assured by his priest that their marriage could be disregarded as invalid.49 Yet the fact that the reaction to Ne Temere seems disproportionate does not entitle us to ignore the episode altogether. Rather, historians should note that during the crucible years of the early twentieth century that projected a sectarian split on to the map of Ireland, the Catholic Church did nothing to present itself in an ecumenical light. Selective narrative tends to interpret Ulster protestations of loyalty to the Crown as a demand that the British government should be slavishly obedient to Ulster demands. The year 1916 stands out as one of the landmarks in Irish history, a date that even non-specialists can often identify with the nationalist Easter Rising in Dublin. Historians may prefer not to enter into discussion of the ethics of the proclamation of the Irish Republic: nine men signed a document demanding the allegiance of everyone living in Ireland, and much of the violence that has plagued Northern Ireland since 1968 has been carried out in the name of that pure republic. There were 3000 casualties in the Easter Rising, 450 of them killed, thirteen of those by British firing squads. Since it was a military failure (although that was probably not the past future that had been envisaged by the insurgents), the Rising is usually described as a blood sacrifice, by unspoken implication conferring upon the thirteen executed leaders a posthumous moral claim to shape Ireland’s destiny. The Rising’s token socialist leader, James Connolly, a lapsed Catholic married to a Protestant, has acquired a new lease of symbolic life as the embodiment of secular modernity and decency.
To his inspiration is attributed the noble aside in the Proclamation of the Republic which spoke of ‘cherishing all the children of the nation equally.’ That Connolly was reconciled to the church of his birth as he awaited a British firing squad is surely a matter of private judgment which even posterity should respect. However, it is less widely known that Connolly demanded that his wife convert with him. Thus, it is not merely the nationalism of the Easter Rising which dominates the way we have come to see Ireland in 1916, but one particular noble aspect of its motivation, to the exclusion of anything that might tarnish the image. Far less attention is devoted to the carnage that overwhelmed the Ulster Division at the battle of the Somme just two months later,
when units largely composed of Protestants from the paramilitary Ulster Volunteer Force suffered 5500 casualties in two days of fighting. Just as April 1916 in Dublin can only be understood in terms of the religiosity that coupled the terms ‘Easter’ and ‘Rising,’ so it is necessary to grasp that Thiepval Wood was Ulster’s blood sacrifice to the British Crown. As late as 2001, Northern Ireland’s Unionist First Minister forced a showdown in the peace process by submitting his resignation on 1 July, to coincide with the Thiepval commemorations. It is open to historians to suggest that there comes a time when we should draw a line under the moral debts of the past, but surely the moratorium should wipe out the claims of both sides.50 Thus, it is possible to begin to understand the mindset of Ulster’s Protestants. Unfortunately, to do so is to start down a slippery slope which may carry us further than we realize. Just as ‘why?’ trenches upon ‘wherefore?’, so ‘understanding’ carries with it not simply mental comprehension of the logic behind somebody else’s position, but a partial endorsement of its ethical claims. We are expected to ‘understand’ the anger that drives terrorists to kill people: the more people they kill, the angrier they are assumed to be, and the greater therefore must be the injustice against which they protest. Thus, to ‘understand’ is not simply to follow the logic of an argument but to enter into its ethical imperative. (The often-quoted aphorism of the French writer Madame de Staël, that to understand all is to forgive all, is to some extent the product of creative translation that does not fully capture her use of the French term ‘indulgent’: it certainly does not describe her attitude to Napoleon.) The problem becomes more complex when we are confronted with practices from other cultures which outrage our Western sense of decency. Clitoridectomy is widely practised in some parts of Africa.
The practice offended Western observers as diverse as missionaries and feminists who both believed that their superior values entitled them to eradicate the practice. However, Kikuyu resisted abolition – even Jomo Kenyatta, who stands high in the secular version of Disraeli’s religion of all sensible men (and women) – believing that it was ‘the secret aim of those who attack this centuries-old custom to disintegrate their social order in order thereby to hasten their Europeanisation.’51 Thus, two elements of modern Western thought came into conflict: belief in the civilized superiority
of our own ideas on the one hand and value-free respect for the integrity of other cultures on the other. Add a third dimension of time past, and the result can be complicated moral confusion. Unfortunately, it is difficult to write about areas of the human experience which have been managed through systems of morality without to some extent entering into debate with those values. One illuminating example of this process can be found in the work of Ronald Hyam on the relationship between sexual behaviour and the British Empire. His research was soon denounced for its ‘white, male, imperialist-centred approach’ marked by a ‘failure to question patriarchy.’ Untroubled, Hyam replied that he was ‘puzzled’ by the demand that he should ‘express continuous moral outrage’ at the material he handled. Rather, he declared that he would not ‘make simplistic value judgements about people’s sexual interests, unless they have demonstrably adverse behavioural consequences.’ In one respect, however, he made his own position abundantly clear, condemning feminism as ‘so fundamentally hostile to sex’ that it could hardly be ‘useful for understanding the history of sexuality.’52 Thus, it appeared that even if Hyam refused to indulge in ethical judgments, he was not shy about making intellectual pronouncements. Was it possible to maintain the dividing line between the two? As a historian of sex, Hyam adopted the view that human behaviour was highly complex. As a student of the British overseas, he was prepared to accept that some pretty rum things happened on the margins of Empire. The weak point in Hyam’s high-minded stance of moralistic self-denial lies in its allusion to ‘demonstrably adverse behavioural consequences.’ Continuous moral outrage may be inappropriate, but at what point does the religion of all sensible men require the historian to declare that particular actions were undesirable, possessing ‘demonstrably adverse behavioural consequences’?
In any case, on what basis could such a verdict be reached? One of the most basic reasons for disapproving of somebody else’s sexual behaviour is that it is cruel or exploitative, but the historian who takes account of such complaints has taken a step along the road towards ‘understanding’ the point of view of the exploited. In any case, there was a huge value judgment at the core of Hyam’s thesis: on balance, he concluded, sexual relationships ‘between the British and non-Europeans probably did more
long-term good than harm to race relations.’53 The terms ‘good’ and ‘harm’ refer less to ethics than to efficiency: Hyam argues that race relations operated more harmoniously because some British people (mainly male) shared beds with their non-European subjects (usually, but not exclusively, female). Yet the argument for efficiency is itself characteristic of modern thought patterns in which all forms of cleanliness have become the secular equivalent of godliness: harmonious race relations were a Good Thing. It might seem hard to disagree with this worthy sentiment – but were they a Good Thing because it is right that people from different cultures should love each other (in whatever way they may freely choose) or because it was easier to run the British empire if the white man became somebody else’s burden in bed? The tensions within this coalition of values may be seen in Hyam’s handling of the case of Hubert Silberrad, a colonial official who was posted to a remote part of Kenya in 1903. He promptly ‘acquired’ two Kikuyu girls from a departing colleague, who had respected local tradition by paying eighty goats to the local tribe for their services. Although both girls had seemed happy enough with Silberrad’s predecessor, one of them, who was about twelve years of age, ‘was reluctant to be passed on.’ An African policeman supplied a replacement of about the same age, although there was subsequently a disagreement and indeed a scuffle between the two men over her custody. At this point, a local settler couple, the Scoresby Routledges, felt obliged to intervene. Mrs Scoresby Routledge impounded the girls, and her husband commenced a campaign of denunciation. A local judicial investigation criticized Silberrad for ‘poaching’ (in Hyam’s ironic term) from the policeman, and generally disapproved of the episode as ‘detrimental to the interests of good government,’ a mortal sin in the colonial service.
Although Silberrad was quietly eased from the scene, the disapproving Scoresby Routledges engaged in the ultimate weapon of English moral outrage, writing to The Times to ask whether representatives of the Crown should be allowed ‘to withdraw ignorant girls ... from the well-defined lines of tribal life’ for the debasing purposes of sexual gratification. In London, Colonial Office bureaucrats felt that this was a low blow. Such arrangements were ‘exceedingly common’ in Kenya and they complained that it was entirely wrong to imply that Silberrad’s behaviour was ‘as bad as that of a man who commits a similar offence with a young girl in England.’

Although Hyam’s handling of the moral issues is relentlessly neutral, as shown in his use of such noncommittal terms as ‘acquired’ and ‘passed on,’ his portrayal of the Scoresby Routledges is highly partisan. The husband was ‘a self-styled anthropologist’ who devoted his life to ‘enjoying the sport and claiming to study African life and interests.’ Some might conclude that in publicizing the Silberrad case, Scoresby Routledge had proved a commitment to African interests. Hyam, however, condemned him as ‘a Purity-minded, interfering busybody.’ Indeed, the term ‘victim’ is applied to only one participant in the Silberrad case – Silberrad himself. One of Christopher Marlowe’s characters defended himself against a charge of fornication by pleading, ‘[B]ut that was in another country; and besides, the wench is dead.’ Hyam more benignly implies that at least one of the wenches did not seem to mind.54 In short, continuous ethical neutrality proves as difficult to sustain as continuous moral outrage. Worse still, through its potential for ‘understanding,’ it provides a tacit endorsement of one side of the Silberrad case to the exclusion of any other. It is certainly a complication for historical analysis of the Silberrad case that it should be located in the relatively near past, and centres on an issue, sexual exploitation of children, that constitutes a major contemporary concern. It may have happened ‘in another country,’ but it cannot be dismissed as having happened a ‘long time’ ago. Hence, it is all the more important to disentangle the parallelogram of values involved: Victorian morality, African custom, the personnel practices of the British colonial service, and the hidden judgmentalism of continuous scholarly neutrality. The way in which a sense of a ‘long time’ can influence our response to historical issues can be illustrated by comparing British responses to two crises in eastern Europe.
In 1619–20, King James I (Scotland’s James VI) refused to support his son-in-law, the Rhineland princeling Frederick V, in his bid to take the throne of Bohemia. Contemporaries censured James for failing the cause of European Protestantism and for the unnatural act of abandoning his own flesh and blood. In our enlightened times, opinion is left cold by the notion of a Europe divided into armed camps of Protestants and Catholics, and we should probably be inclined to give James some credit for setting his face against nepotism. It seems perfectly sensible to accept that the king of
England (and Scotland) could not exercise any practical influence over events in such a distant land. By contrast, historians cannot help feeling uneasy at Neville Chamberlain’s abandonment of the Czechs in 1938, for much the same motive. No doubt he was correct to call the Sudeten dispute ‘a quarrel in a far-away country between people of whom we know nothing,’55 but somehow our sense of what is fitting is outraged by brutal realism from the leader of a democratic nation in the near past. No such ethical considerations intrude upon our assessment of the policy of King James in similar circumstances, because it all happened so long ago. Ironically, Frederick’s disastrous adventure in Bohemia was indirectly responsible for saddling modern Canada with a huge ethical challenge. Reared in penurious exile, his son, Prince Rupert, became a soldier of fortune, hungry for the crumbs of patronage. In 1670, Rupert sold the use of his name to the newly chartered Hudson’s Bay Company, which fastened it upon its claimed fur-trading domain to proclaim sovereignty over ‘Rupert’s Land.’ Three centuries later, Canadians awoke to the problem that aboriginal people echoed John Blake Dillon in their insistence that 1670 was not a ‘long time’ ago, and that the rights of native people retained priority even though they had been ignored for three centuries. In short, because time operates in a continuum, it leaves vague the zone of ethical transition. How should historians respond to the fact that hard-headed Palmerstonian Liberals had few problems in abandoning Denmark to Prussian militarism in 1864, while just fifty years later Asquithian Liberals were tormented by considerations of national honour in defence of Belgium?
The hollow ring that attaches to Chamberlain’s claim that he had secured ‘peace with honour’ at Munich is largely absent from the association of the same phrase with the equally cynical carving up of the Balkans by Disraeli at the Berlin conference sixty years earlier. At the very least, historians have a responsibility to report that the people of the past were often driven by moral imperatives of their own, even when we may regard them as busybodies and bigots. It is possible to analyse the Second World War largely as a traditional conflict among the great powers, taking the view that ‘there was nothing wrong with Hitler except that he was a German,’ but it is perverse to ignore the obvious fact that moral outrage against Nazi ideas, and Nazi actions, played an important role in the outcome. Even
so, it is impossible to understand Quebec in the nineteen-forties without appreciating that there were intelligent people who refused to see any moral difference between the British Empire and Hitler’s Reich. One Quebec intellectual subsequently sought to excuse his ethical obtuseness with the lame explanation that he ‘had been taught to keep away from imperialistic wars’ and that ‘he scarcely paid any attention to the news of the world conflict at that time.’56 Thus, the first step towards locating events in not only the sweep of time but the cosmos of values is the need to recognize the ethical dimensions that motivated the decisions of people in the past. The necessary corollary is that scholars must be more open about their own attitude to those values. It is profoundly wrong for historians to pretend that they have no opinions about the conduct of those they study, when they are silently imposing the fundamental beliefs of Disraeli’s sensible men. Rosalind Mitchison was no doubt insensitive in her attitude to ex-nuptial birth, but there is something refreshing in her trenchant description of a sixteenth-century Earl of Orkney as ‘a bastard in every sense of the word.’57 If we are to locate ourselves in time, we must openly confront the way in which our own values may distort our interpretation of the past. Sometime around 1960, A.J.P. Taylor suddenly realized that the experience of the Second World War was passing from ‘today’ to ‘yesterday,’ moving out of the current orbit and into ‘history.’58 Yet there can never be a precise cut-off date, for all our yesterdays continue to crowd upon us. The past continuously overlaps, not least because every society is a coalition of generations: even as the War of 1939 was making that tacit transition into history, there were still people who could not talk about its forerunner of 1914 without weeping at the memory of fallen soldiers who lived on in their inward imagination.
We think of earlier centuries as being distant from us simply because we have chosen to regard them as remote. This leads us into a curious paradox in our assessment of our own times. Some part of our mental furniture possesses a special validity on grounds of antiquity, an inheritance of unbroken continuity from the past, even if in the case of democracy or the welfare state such an assumption makes a travesty of chronology. Other aspects of our world are morally superior precisely because they seem to have turned their backs on their past: so commonplace was it to reassure ourselves that ‘we are, after all, living in the twentieth century’ that we were obliged to invent an overblown anniversary and call it the Millennium when the calendar ran out on the nineteen-hundreds. The motto of the province of Quebec, ‘Je me souviens,’ fully translates not simply as ‘I remember’ but as ‘I am determined not to forget.’ We would dupe ourselves if we assumed that merely because most of us pretend that we have managed to forget the past, we are indeed emancipated from it. We cannot simply shelter behind the undefined notion of a ‘long time in history’ to relegate the past to entertainment and thereby distance it from ourselves. To locate ourselves in time we must begin by recognizing that in many respects we are very close to the past, a proximity that imposes upon historians a far more sophisticated attitude to ethical dialogue than is customarily recognized. Only then can we proceed to identify the relationship of past events to each other by exploring perhaps the most basic concept of all, that of significance.

7 Significance

One day in June 1944, the Allies invaded Normandy and began to drive the Nazi armies out of France. One day in June 2001, I ate breakfast and started to mark examination papers. The two sentences are similar in syntax, but even someone who takes breakfast seriously must recognize that the first statement describes a more noteworthy historical event than the second. If we are to discern a structure within the sweep of time, to identify landmarks that can relate past events and trends to each other and to our fleeting present, we require some approach and terminology that will highlight individual frames within the cartoon strip of historical narrative. Thus did the Scottish historian Jenny Wormald seek to underline the import of the accession of James VI of Scotland to the English throne, which he inherited on the death of his childless cousin, Elizabeth I. Wormald stressed that in 1603 ‘something happened that is so well known that its startling and dramatic nature is forgotten.’ Two countries that had been ‘actively or passively hostile since the late thirteenth century’ were brought into dynastic alliance simply because the Virgin Queen had died without producing children. Year by year during three antecedent centuries, no Scottish sovereign had inherited the throne of England and no English monarch had managed to swallow the northern neighbour, although there had been times when dynastic convergence or political merger had seemed possible.1 ‘The moving finger writes; and, having writ, moves on,’ sang the poet Fitzgerald in Omar Khayyam. The nimble fingers of the historian should at least pause at the word processor and hit the bold command to stress that an event of unusual importance happened in 1603. It is only right to acknowledge at this point that there is a powerful philosophical argument against viewing history in this way.
Oakeshott argues that it is meaningless to claim that any single episode ‘changed the course of events,’ since that particular episode was itself part of a course of events that it is alleged to have altered by the happenstance of its supposed intervention. Oakeshott’s formulation can be reconciled with an approach to history that attempts to relate the importance of events to each other rather than one that seeks ‘A caused B’ explanations. However, not only does it lead Oakeshott into his impossible demand for a narrative reconstruction of all events antecedent to an episode (whatever ‘antecedent’ may mean), but its philosophical
purity flies in the face of common sense. (We shall encounter this problem again shortly, in the thinking of a Canadian scholar, William M. Baker.) Compare two events that took place in the same year and formed part of the same conflict. On 11 May 1941, a Victorian church in the London suburb of Gidea Park was destroyed by a German bomb. On 11 December 1941, Hitler’s declaration of war on the United States brought the full might of America upon the Nazis. Both events are equally valid as historical facts, but the second was obviously of far greater importance than the first. E.H. Carr’s view that a past episode only rises to the dignity of a historical ‘fact’ when it has been mentioned in print by at least two professional practitioners may be dismissed as a piece of foolishness. All Saints Church, Gidea Park, was bombed just as certainly as Hitler invaded Russia, even though its fate will probably never be mentioned in the Ford Lectures at Oxford. Even if scholars were omniscient and all-wise in their selection of material, it would make about as much sense to put them in charge of the definition of historical facts as it would be to allow murderers to determine the admissibility of courtroom evidence. Oakeshott himself warned against ‘a belief that facts are and remain independent of the theory which is said to connect them.’ Thus we arrive, albeit by a different route, at the obiter dictum that made the Manchester Guardian a great newspaper: ‘Comment is free, but facts are sacred.’2 If historians are to abandon the knowing pretence of explanation for the more subtle technique of locating events in time, they will require terminology capable of underlining those episodes that constituted the essential scaffolding, in order to make sense of the past. There are several candidates.
One phrase that helps to convey that sense of structure is the ‘turning point.’ Its weakness lies in the problem that past landmarks straddle obstinate substrata of continuity, as seen in Trevelyan’s celebrated description of the revolutionary year, 1848, as a turning point at which history failed to turn. Solemn intellects may be tempted to talk instead of the underlying ‘meaning’ of events. The difficulty here is that the concept is simultaneously too profound and too superficial. At one extreme, philosophers argue over its interpretation: C.K. Ogden and I.A. Richards even wrote a book called The Meaning of Meaning. At the other, its derivative ‘meaningful’ is so pretentiously vague that it is almost devoid of content. ‘The meaning doesn’t matter,’ sang W.S. Gilbert’s fleshly poet, Reginald Bunthorne, ‘if it’s only idle chatter of a transcendental kind.’ Most academics will reject ‘relevance’ with shuddering memories of the excesses of the nineteen-sixties, when the past was to be studied not in its own terms but to feed the enlightened prejudices of the present. In locating ourselves in time, in measuring our own closeness to the past, we do indeed need to identify those events that have particularly shaped our own world. Yet ‘relevance’ is a blind alley, a present-centred approach that naturally privileges the near past, relegating the whole of the Middle Ages to a story-so-far prologue that has little to say to our own times. Furthermore, by focusing on the tributary status of the past to the present, ‘relevance’ entirely misses the crucial element of the relationship of past events to each other. Another superficially tempting process for testing historical landmarks is the counter-factual approach: for instance, exploring what might have happened if James VI of Scotland had not become James I of England in 1603. The counter-factual approach has a long and respectable pedigree. ‘Historians are sometimes ridiculed for indulging in conjectures about what would have followed in history if some event had fallen out differently,’ remarked J.R. Seeley in his Cambridge lectures. His reply was common sense. ‘To form any opinion or estimate of a great national policy is impossible so long as you refuse even to imagine any other policy pursued.’ Even so traditional a scholar as G.M. Trevelyan once whimsically imagined what would have happened had Napoleon won the battle of Waterloo. More recently, Niall Ferguson has pointed out that the counter-factual is part of the human learning process: we profit from our mistakes by analysing how we might have done better had we acted more wisely.
Ferguson argues that ‘decisions about the future are – usually – based on weighing up the potential consequences of alternative courses of action ... [I]t makes sense to compare the actual outcomes of what we did in the past with the conceivable outcomes of what we might have done.’3 We may doubt the assumption that decisions are actually made in such a rational manner, but they are indeed located between a partially known past and a range of perceived futures. If history is indeed a collage of individual decisions, surely this intensely human device, ‘what if?’, can be extended into a generalized analysis of the past.

The principal shortcoming of the counter-factual approach lies in the unpredictability of past futures, as Keppel-Jones unwittingly proved in his fantasy history of South Africa after Smuts. A perceptive historian, J.J. Lee, warns that ‘the “if only” school of history’ suffers from ‘an abiding temptation ... to assume that the alternative to what really happened would have been an ideal situation.’ If Hitler had been assassinated in June 1944, the war in Europe would have ended on terms much less favourable to Stalin, although even that hypothesis must be regarded as probable rather than certain. Beyond lay uncertainty and perhaps disaster. Lee points out that victory in Europe could well have encouraged the Americans to attempt an invasion of Japan. Perhaps such a strategy might have spared Hiroshima and Nagasaki, but perhaps, too, without their terrible example, the subsequent Cold War would have lapsed into full-scale nuclear conflict. The killing of Hitler might have prevented actual horrors, but only at the price of opening the way to abominations which we can barely imagine. (One of the many factors that inhibited conspirators inside the Third Reich was the fear that by killing Hitler they might simply reinforce the ‘stab in the back’ legend that had led many Germans to attribute the defeat of 1918 to internal treachery, so leaving the way open for another dictator and a third European war.)4 Thus, one major weakness of the counter-factual approach is that it is reassuringly selective in its choice of alternative scenarios. But it suffers from a deeper, theoretical weakness: it simply gets the whole issue the wrong way round. To take stock of what happened in 1603, we should certainly note the likely negative implications for Anglo-Scots relations had James not united the two Crowns, just as we may permit ourselves to note the opportunities created by the earlier elimination of the Führer.
Still, this should not be an invitation to fantasize about what might have come about, but a device to help us reinforce the significance of what actually did happen. ‘Significance,’ remarks the prolific historian D.H. Akenson, is ‘a word employed by historians only at their own peril.’5 In fact, his categorization of the various ways in which it is employed suggests that it is the word itself that is in danger from random usage. For intellectual historians, a significant thinker is somebody highly intelligent, and so to be insignificant is to be stupid. Quantitative historians treat significance
as a statistical concept: thus, Japanese textile exports in 1913 were ‘insignificant’ because they amounted to only one-fiftieth of the British total. For social historians a significant fact is one that symbolically captures a wider phenomenon. It was not because Grace Morris made any sensational contribution to the shaping of modern Canada that a textbook describes her grief at the deaths of her brother and her fiancé in the First World War. It is rather that her bereavement was replicated by tens of thousands of other women who coped with tragedy as their menfolk perished.6 ‘A significant fact for most historians is one which helps them to make a case for their explanation and to communicate its nature to the reader,’ remarks Fischer. The danger here is similar to the trap inherent in determining which are the antecedent events that account for a historical episode: the identification of significance may proceed backwards from the adoption of the causal hypothesis that has already determined some events to be more important than others. Fischer seems to be aware of the pitfalls. ‘Significance,’ he remarks, is ‘the sort of word which sends shudders down the spine of every working historian.’7 There is evidence enough that ‘significance’ is a word that is very loaded but poorly targeted. Some historians are oddly tongue-tied in their use of the term, while others become absurdly bombastic. Assessing the founding of a nationalist party as a contribution to Welsh politics in the inter-war period, K.O. Morgan oddly remarks that ‘Plaid Cymru’s immediate significance lay rather in the future.’ For A.J.P.
Taylor, Sir Hugh Dowding’s placing of his pencil on the cabinet table was ‘a warning of immeasurable significance,’ to which we must surely add ‘of a transcendental kind,’ since we cannot be sure that the incident actually happened.8 Shakespeare warned against such trivialization when he dismissed so much human life as ‘a tale told by an idiot, full of sound and fury, signifying nothing.’ The Bard’s usage takes us back into the realms of ‘wherefore?’, seeking cosmic profundity in the form of a sign or portent. For biblical scholars, the word ‘significance’ retained this sense at least until recent times. When the easily forgotten E.H. Horne lectured on The Significance of Air War in 1937, he was only incidentally concerned with the argument that the bomber would always get through. What mattered to Horne was that war in the air fulfilled the prophecy of the seventh vial in the sixteenth
chapter of the Book of Revelation, and so had to be taken as a sign of the imminence of the Second Coming.9 By imposing our utterly neutral but indisputably superior modern values, we may comfortably brush Horne aside as intellectually insignificant. It is harder to dismiss Canada’s longest-serving prime minister, William Lyon Mackenzie King, as merely stupid. Yet throughout the long decades of his political ascendancy, King took special note in his private diaries of such matters as the way in which the hands of the clock formed a straight line at the very moment of major cabinet decisions, or other coincidences attending the important events he recorded in his diary. ‘What significance he attached to the occurrences is difficult to determine,’ concluded one puzzled biographer, since King never spelled out their import, although he seemed to regard such incidents as evidence that ‘some presence was making itself known to me.’ Indeed, he hardly ever wrote of ‘significance’ at all. There is a rare but revealing example in his diary account of receiving George VI in his hotel bedroom after falling ill during a visit to London in 1948. On being told that his sovereign was on his way from Buckingham Palace, Mackenzie King opened at random a book of poems, ‘thinking that I would find something that would be significant.’ In poem number 95 on page 68, his eye fell upon two phrases: ‘Mother and Father’ and ‘come from the palace.’ Since he had been (as usual) thinking of his mother, King found this conjunction ‘quite remarkable.’ Furthermore, poem number 95 on page 68 confirmed this curious sign. ‘Here is another strange thing: 9 and 5 added together equals 14. 6 and 8 added together equals 14. Four 7’s in all.’ It is alarming to find that on that same ‘eventful and memorable’ day, this same Canadian statesman received a confidential briefing about Russia’s capacity to manufacture an atomic bomb.
However, when he retired to bed at midnight, noting that the hands of the clock were ‘exactly together’ (which, at twelve o’clock, is hardly surprising), King’s abiding impression of his day was ‘how often the hands of the clock have been together when I have looked at it.’ King’s confidential staff were well aware of their master’s foibles. Among themselves, they required a code that could take account of his rambles in the netherworld without letting slip to the Canadian public that their leader sometimes departed from the strictly rational. In the days before
the word processor, ministerial offices processed their correspondence through typing pools, and overworked stenographers sometimes made minor errors. One day, a draft letter inadvertently added an extra letter to the word ‘significant,’ making it read ‘siginificant.’ The mistake gave his staff the code word they needed to warn each other when King was in transcendental mood.10 Historians need to make a distinction between major events and underlying trends that are genuinely significant and the detritus of assertive pencils and straight lines on clock faces that are, at most, siginificant, if not altogether devoid of importance. Surprisingly few seem to have made the attempt. Rather, historical research has headed, ostrich-like, into ever smaller and more specific pockets of ultra-specialization, by practitioners who seem to take perverse pride in refusing even to admit the possible existence of a big picture. A commendable exception was William M. Baker who, in the nineteen-eighties, found himself wondering just why he was so immersed in studying the Lethbridge coal-miners’ strike of 1906. Although Baker was a distinguished faculty member at the University of Lethbridge, somehow he felt that it was inadequate to account for his enthusiasm in terms of ‘Well, he would, wouldn’t he?’ Various overarching reasons presented themselves in justification of the project. It was one of the longest strikes in prairie history, a precursor (whatever, in terms of relevant antecedence, that might mean) of the Winnipeg General Strike of 1919. It was a landmark in the development of class confrontation in Alberta, a point which excited those who saw such conflict as the motor force of history. The intransigence of the participants – or so it has been claimed – was the ‘catalyst’ for the first important Dominion legislation in the field of industrial relations.
Lethbridge even played a part in the rise of Mackenzie King, who used the strike to press for the creation of machinery for arbitration. But although King, in the words of two of his biographers, ‘was not unaware of the significance of what was occurring,’ Baker felt that he needed something more to justify his own involvement as a historian. Baker found the trigger he needed to set off his own thought processes in the analogy of the human ear. The ear is part of the body, but it is composed of component parts itself. ‘Anything is a sub-set of at least one other thing and anything has its own sub-set(s).’ Within this
biological continuum, there is ‘no hierarchy of intrinsic importance.’ Therefore, ‘what one focuses upon is not by definition more or less significant or important but simply a vantage point ... Significance lies in the relationship of an item to other aspects of the continuum or totality.’ The bad news was that the ‘totality of relationships (i.e. significance) can never be completely described or explained.’ Much of this can be embraced wholeheartedly as congruent with the argument of these pages: explanation can never be total and its incompleteness may even be misleading; the true task of the historian is to identify the relationship between past episodes, to locate them in time. Unfortunately, in one important element, the Baker package flies in the face of common sense. It may be that, in theological terms, dung beetles and bureaucrats are equal in the sight of the Almighty, but however intrinsic the integrity of their individual existences, events do not all punch the same weight. Hitler’s destruction of a church in Gidea Park cannot be claimed to rank equal with his declaration of war upon the United States. Today the world; tomorrow Lethbridge – the strategy encourages an Alberta community to see its own past in a wider context. ‘The perception of broadened horizons is another way of saying that “significance” lies in the demonstration of relationships.’ True enough, but that demonstration will surely reveal that some of those relationships are more significant than others. My breakfast, even though a statistically repeated daily event, cannot command the same historical importance as the one-off invasion of Normandy.11 Significance is approached here through three discussions. The first relates to underlying elements whose importance is fundamental but whose presence as part of the historical landscape is so basic that they may be easily overlooked.
In this way, a role may be identified for those 'factors' that Elton wished to consign to the management of Scottish estates.12 The second focuses on negative significance, the silent importance of what did not happen – provided always that we are not indulging in the fantastical counter-factual but can reasonably contend that something else might very well, perhaps even more logically ought to, have taken place. The third unites these two to identify examples of long-term underlying developments, those belonging to Braudel's concept of social time, that often slip through the mesh of standard periodized historical 'explanation.' In a tailpiece, all three approaches are brought together in a discussion of the Second World War, an example chosen not to launch a firestorm of archival research, but rather because the outline of events is widely known, and the war itself is assumed to have shaped the following half-century. If history is to be placed where it belongs, at the centre of intellectual debate, then techniques of assessment are required that enable the non-specialist as well as the expert to measure the significance of past events. Individual landmark events acquire their significance, establish their quasi-biblical status as portents, because they are seen in retrospect to conform with the underlying historical geology: thus, the concept of significance is related to the conundrum of a long time in history. Goldwin Smith blithely assumed that the accession of Scotland's James VI to the English throne had 'the great forces on its side,' and so the union of the Crowns proved to be enduring. Perhaps this was true, but it was a truth that did not become evident for at least a century. The Scots refused to follow England and become a republic in 1649, and it was no foregone conclusion that they would march in step with their southern neighbours in accepting William III in 1689 or the Hanoverian succession after 1702.
Even in 1706, the Treaty of Union was accepted by the Edinburgh Parliament not because it was backed by the 'great forces' but because it secured the support of the Squadrone Volante, a splinter group of greedy Scottish politicians whose horizons took little account of the tides of history.13 All this may merely prove Smith's own saving clause that secondary forces can prove remarkably persistent obstructions, but there was enough uncertainty in Anglo-Scots relations in the century after James VI became king of England in 1603 to place us on our guard against isolating that 'startling and dramatic' event from the longer-term context necessary to assess and confirm its significance. Obviously, if one hundred years was required to measure the import of an event that occurred in 1603, there are considerable problems for the identification of significance in times closer to our own. The Chinese Communist who claimed that it was too early to assess the importance of the French Revolution of 1789 was perhaps unduly cautious, but it is impossible to be sure of the significance of most episodes in the twentieth century, simply because we do not know whether we have arrived or are still travelling. Worse still, an underlying assumption that a significant event must also be a dramatic episode carries the risk that we may fail to see the wood for the trees. Linda Colley draws attention to the 'absolute centrality of Protestantism' in eighteenth-century Britain as something 'so obvious that it has proved easy to pass over.' Historians, she complains, dislike stating the obvious and 'have preferred to concentrate on the divisions that existed within the Protestant community.'14 This goes a little beyond the traditional division of historians into rival camps of splitters and groupers to focus on the underlying strata that we take for granted. Historians of Canada accept that the four provinces of 1867 grew to the ten of 1949, but rarely pause to consider the implications of the resulting jumble of uneven area and population. Over a century ago, Sir John A. Macdonald was troubled by the emergence of two mega-provinces, and the dominance of Ontario and Quebec has elevated the notion of 'asymmetry' into a constitutional theory. It was no mere accident that Canada's provincial structure did not evolve into half a dozen regional blocs: on more than one occasion, human decision has blocked the option of Maritime Union, and it was by no means predestined that two provinces rather than one would be carved out of the prairies in 1905. Equally, the example of New Zealand, where provinces subdivided and quickly collapsed, points to the significance of a Canada made up of ten provinces rather than twenty. In central Canada, the arbitrary boundaries of 1791 proved to be remarkably durable, so much so that one of the most logical of all 'Laurentian' developments, the creation of a riverine province spanning Montreal and Kingston, vanished altogether after making a wraithlike appearance in Alexander Galt's federation resolutions of 1858.
One of the most notable contrasts in the politics of western Canada for much of the twentieth century was to be found in the divergence between left-wing Saskatchewan and right-wing Alberta. Since in each case their predominant ideologies seem to be rooted in the nature of local society, it is easy to take for granted that their creation as two distinct provinces in 1905 represented some natural and inevitable outcome. This was not so. Until the last minute, several alternative possibilities were actively canvassed. What became Alberta and Saskatchewan might indeed have split into two provinces, but horizontally to reflect patterns of agriculture and rival transcontinental railway routes. They might have been incorporated into a Greater Manitoba, or entered Confederation as a single province, a sort of 'Alchewan.'15 Sometimes, restating the obvious – that Montreal never became a provincial capital and Canada did not acquire a Saskberta – may open the way to some fruitful reinterpretation. Historians are remarkably indifferent to two very basic reservoirs of underlying significance – population and mortality. It is one thing to reject the Viking Syndrome, with its pompous invocation of 'demography' as an explain-all device to solve historical problems. It is another to overlook the factor of population altogether. Surprisingly few histories of a country or region bother to start with a description of the people who lived there: how many of them? Were they urban or rural? Were they divided by region or language or religion? Somehow this seems to be almanac stuff, beneath the dignity of the creative and analytical scholar. A major history of nineteenth-century Japan lets slip for the first time on page 571 that the islands contained about thirty million people.16 A useful history of the Pacific mentions on page 153 that the population of Tahiti fell from over 50,000 in the mid-eighteenth century to perhaps 8000 by 1800.17 Just as the measurement of time requires the subjective assessment of whether it is 'long' or 'short,' so assessment of the significance of population must involve deciding whether it is 'large' or 'small.' It becomes much easier to grasp how Europeans came to take control of Tahiti, which surely falls into the second category, but were kept at bay in Japan, which belongs in the first. The comparison between Japan and Tahiti in turn throws into relief that startling and dramatic aspect of the European imperial experience, one which so inspired and perplexed the Victorians, the British empire in India.
Here population provides a clue of a different kind, not simply to the origins of the Raj, but also to the way in which the British exercised sway and to the kind of regime which succeeded them. Both the Mughal and the Ottoman empires created highly bureaucratic systems of government. Europeans took control of the first but not the second, even though India contained five times as many people as the Turkish empire. A Cambridge historian of India, C.A. Bayly, suggests that this may be connected to the fact that Istanbul, a city of 600,000 people, could exercise firmer control over its 30,000,000 subjects than was possible from Delhi, whose 20,000 people were lost in an empire of 150,000,000.18 To focus exclusively on population is of course to come in half way through the story: administrative devolution helped explain why Delhi was so much smaller, while north India's recurrent droughts made it difficult for any capital city to maintain a bureaucracy large enough to centralize power. 'How was it possible that 760 British members of the ruling Indian Civil Service could as late as 1939, in the face of the massive force of the Indian national movement led by Gandhi[,] "hold down" 378 million Indians?' asks Anthony Low, a historian of the British empire with a talent for posing awkward questions. The answer lies more in the good fortune of the British in taking over at the top level of an existing machine of government than in their superior manipulation of the stiff upper lip. Of course, this does not tell us why the British proved more effective at operating the Mughal machinery of government than the Mughals themselves. However, it does render less surprising the fact that the illusion of control could not be indefinitely prolonged. It followed that when imperial hands were prised from the driving wheel by Indian nationalism in 1947, the British found that they had little scope to control the process of decolonization on the subcontinent. By contrast, John Darwin notes that, at the same time, 'the British carried through the transfer of power in Ceylon with flamboyant ease,' but his thoughtful discussion nowhere mentions that in 1947 the population of Ceylon was about eight million, one-fiftieth the size of India.19 Of course, population cannot tell the whole story.
There had been no easy flamboyance in dealing with a far smaller nationalist population of Ireland, while the fact that the British had twice conquered Burma in the sixty years before their departure reduced to minimal levels the amount of collaborative sweetness and light to be found among the eighteen million Burmese in 1947. None the less, to compare decolonization in India and Ceylon without any allusion to their respective populations is akin to wondering why the elephant is less nimble than the monkey. Population is one of the underlying but easily missed themes of Canadian history: Canada's population was small both when considered absolutely and when placed alongside its usual contextual comparators, the United States and Britain. Take, for instance, native people. Cautious estimates of the total number of aboriginal people in what became Canada before the arrival of Europeans range from 200,000 to half a million. In the absence of formal counting, there must be an element of guesswork in such figures, and Australian experience suggests that increased sensitivity to contemporary race relations tends to push up assessments of indigenous populations in earlier times. None the less, it seems reasonable to impose the subjective tag 'small' in relation to a population no larger than late-twentieth-century Calgary spread across half a continent.20 The exercise is, of course, artificial and, in a sense, even absurd, since there was no such entity as a transcontinental 'Canada' at the time of first contact with Europeans. Even the concept of 'contact' is arbitrary, since germs probably moved faster than explorers and missionaries. None the less, there is a sharp contrast with the colony of Natal, which has the area of New Brunswick, where in 1910 there were over a million Africans. Comparing total indigenous populations in this way can be misleading, since grouping all native people into a single category represents the imposition of a European perspective. Most Africans in Natal were Zulu but, measured by their own classifications, Canada's diverse First Nations were often very small indeed. The two greatest tragedies of the European impact upon aboriginal Canadians were the destruction of the Huron in the seventeenth century and the extinction of the Beothuk by the early nineteenth. Recent estimates place the population of the Huron at around eighteen thousand when they first encountered the external world, and the Beothuk at perhaps two thousand.
When the naturalist Joseph Banks visited Newfoundland in 1766, he was assured that the Beothuk in the interior numbered about five hundred although, as he sensibly noted, since no European ever saw them it was difficult to grasp the basis of the calculation.21 By any standards, these were small populations, and numbers were not markedly larger further west. The entire native population of the Plains in 1809 has recently been calculated at 43,000, and one sympathetic historian doubts whether it had ever exceeded 50,000.22 Of course, to categorize aboriginal populations as 'small' is not to claim that they possessed no rights to the soil on which they lived, and still less to imply that it was justifiable to sweep them aside. Far too often, indigenous people were treated with injustice that was all the more marked when we consider how cheaply generosity could have been shown to so few. Yet if we seek to separate ourselves from the burden of ethical censure, turning off the tap of morality as Butterfield cautioned us, it is not difficult to see how Canada's native peoples so rapidly became numerical and cultural minorities. It is not only Canada's aboriginal peoples whose experience has been shaped by smallness of population. The Red River Metis numbered about 6000 when they turned to the twenty-five-year-old Louis Riel for leadership in 1869, largely because he was the only member of their community with experience of the puzzling outer world that was Canada. Riel's leadership eventually proved disastrous to his people, but he was probably the best they could find. Similarly, only a very small pond containing some very strange fish could have thrown into prominence the pioneer British Columbia politician William A. Smith, especially after he had changed his prosaic name to the trilingual Amor de Cosmos. The white population of the province had reached 18,000 by 1881, not long after Lord Dufferin had despairingly concluded that 'the truth is British Columbia is hardly a large enough Community to have as yet developed a conscience.'23 Untrammelled by any such inconvenience, the early settlers decided that the easiest way to deal with the issue of native title was to ignore it altogether. British Columbia is still plagued by the amorality of the de Cosmos era. Recognition of the smallness of the Canadian population is a step towards underlining the huge significance of at least two aspects of the country's history which, to adapt Colley's phraseology, are so obvious that we may easily pass over them. The first is the survival of Quebec as a French-speaking society.
In the litany of significant small populations that characterize the landmarks of Canadian history, the plus or minus sixty thousand remnants of the French regime swept into the British Empire in 1759–63 must stand out. That they did not follow other North American linguistic minorities, such as the New York Dutch, into anglicized oblivion remains one of the core facts of Canadian history. At six million, the modern francophone population seems secure enough, but intermediate figures along the way serve to underline the consistent fragility of the French fact. There were fewer than half a million of them when Lower Canada made its desperate bid for independence in 1837, around a million at the time of Confederation, two million by the nineteen-twenties. Until the eighteen-eighties, more people spoke Welsh in Wales than spoke French in Canada. When coupled with the hostility, on the one side, and the doubts, on the other, that have surrounded the past futures of Canadian French, its survival undoubtedly merits recognition as an achievement of remarkable historical significance. This, however, is not to accord miraculous status to the existence of Quebec. Rather, it should point to another area of significance that is easily overlooked. It is a commonplace that Quebec was cut off politically from France after 1763. This should not obscure the enormous cultural underpinning provided by a metropolitan French population that rose from thirty million at the time of Waterloo to forty million a century later. Even in the nineteen-nineties, as separatists urged their claim to full nationhood, the Paris region contained three million more francophones than the whole of Quebec. Unlike Wales, Quebec could import the books and the films needed for comprehensive cultural survival. This, in turn, points to another aspect of significance in the French identity of Quebec. The Dutch settlers of South Africa shook off the far weaker cultural influences of their small European motherland and elevated the taal, the Boer equivalent of joual, into a distinct language. If it is a fundamental fact that Quebec resisted the destiny of anglicization, it is also significant that Canadian French did not evolve into a North American equivalent of Afrikaans. The gens du pays remained mainstream francophones thanks to the massive external presence of the enfants de la patrie. The second aspect of the country's history which seems disproportionate to its small population is Canada's enormous contribution in manpower to the two world wars, neither of which was fought anywhere near Canadian soil.
As a simple demographic sum, it remains staggering that Sir Robert Borden could have committed Canada in 1916 to maintaining an overseas fighting army of 500,000 from a country whose total population of men, women, and children did not exceed ten million. Politicians may get their targets wrong, but even more striking is the fact that 450,000 Canadians had voluntarily enlisted by the end of 1917. The extent of Canadian commitment becomes yet more startling when it is remembered that around one-third of the country's population was caught in a conflict of loyalties, either to Quebec or to recent family origins in enemy countries. Canada's role in the First World War is a story full of sound and fury that signifies a great deal. It confirms that the centre of gravity of Canadian identity for the majority of its population lay outside the country itself. Those 450,000 personal commitments to fight constitute an awe-inspiring reminder that Canada's history is a collage of individual decisions – and, as discussed in chapter 4, even in the cases of men as articulate as Arthur Lower and Lester Pearson, the precise reasons for enlistment could not always be put into words. None the less, if the Canadian government's target of half a million soldiers was unrealistic, then its 90 per cent achievement by the Canadian people remains little short of sensational. Canada's Second World War is usually seen as less divisive since the conflict relied more on technology than upon manpower. Hence, from 1941, Ottawa's role was largely to act as a hyphen in the Atlantic alliance. After all, Britain had five times the population of Canada and the United States contained almost three times as many people again. All the more startling, then, that when the Allies invaded Normandy in June 1944, one out of the five D-Day beaches was assaulted by the Canadians, leaving two apiece to the United States and Britain. Canada's fighting role in the Second World War only comes into full focus when the significance of its far smaller population is underlined. Indeed, a missing thread in the evaluation of the Canadian experience is the failure of historians to ram home the simple statistical point that for much of the country's history, there have been ten times as many people living to the south of the border as to the north. In each of the first four decades after Confederation, the increase in the population of the United States alone was twice the total number of Canadians.
In the North American league, Canada has had to run fast in order to stand still. The 130 million Americans of 1940 had doubled fifty years later: in demographic terms, the United States had managed to squeeze four and a half Canadas within its borders in just half a century. Living so close to the cultural suffocation of American influences, Canadians are tempted to denigrate their own national identity as inadequate and insecure. In the light of the staggering statistics of American population growth, it makes more sense to invert the assumption. The real significance of the Canadian collective identity lies in the strength that has enabled it to survive – and even to flourish. For it is not only in comparison with the United States that Canada has been outgunned in sheer numbers. The United Kingdom could be squeezed into the area of Labrador and still leave enough room for most of Nova Scotia, yet even at the opening of the twenty-first century, Britain contained twice the population of the whole of Canada. It is not surprising that so many Canadians saw their country for so long as an outstation of a larger imperial community, nor that some of the ablest and most ambitious were lured to careers back 'home.' The British orientation was reinforced by surges of immigration which had massive effects on the small numbers of the receiving community. Large-scale immigration from Britain contributed to the explosion of settlement that formed the background to the collage of individual decisions that brought about the rebellions of 1837. Another surge played a large part in creating the regional imbalance that culminated in Confederation thirty years later. The torrential but rarely noticed influx of about a million people in the half-dozen years prior to 1914 provided an important underpinning of Canada's rush to the colours in the First World War: when we study the origins of a Canadian sense of cultural nationalism, we should remember that even two of the founder members of that definingly nationalist group of painters, the Group of Seven, were barely off the boat. Thus, not only do historians study periods of time without attempting any subjective evaluation of duration, but they describe communities of people without bothering to enumerate them by size. Perhaps even more surprising, given that most of the people we study are dead, historians devote little attention to that most unavoidable aspect of the human experience, mortality.
Just as many actors have played James Bond but there remains only one 007, so monarchs and prime ministers are allowed to come and go like characters in a cartoon strip. Historians may take account of changes in personality and concomitant shifts in policy, but of the significance of the fact that Edward VI died at the age of sixteen while Elizabeth I survived to sixty-nine, we hear virtually nothing. The king is dead; long live the king. In the less squeamish past, greater significance was accorded to individual mortality. Since by far the cleverest of Charles I's advisers was the Earl of Strafford, the king's opponents were anxious to find the best means of removing him from the scene. In fact, there was only one guaranteed way to be permanently rid of the indispensable Strafford: as the Earl of Essex succinctly remarked, 'Stone dead hath no fellow.'24 After meekly agreeing to the execution of his minister in 1641, Charles spent the rest of the sixteen-forties assuming that however inept his leadership, ultimately he would prove to be necessary to the political structure simply because he was the king. In a sense, the Restoration of 1660 proved that England could not manage without its monarchy. Unluckily for Charles, his enemies had sought to simplify matters by managing without that particular monarch, whom they consigned to the block in 1649. If death is one person's finality, it can also be another's opportunity. Thomas Howard, Duke of Norfolk, was awaiting execution for treason in the Tower of London on the January day in 1547 when Henry VIII's death unexpectedly secured his reprieve. Since the new king, Edward VI, was only ten years old, there were good reasons for hurrying on his coronation to discourage any challengers. By great good fortune, Norfolk was the hereditary official charged with organizing the ceremony, and the simplest procedural solution was to let him go free. Norfolk's son was not available to take over. Less lucky than his father, his turn to go to the block had come a few days earlier. In 1822, George Canning was on the point of leaving for a new career in India when the suicide of his political rival, Castlereagh, suddenly opened the way for him to become Foreign Secretary. Clusters of deaths can also have significance. Canning himself died shortly after succeeding Lord Liverpool as prime minister in 1827.
His faction, the Liberal Tories, rallied around Lord Goderich, who scrabbled hold of power for just long enough to inscribe his name on the map of Ontario before it became clear that leadership was not his forte. The Liberal Tories now looked to the heavily-built William Huskisson, but in 1830 he somehow symbolized the indecision of centrist politics when he failed to get out of the way of an oncoming steam engine during the opening ceremony of England's first major railway line. After losing four leaders, three of them terminally, in three years, the surviving Liberal Tories were left with little alternative but to collapse into fusion with the opposition Whigs, so opening the way to the Reform Act of 1832. In a later generation, it was the deaths of two of his contemporaries and political allies, Sidney Herbert and the Duke of Newcastle, both in their early fifties, that helped to 'unmuzzle' Gladstone as an advanced reformer after 1864. It is only the modern Western world that has taken for granted that most people will live to a ripe old age. Noting a report that a minor author had died of overwork, an English newspaper commented in 1849 that 'it hardly seems necessary to accuse that or anything else of killing an aged man of 74.' The free-trade campaigner Richard Cobden was just thirty-four when he wrote in 1837 that 'we are not made for rivalling Methuselah, and if we can by care stave off the grim enemy for twenty years longer, we shall do more than nature intended for us.' ('And all the days of Methuselah were nine hundred and sixty nine years,' says the book of Genesis, 'and he died.' No wonder.) 'At my age, death is always near,' wrote a gloomy Susanna Moodie in 1856: she was fifty-two at the time. By contrast, a century later, when the British Labour leader Hugh Gaitskell died (admittedly of a rare virus) in 1963 at the age of fifty-six, there were persistent rumours that he had been murdered by the Russians. Few Victorians survived long enough to be put out to grass: indeed, jobs in the civil service and the world of education were usually for life. The Ottawa civil servant Edmund Meredith was able to retire, by coincidence on his sixty-first birthday in 1878, largely because the outgoing Liberal government needed a vacancy for a last-minute patronage appointment. As a 'Superannuated Man,' he felt 'somewhat uncomfortable' as he walked the streets in perfect health. It was still an unusual event when, after forty years of energetically flagellating his pupils at an English public school, the sadistic headmaster H.W.
Moss delighted his many victims by retiring to the Herefordshire village of Much Birch in 1908. When old-age pensions were introduced in Britain the following year, a mere 3 per cent of the population lived to the qualifying age of seventy. Consequently, modern historians too easily overlook the fact that so many of their subjects died young. Thus, Dale C. Thomson dismisses the appointment of the 'ageing and harmless' A.J. Fergusson Blair to the Canadian cabinet in 1866, presumably because Blair was to die a year later.25 Lord Monck had tried without success in 1864 to appoint Fergusson Blair as premier, which perhaps accounts for the label 'harmless.' No doubt, too, we are all constantly ageing. However, Fergusson Blair's death at the age of fifty-two was unexpected and, for John A. Macdonald, a little awkward. Macdonald sought to portray his government as a coalition, and since Fergusson Blair was one of the few Reformers willing to coalesce, his early death was of some political significance. To assess the significance of untimely deaths is to flirt with the counter-factual and, like most flirtations, this can be a dangerous business. The murder of Abraham Lincoln at the age of fifty-six, with four full years of his second term as president ahead of him, is undeniably a dramatic moment in American history. Should we go further and label it as a significant event as well? The chief speculative riddle here is whether Lincoln would have handled Reconstruction more successfully than Andrew Johnson, the vice-president who was catapulted into office by the assassination. This in turn is partly an ethical question: pace the role of sex in the British Empire, is 'success' to be defined in terms of efficiency or morality? In nineteenth-century terms, a successful policy of Reconstruction would probably have been one that persuaded Northern white Republicans to accept the rapid reintegration into the political fabric of Southern white Democrats. Johnson, who was both a Southerner and a long-time Democrat, carried unhelpful baggage, and of course he lacked the aura of successful wartime leadership that Lincoln would have mobilized. Viewed in that sense, the murder at Ford's Theater was indeed significant. If, however, we impose a late-twentieth-century definition of a successful policy of Reconstruction, one that integrated all Americans regardless of race, it is less likely that Lincoln would have scored high marks, and from that point of view his replacement by another poor white from the same log-cabin background was probably not significant at all.
Early death is not always a historical landmark: nowadays only conspiracy theorists see the murder of John F. Kennedy as anything more than a gruesome footnote to the nineteen-sixties. If Dallas was significant at all, it is probably because Kennedy's successor, another Johnson from the South, was effective in manipulating Congress into endorsing a program of social reform to a degree that could hardly have been expected from the former junior senator from Massachusetts. By contrast, Gaitskell's biographer insists that the 'political repercussions' of his death 'were of much greater significance' than any conspiracy theories about its cause. For the British Labour party, far more than for the American Democrats, the unexpected death of a relatively young leader in 1963 forced it to confront its political ideology and electoral strategy.26 Two untimely deaths in the eighteen-nineties confirm the distance between significance and mere poignancy. The Irish nationalist leader, Charles Stewart Parnell, died suddenly in 1891 at the age of forty-five. The Canadian prime minister, Sir John Thompson, was forty-nine when he suffered a fatal heart attack while lunching with Queen Victoria at Windsor Castle in 1894. Parnell towered above his party; Thompson was rapidly acquiring a political ascendancy to fill the vacuum left by the passing of Sir John A. Macdonald. Could Parnell have ended the developing split that went on to bedevil the Irish nationalist cause for over a decade? Would Thompson's leadership have enabled his Conservative party to escape electoral defeat in 1896? It is not necessary to stray too far into the counter-factual to become doubtful. Parnell's party was already angrily divided over their Chief's involvement in a divorce case. The fact that the split in Irish politics endured for so long after Parnell himself ceased to be its living focus suggests that it had much to do with deeper disagreement over political tactics. Similarly, the Canadian Conservatives came to grief because they first attempted and then failed to impose a Catholic school system on the province of Manitoba. Thompson's successor, Mackenzie Bowell, was Grand Master of the Orange Order, but even that dubious asset could not save the Conservatives from the challenge of a Protestant third party. There is no reason to assume that Thompson, an English-speaking Catholic, would have inspired greater trust either in Protestant Ontario or in French Quebec.
However, the limitation in this analysis is that, of necessity, it focuses upon the short-term events of the eighteen-nineties. In an era before youth became the required hallmark for political leadership, Parnell and Thompson – each of them a dominating figure – might have returned to the forefront. Parnell would have been sixty-eight at the time of the Irish crisis of 1914; Gladstone at the same age in 1877 was denouncing the misgovernment of Bulgaria and was destined to fight four more general elections. Thompson would have been seventy-one in 1917, when Canada split over conscription:
John A. Macdonald at the same age had just weathered the second Riel uprising, presided over the completion of the Pacific railway, and was still to lead his party to two more victories at the polls. Of course, Parnell and Thompson were not there to hold Ireland together for Home Rule and unite Canada in the cauldron of war: stone dead hath no fellow. But the British prime minister H.H. Asquith, who had known both Parnell and Gladstone, remarked that 'as long as some men die at forty-five and others live to be ninety [a slight exaggeration in Gladstone's case], political prophecy will be a fond and futile art.' To write history long after the event and fail to underline the significance of untimely death is not only futile but obtuse.27

None the less, given the intractability of death and the unknowability of alternative versions of the past, some historians may continue to relegate untimely death to the margins of counter-factual fantasy. To allow kings and prime ministers to succeed each other without comment as ripples in the pages of the textbooks, it will be argued, represents not so much mindless narrative as the recognition that history is the product of forces far greater than mere individuals. This was the view of A.L. Rowse, though he accepted that 'the whole surface pattern' of English history would have been different if Richard II had not been deposed or if Queen Anne had produced a son to succeed her on the throne. Even so, 'it is probable that the underlying story of England would have been much the same ... for that depends upon much more profound forces.'28

There is, however, one example in the surface pattern of English history where the death of a queen merits more than a passing mention. Mary I had ruled England for just five years when she died in 1558.
The ‘Bloody Mary’ of folk memory had persecuted Protestants, burning several hundred of them for heresy and forcing thousands more into exile. To copper-bottom England’s subordinate role in the European Catholic world, she had married Philip of Spain and persuaded parliament to recognize him as king of England, although only for her lifetime. Since nowadays we all emulate Disraeli’s sensible men and live in a secular, post-nationalist world, we may feel inhibited in hailing 1558 as a crucial year on England’s road to becoming a Protestant nation and an imperial power. Even so, it cannot be denied that the death of Mary and the succession of Elizabeth proved to have both medium- and long-term consequences.

Mary's death was remarkable in two respects. The first was that the very same day also saw the passing of the Archbishop of Canterbury, Cardinal Reginald Pole, the pope's strong man in England. Elizabeth, who was in any case a cautious reformer in religious matters, would have found it much harder to break with Rome had the determined Pole remained at the head of the English church. (Henry VIII's earlier attempt to take control of the English church in the fifteen-thirties had been hampered by the wonderful durability of an earlier archbishop, the octogenarian but low-profile William Warham. Pole, at fifty-eight, was a far tougher ideologue.) Second, Mary herself was just forty-two when she died. To precipitate a right-angled turn in religious policy by dying at forty-two, surely, constitutes a fact of historical significance. Sir John Neale, Elizabeth's admiring biographer, could hardly wait for his heroine to become queen. 'Mary's years were running out,' he wrote, as if she were challenging Archbishop Warham's longevity.29

In one sense, Mary was lucky to be alive at all: even if they kept their necks in one piece, the Tudors did not make old bones, and most died in childhood. Yet, given that she had survived to her middle years, Mary's death at forty-two does seem so untimely as to be historically significant. Her mother, Catherine of Aragon, lived to be fifty, despite the public heartbreak of being dumped by her husband. Henry himself managed to struggle to fifty-six in defiance of an indulgently unhealthy lifestyle. What tentative conclusions might be drawn if we were to give Mary even the eight extra counter-factual years that her wronged mother managed to notch up, let alone the additional quarter-century enjoyed by her successor, Elizabeth, who lived to be sixty-nine? Mary's ruthless persecutions could only have further weakened Protestantism.
Recent scholarship throws doubt on the picture of an England straining at the leash for religious revolution, and three of the four major rebellions in mid-Tudor times were partly inspired by loyalty to the old ways in religion. Given more time, Philip might well have built up enough support to have managed the succession to his wife. A later foreign co-sovereign, Dutch William, managed to hold on as England's king William III after the death of his wife and throne-ticket, the comfortably unsanguinary Mary II. Above all, after another eight years of Bloody Mary, there might have been no Elizabeth to take over. Even without the smallpox epidemic that almost killed her in 1562, the life expectancy of inconvenient heirs to the throne was not great in sixteenth-century England.

Of course, none of these things happened. Mary did not extirpate Protestantism. Philip did not take control. Elizabeth was not eliminated. Yet to consider the import of these possibilities is not to make play with the past, for the alternatives remain sufficiently plausible to justify the conclusion that Mary's death in 1558 was something more than a Braudelian ripple on the underlying story of English history. It was not simply Mary's death that was significant, but Mary's death at the early age of forty-two. Events become historically significant if we can reasonably assume that a markedly different outcome was plausible. It is significant that Mary I died at the age she did and not at 69, like her half-sister Elizabeth. It is of no significance that Mary died at the age of 42 rather than at 969 since, as Cobden knew all too well, the human body is not programmed to survive to the age of Methuselah. Similarly, the significance of the succession to the English Crown in 1603 lay in the fact that James VI was king of Scotland rather than ruler of France or Spain. It would be straining the notion of significance to suggest that it mattered that he was neither a Maori chief nor a Hindu raja, since neither could conceivably have been in the running to inherit the English throne. Counter-factual speculation is only useful insofar as it underlines the importance of what really did happen. To persuade us of the significance of the fact that another outcome did not materialize, the postulated alternative must be realistic and not merely fantastical.30

The classic expression of the significance of the negative is found in a short story by Arthur Conan Doyle. Silver Blaze, a much fancied racehorse, vanishes from its stables on Dartmoor shortly before a big race.
Its trainer is found dead on the moor, his skull smashed in a violent attack. Assuming that the trainer has been murdered while attempting to foil a robbery, the Devon constabulary search for a gang of horse thieves. Sherlock Holmes is not so sure. When the local police inspector asks the great detective if there is any point in the case that deserves special attention, Holmes invites him to consider ‘the curious incident of the dog in the nighttime.’ The inspector points out that the dog did nothing in the nighttime. ‘That,’ replied Holmes, ‘was the curious incident.’ Holmes deduced from the animal’s silence that it had
recognized the person who had taken Silver Blaze out on to the moor. From this, Holmes concluded that, for some malign purpose of his own, the trainer had decided to maim the horse, but the sensitive thoroughbred had panicked and kicked him to death.

No doubt to the mystification of the uninitiated, historians have a habit of referring to the dog in the night, the dog that did not bark. Thus Oliver MacDonagh's verdict that the year 1783 'is significant in Irish history in much the same fashion as the dog's not barking was significant for Holmes.'31 MacDonagh sought to draw attention to the curious fact that between 1775 and 1782 Ireland's Protestant political elite had successfully conducted a campaign against British legislative interference similar to that of the American colonies. However, they had stopped short of emulating the United States in insisting upon recognition of complete independence. Thus, the strength of MacDonagh's analysis is that it is not so much counter-factual as implicitly comparative. As applied by Sherlock Holmes, the counter-factual approach assumes that in any other racing stables, an argumentative dog would have engaged in noisy confrontation with intruders. In MacDonagh's implied comparison, the hypothetical racing stables are replaced by the actual American colonies, to underline the point that the logical culmination of Irish colonial-style assertiveness did not in fact come about. It is a technique that is especially, although not exclusively, illuminating when applied to the history of Ireland. 'It is a significant fact,' remarks A.T.Q.
Stewart in his analysis of the problems of Northern Ireland, ‘that even at the worst stages of the troubles there was never a major evacuation of population.’32 The implied alternative is historically plausible: in August 1969 the Irish government made preparations along the Border to receive the influx of frightened and homeless Catholics who were expected to flee from Unionist bigotry. In other turbulent, terrorist-ridden trouble spots, violence invariably uproots thousands of frightened people from their homes. While there have been local movements by threatened minorities that have consolidated Northern Ireland’s urban ghettos, it is salutary to place the ‘Troubles’ (the term itself is significant in its indeterminacy) in a faintly reassuring context by noting that the anticipated past future of a major refugee crisis has not occurred. Indeed, local denominational clusters within Ulster have barely shifted in three centuries.

Applying the comparative dimension to the Ulster Troubles throws into sharp relief a little-noticed but highly positive aspect of Canada's internal divisions. It is likely that large-scale pogroms were averted in 1969 by the decision to send the British army on to the streets of Belfast and Derry to protect the nationalist minority. The British army was greeted in Catholic areas by the ritual common to all communities in Europe's offshore archipelago, the gift of cups of tea. Unhappily, within months, the protectors and the ostensibly protected were locked in confrontation, and in January 1971 terrorists from the nationalist ghettos began killing soldiers.

Meanwhile, Canada had experienced an episode so dramatic that it surely ought to be just as significant. In October 1970, Canadian forces were sent into Quebec to challenge an apprehended insurrection, which they did by seizing several hundred suspected separatists, few of whom were charged with any criminal offence. If an intelligent Martian had been asked in 1970 to guess which situation would provoke three decades of bombing and shooting, he would surely have picked the one driven by the War Measures Act over the one lubricated by the cups of tea. In the event, while Quebec separatism proved to be no poodle, the decades which followed resounded with nocturnal canine silence: indeed, the low-level terrorist campaign of the FLQ came to an end at just the moment when its counterpart burst forth in the north of Ireland. Happy is the divided country where the most violent act is an occasional flag-burning. Comparison with an ostensibly similar situation brings home the huge significance in the Quebec of the Quiet Revolution of the silence, and indeed probably the virtual absence, of the mad dogs. The nocturnal landscape of Irish history swarms with spectral hounds of negative significance.
Indeed, between the middle of the eighteenth century and the end of the nineteenth, it was not merely a single dog but an entire language that fell silent. In 1750, the majority of the people of Ireland spoke the Irish version of Gaelic, and most of them were probably unilingual. By 1900, the Irish language had retreated to a few enclaves of poverty and isolation. The decline of Irish was part of a larger process of anglicization throughout the island group: in 1750, it is likely that as many as one-quarter of the population of Britain and Ireland spoke Celtic languages – thus, at the historical moment when English was poised for world domination, it did not
monopolize even its own heartland. By 1900, Cornish and Manx were extinct, Scottish and Irish Gaelic in sharp retreat, and Welsh was fighting a rearguard action. One by-product of this process was the stark simplification of language confrontation in Canada. Scottish Gaelic took root in a few places: in the mid-twentieth century it looked more likely to hang on in Cape Breton than in the Scottish Highlands themselves. Pockets of Gaelic lingered on, largely unnoticed, in eastern Quebec until the nineteen-nineties. A Welsh-speaking settlement in New Brunswick quickly scattered, while just a few whispers of Irish speech were preserved in the richness of Newfoundland English. The most delightful example is the Irish word 'ainniseoir,' meaning a weakling, which was anglicized to 'hangashore,' since in Newfoundland the ultimate sign of wimpishness was reluctance to go fishing.33

Overall, however, a Canadian of British descent was a Canadian who spoke just one language, English, a point so taken for granted that its significance is overlooked. The politics of modern Canada would have been more diverse and perhaps less confrontational if the country had inherited the language pattern of its European homelands as they had existed in 1750 rather than in 1850. The driving force behind the Ontario schools dispute of the early twentieth century, the first such confrontation to concentrate on language without the complication of religion, was Michael Fallon, a Catholic bishop of Irish parentage. A people who had made large sacrifices to switch into English saw no reason to make concessions to French.34

The abandonment of the Irish language was an underlying process so fundamental that, paradoxically, it rarely merits more than a page or two in historical textbooks. Hence, its huge significance for the history of Ireland is hardly ever addressed, and its knock-on effects overseas missed altogether.
The Irish changed their language, but they did not redefine their nationality: if anything, nostalgia for the sacrifice of so much of their culture sharpened the sense of identity expressed in the Gaelic revival. Equally, the adoption of England’s language was accompanied by emphatic rejection of England’s Protestant religion. There are plenty of examples of people who grappled with a second language in order to deal with an alien government or an external market. Finns learned Russian, Slavs spoke German, while English was
adopted as lingua franca throughout India and Africa. Although these subject peoples became bilingual under economic and political constraint, they did not abandon Finnish or Czech, nor stop speaking Hindi and Hausa. The virtual parricide of the native tongue is surely one of the most significant aspects of modern Irish history. To change the language of an individual, let alone of a whole community, is a huge task, almost as large as the challenge to the historian who seeks to understand how it could have happened.

The rapid decline of Irish is an example of the way in which standard forms of historical explanation cannot cope with individual decision-making on a large scale. The old language died because millions of Irish people not only resolved to speak English at home as well as in the marketplace but further determined that they would not pass on their mother tongue to their own children. The political effects of this societal change were unpredictable: Michael Collins and Tomás MacCurtain, two of the leaders in Ireland's war of independence, were anglophone children of Irish-speaking parents; their nationalism was driven by loss of their heritage. By their nature, the motives behind millions of individual decisions are beyond explanation. The most we can attempt is the location of the process in time. The large-scale abandonment of the language coincided with the era of mass emigration: parents knew that most of their children would end up in English-speaking countries where they would be mocked for their ethnicity and accent. It was best to prepare them by dumping the linguistic heritage that impeded fluency in the dominant language. By appreciating the vast significance of the popular language shift in Ireland, it is possible to underline anew the centrality of Irish Catholicism and Irish national identity throughout the nineteenth century. Language changed; religion and nationality did not.
The paradox is that the same modernizing thrust that brought Ireland into the English-speaking world also helped to take it out of the United Kingdom. To the idealists who sought to revive the Irish language, it still seemed worth fighting from 1916 to 1922 to recapture the Gaelic world of their grandparents. Yet this is not the only way in which Ireland's war of independence can be located in time. The Treaty of December 1921 is too often portrayed as the culmination, and therefore the morally inevitable outcome, of eight centuries of uneasy relations between the
two islands. A more helpful location in time would view that earlier round of Irish 'Troubles' in the context of the enormous dislocation of the First World War – compounded in the Irish case by the fact that the outbreak of war froze the Home Rule crisis of 1912–14 at its most agonizing point.

Yet here too a notable dog failed to bark at a dark hour. It was not necessary to invoke the Book of Revelation to appreciate the way in which the First World War had demonstrated the potential of air power. Britain created an independent air force in 1918. Two years later, with just eight aircraft, the Royal Air Force had reasserted British authority in Somaliland by bombing the strongholds of the Mad Mullah (whose perceived insanity lay in his reluctance to accept imperial rule). Similar measures were contemplated but never fully implemented in Ireland.35 As a result, the 'flying columns' that dominated the low-level campaign of insurgency were composed of Republican irregulars travelling by bicycle unobserved and unscathed across often bleak and treeless landscapes. There may be little point in getting into counter-factual speculation about the effectiveness of British air power in Ireland: bombing does not always break people's resistance and it certainly does not win their hearts and minds. What is significant is that the Irish war of independence flared up at almost the last moment in time at which a major power would fight a military campaign without using its air force.

In 1957, a Catholic bishop commented that the Irish notion of history 'has tended to make us think of freedom as an end in itself and of independent government – like marriage in a fairy story – as the solution of all ills.'36 Yet there was no such fairy-tale ending for twentieth-century Scotland.
By implication, if the Irish national experience represents normality, then Scotland would seem to constitute 'an awkward, ill-fitting case,' in the words of a sociologist with a sense of history. However, a challenging young scholar, Graeme Morton, has recently suggested that we have all been reading the story the wrong way round. He argues that 'the significance of Scotland's missing nationalism in the nineteenth century' should be interpreted in a different sense, one that offers consolation to those modern patriots who are embarrassed by the spineless failure of Victorian Scots to produce their own Garibaldi. In terminology familiar to the Canadian debate on Quebec, Morton suggests that the search for an ethnic nationalism has missed a much more crucial element of civic nationalism, an ideology that converts Scottish membership in the United Kingdom state into a positive and willing participation: '[W]e can't wish that it was something else, namely the actions of a Scottish state acting on behalf of a Scottish nation within the British empire.'37 Like the watchdog in the stables of Silver Blaze, the Scottish terrier preferred to wag its unionist tail rather than bark in nationalist defiance.

The implied dog-in-the-nighttime comparison with Ireland's national revolution can also underline just how remarkable is the fact that, alone of the major countries of Europe, the rest of the United Kingdom came through the experience of modernization without violent upheaval. Until well into the twentieth century, the absence of revolution was widely interpreted as evidence that British history was still travelling, that it would eventually reach a denouement that would dwarf even the French and Russian experience of transcendent violence. As G.K. Chesterton warned, '[W]e are the people of England, that never have spoken yet.' With hindsight, however, the very notion of a British revolution on the scale of France in 1789 or Russia in 1917 seems like a St Bernard that has taken a Trappist vow of silence. Various explanatory strategies seek to account for this negative aspect of British history. One would argue that violence was not necessary to secure sweeping political change in Britain, that revolution occurred without anybody noticing, although this sits uneasily with the overall stability of institutions and continuity in the distribution of power that has characterized modern British history. Perhaps a country gets the revolutionaries it deserves: Britain's first socialist leader, H.M.
Hyndman, confessed that the greatest disappointment in his life was his failure to be selected to play cricket for Cambridge University.38 Another is to assert that Britain (or at least 'England') is fundamentally different from the rest of Europe, although this is as much a patriotic assertion as an analytical statement. Even the argument that the 'Troubles' of 1919–21 demonstrate that Ireland is historically closer to continental Europe than to Britain falters when it is remembered that the Troubles were predominantly confined to the countryside. Most European revolutions have been powered by dominant cities: Paris in 1789 and 1848, Brussels in 1830, Frankfurt in 1848, St Petersburg in 1905 and 1917. The attempt to follow this model in Dublin in 1916 was a spectacular failure.

Much more surprising and even less noticed is the fact that no serious bid for revolution was ever launched in London, even though by 1800 London was probably the largest city in the world. Certainly it dominated England with a demographic ascendancy unrivalled throughout the rest of Europe. Although it was not until 1793 that an over-ambitious governor conceived the idea of naming a second London in the depths of Upper Canada, from early times the English metropolis was seen as the quintessential city: as early as the fourteenth century, tiny but ambitious Oswestry (its population was about 3000) was hailed as the 'London of Wales.'39

By 1800, with a population around a million, London had overtaken every other city in Europe and was probably the largest urban community on the planet. The remarkable aspect of the size of London was its dominance over the rest of the country. Paris claimed only one Frenchman in forty, but in England and Wales, London was the home of one person in every nine. Remarkably, London's population never seemed to stop growing: over three million by 1861, past seven million by 1911, by which time it had swallowed one-fifth of the population of England and Wales, and onward beyond eight million by 1951. By that time, Cobbett's cancerous Wen had become an 'imperial city' which was beginning to dissolve into more distant dormitory towns.40 By any standards, the growth of London was spectacular, although many textbooks manage to give the impression that urbanization in Britain was something to do with Birmingham and Manchester. Indeed, perhaps even more remarkable was the fact that, unlike the smaller provincial cities, London's growth was not primarily driven by the factory system and not at all by coal-mining. Overall, the capital grew because it was a great trading port and centre of finance and government. This had implications for the development of class-based politics in twentieth-century Britain.
Before the First World War, the emerging Labour party had established only precarious footholds in the capital, in the Thames-side manufacturing districts of Woolwich and West Ham whose economic profiles resembled those of northern industrial cities. It was less a case of Labour capturing London in the inter-war period than of an older-style London radicalism infiltrating the party. The British version of socialism expressed itself, not through revolutionary activity, but rather through municipal ownership of the gas supply.

London's growth extended its cultural and economic influence over the rest of the country. By the sixteenth century, the invention of printing had set London's dialect on the road to becoming standard English. In a landmark article, E.A. Wrigley argued that as early as 1650, London was driving change throughout the rest of the country. The theme of Daniel Defoe's Tour Through the Whole Island of Great Britain in 1722 had been 'how this whole kingdom, as well as the people, as the land, and even the sea, in every part of it, are employed to furnish something, and I may add, the best part of every thing, to supply the city of London with provisions.' The unhealthy metropolis devoured people as well as produce: to maintain London's population growth, it was necessary each year for at least eight thousand country folk to make the individual decision to migrate to London, valuing the past future of short-range opportunity ahead of long-term life expectancy. The drain of labour from rural areas forced farmers to become more efficient, paving the way for agrarian improvement. The growth of the insatiable London market may also have stimulated the early stages of industrial growth elsewhere in Britain. Two of the hallmarks of industrialization, railways and the steam engine, first took hold in the northeast of England where the coal miners relied heavily on coast-wise trade to an increasingly smoky metropolis.41

Indeed, the size and glamour of London have exercised an influence well beyond Britain. Until 1914, there were more people in Greater London than in the whole of Canada, while Australia's population did not catch up until the nineteen-fifties. The Australian expression for an overwhelming certainty, 'London to a brick on,' has not been traced before 1965.
At that time, disrespectful Australian nationalists vented their resentment at the political dominance of Robert Menzies by singing, to the tune of The Bells of St Mary's, an ironic rhyming tribute to the size of his testicles, 'curvaceous, capacious as the dome of St Paul's.'42 It would be hard to find a more grudging compliment to the worldwide cultural ascendancy of London.

The methodology of the Baker Street detective can be used to juxtapose the sheer size of London alongside the silent hound of the British revolution. Eighteenth-century Londoners showed their talent for violence, most notably in the Gordon Riots of 1780, in which several hundred people were killed amidst widespread destruction of property – in the same decade as Parisians stormed the Bastille. Anarchic and (until 1829) under-policed, a giant city at the heart of a nation that maintained a very small army, London none the less never spearheaded a national revolution. It has been argued that Londoners showed at least a potential for revolution in their disciplined confrontation of the opponents of parliamentary reform in 1831–2.43 London certainly had good reason to demand Reform. At a time when 8 per cent of the whole population of Britain and Ireland lived in the metropolis, Londoners elected just 10 of the 658 members of the House of Commons. The only recognition of special status was the concession of four seats, instead of the usual two, to the City itself (a privilege bizarrely shared with Weymouth in Dorset), while metropolitan growth had absorbed the nearby constituencies of Middlesex, Southwark, and Westminster. Of course this did not stop wealthy Londoners from buying their way into parliament elsewhere. Indeed, a central if not always emphasized aim of the unreformed constitution was to ensure that the government was supported by London money but protected from the London mob: the slide towards social chaos after 1646 was a warning of what could happen if the capital city became the master rather than the ally of the system. The first two instalments of parliamentary reform sought to continue this balancing act. The metropolitan area was allocated a mere ten additional seats in 1832, with property qualifications for the vote designed to curtail the electorate, a provision quickly undercut by London's perennial inflation in house prices.
The 1867 act added a miserly four additional seats: on a population basis, the under-representation of London was equal to sixty-three missing MPs.44 London, it might seem, had both the motive and the potential to lead the British revolution that never happened. Indeed, the reverse seems to have happened: the capital city was co-opted to become a conservative force in the national equation. This crucial change had occurred by 1885, when the Conservative leader, Lord Salisbury, insisted on cutting up the metropolitan area into mainly single-member constituencies that for the first time roughly reflected its population base. When his party won fifty out of London's sixty-one seats in 1886, Salisbury himself looked back over thirty years in politics and marvelled that the capital city that had once seemed so threatening should have become a Tory bastion.45 Two years later, the party that had blocked Home Rule for Ireland created the first London-wide elected local authority. There was never the slightest risk that the new London County Council would become a revolutionary directorate, as it showed in electing the dilettante young aristocrat Lord Rosebery as its first chairman. By now, it could be confidently assumed that revolutionary London would never bark at all. This was a surprising, and surely a significant, shift in attitudes at a time when the metropolitan area was growing by almost a million people a decade – one of those historical paradoxes that defy explanation, and perhaps remain unnoticed for that very reason.

London continued to contain turbulent elements: there were riots in Trafalgar Square in 1886 and 1887, and a long and determined dock strike in 1889. The most that can be suggested is that London had simply become too big to sustain any organized threat to the system. Even forty years earlier, the Anti-Corn Law League had based its activities in Manchester partly because it found London too diverse to mobilize as a single political force.46 It is surely revealing too that as professional football captured working-class enthusiasm in late Victorian times, no London United emerged to challenge Birmingham City or Liverpool or Newcastle United. Fans cheered instead for the urban villages, the Chelseas and Charltons that were being buried under metropolitan growth. Rugby Union made the point in a different way, with three major teams demonstrating the outreach of the capital throughout the island group: London Irish, London Scottish, and London Welsh.
As with the accession of a Scottish king to the English throne, the size of London is so basic an underlying feature in British history that its importance is easily overlooked. Like the silent dog in the nighttime, the absence of a violent British revolution spearheaded by the largest city in the world is a fact of considerable historical significance, and one which is fundamental to the location of modern Britain in time. Through the curious incident of the dog in the nighttime, Conan Doyle’s phrase-mongering artistry provides a valuable tool for the exploration of negative significance. Yet J.J. Lee’s warning against the

Significance / 223

temptation to select the most convenient alternative scenario should draw attention to a weakness in the tale of Silver Blaze. Sherlock Holmes was correct to identify the animal’s silence as a significant fact, but the great detective rushed to embrace what was merely one of several possible explanations for its non-performance. We are told that the stable-boy charged with guarding the racehorse had been drugged, so why not the dog too? Perhaps it was one of those loveable but stupid dogs that wag their tails at complete strangers.

Sometimes the most obvious comparative approach to significance may not be the most illuminating. For instance, it is undeniable that the existence of a French-speaking population in Quebec and of a Protestant community in Ulster are significant facts in Canadian and Irish history. The most obvious context in which to identify that significance is by implicit comparison with the alternative possibilities, entirely feasible in the history of both countries, that Quebec would have ceased to be French and that Ulster might never have become Protestant. Yet it may be that we should invert the approach and suggest that their real significance lies in the fact that Quebec is the only large French-majority community in Canada, that Ireland contained just one sizeable regional enclave of Protestants. It is surely not necessarily a wild venture into the counter-factual to suggest that one of the most significant elements in Canadian history was the failure of Manitoba to play the same role in French–English relations that the Transvaal played in the Dutch–English balance among white South Africans. In 1870 many expected that Manitoba would become Canada’s second French-majority province, and there was no gigantic difference in population at that time between the 6,000 francophones of the Red River and the 30,000 Boers living north of the Vaal.
That crucial identity shift, from French-Canadian to Québécois, that threatened the foundation of Canada throughout the last quarter of the twentieth century would at least have been muted and perhaps would not have occurred at all, had Winnipeg become as French as Montreal. In Ireland, partition was a sufficiently tortuous process even with a relatively easily identifiable Protestant bloc confined to the northeast of the country. However, there was a potential candidate for a second Ulster to the south of Dublin. In the early nineteenth century an area of about two hundred square miles in north Wexford was about 20 per

cent Protestant. ‘This fact and its importance has generally been overlooked.’47 In some adjoining pockets of Wicklow, Protestants constituted up to 40 per cent of the population. It is a matter of historical record that Wexford and Wicklow did not evolve into another majority-Protestant region, but equally it is a matter of historical significance that they might have done so – especially if Irish Catholics had been as willing to change their religion as they were enthusiastic in the abandonment of their language. Of course, the example of Pakistan, which was originally created as two blocks of territory, two thousand kilometres apart, makes it impossible to claim that the existence of a second Ulster would have prevented the partition of Ireland. None the less, the significance of partition in Ireland may lie not so much in the intractable challenge that Ireland contains one Ulster but rather in the haunting possibility that there might have been two.

The American Civil War so dominates the landscape of that nation’s history that it seems simple enough to identify the implied comparative dimension defining its significance. The most obvious significance of the Civil War lies in the fact that it did indubitably happen, as against the possibility that it might have been avoided. Yet the towering dominance of the clash between Yankees and Confederates may obscure the possibility that its real significance lies in the fact that a country with such an explosive cultural mix of democracy and violence should have experienced only one civil war, and that on a north-south basis. In 120 years between Bacon’s uprising of 1675 and the Whiskey rebellion of 1794, there were five actual or potential clashes between east and west. (Pennsylvania’s Paxton Boys were talked into going home by Benjamin Franklin without the threatened sack of Philadelphia in 1764.)
Three of those five armed protests were motivated by the complaints of frontier settlers that effete tidewater authorities were failing to protect them against aboriginal people, the other two by resentment against the financial policy of remote governments. As settlement crossed the Appalachians, it might have been expected that these feelings of vulnerability and alienation would have multiplied, especially during the first half of the nineteenth century, when westward expansion ran far ahead of the railroad network. It is obviously a significant development in American history that in 1787–8 the Thirteen States established a strong central government, thereby overcoming the strong rival tendency to quarrelsome fragmentation that would have reduced America to balkanized impotence. It may be equally significant that those original founding states tacitly agreed that they would not monopolize power among themselves, but would accept the admission of new states on the same basis as the old. Less farsighted, Canada withheld control over public lands when it created new provinces on the prairies, a decision that played a part in stoking the west-east distrust which by the late twentieth century was threatening to paralyse its politics.

How might these approaches to the notion of significance be applied towards an understanding of a specific experience in the past? The Second World War provides a tempting subject for the experiment. For the middle-aged, at least, the war remains sufficiently recent and arouses so much general interest that it should be possible to analyse both its overall significance and the importance of one episode – the D-Day invasion of Normandy in 1944 – even though we may not be experts in international relations or military history ourselves. Indeed, if ‘expert’ status implies omniscience, it would be humanly impossible for anybody to hold forth on the Second World War, so massive is the published research. One 1200-page general textbook on the war adds a thirty-eight-page bibliography in small type merely to identify the ‘headline’ studies.48 To argue that it is necessary to digest most, if not all, the published work in order to have an opinion on one of the greatest upheavals in human history would be carrying scholarly integrity too far. The austere corollary of such a position would be to debar us from attempting to locate our transient present in the longer sweep of time by tracing any connection at all back to the time of Hitler and Stalin. In short, academic history as it is practised may actually be cutting us off from a past that we vitally need to comprehend.
At the very least, an analysis aimed at establishing historical significance may serve as the intellectually democratic counterpart required for the wider validation of precise archival research. In research terms, we historians have planted a vast forest of trees. What we need now is a device that will help everybody to identify the wood. As with Canada’s 1837 rebellions, the term ‘Second World War’ requires considerable reflection and some definition. The war has to be

located in its own time. Some historians deny that it possessed an independent historical existence at all, seeing it rather as the second round of a Thirty Years War to settle the question – or, in the light of the country’s post-1945 economic recovery, the nature – of German hegemony in Europe.49 Naturally enough, this issue is interwoven with the debate on the origins of the war: was Hitler the cause of the second instalment, or merely the by-product of the first? For Hitler himself, the war was a continuation of the struggle in which he had served as a corporal in 1914–18. Similarly, the Pacific war grew (at least in the minds of the Japanese high command) out of a longer-running struggle for supremacy in east Asia dating back at least to the Japanese invasion of Manchuria in 1931, if not to their brief clash with China in 1895.50

Even when distinguished from the conflagration of 1914–18, the Second World War encompassed at least three major conflicts. The first was a war centred in western Europe beginning in September 1939 between Britain and Germany, in which France participated until May 1940 and the United States became the dominant combatant after December 1941. The second was a war in eastern Europe between Germany and Russia that erupted in June 1941 and concluded in May 1945. The third was a Pacific conflict between Japan and an American-led coalition between December 1941 and August 1945. Subsumed in the two European conflicts were various internal civil wars, sometimes ideological, as foreshadowed by the conflict in Spain from 1936, but more often driven by nationalisms and ethnic hatreds. These overlaps complicate the ethical dimension, the values which we should wish to read into the conflict. Seen as a war in western Europe and the North Atlantic, the conflict can be viewed in straightforward terms of good and bad – democracy and freedom on the one side, fascism and tyranny on the other.
But the stark simplicity of this heroic interpretation only partly transfers to the war in Asia, which the Japanese sought to turn into a struggle against European colonial rule. Yet, as was shown by the leftward shift in both British and Canadian public opinion, the idea of fighting for a better world constituted a powerful element in the psychological armoury of the Allied powers. It was only possible for such sentiments to coexist with the Soviet alliance by fostering the myth of Uncle Joe and his gallant Russian people. The second part of

the image was indeed true, but in terms of mass murder, Stalin was as great a monster as his Nazi counterpart. As the war receded into yesterday, this ethical inconsistency became all the sharper: the countries of eastern Europe which Britain and France had gone to war in 1939 ostensibly to protect from the domination of Berlin were condemned by the terms of the eventual victory to seemingly endless subordination to Moscow.

The three general areas of conflict only marginally intersected. In practical terms, beyond supplying equipment, the Western Allies could do little to ease the pressure on the Russian front until the invasion of France in the summer of 1944. Not until the final weeks of the war did tension between Russia and Japan move beyond armed neutrality. The triple conflict created some inconsistencies, most notably in the picture of a struggle between freedom and tyranny. The Royal Navy sank the French fleet in July 1940; democratic Finland fought alongside Nazi Germany; Canada maintained diplomatic relations with the puppet Vichy regime in France. After a delayed start, Italy fought successively on both sides. Brazilian soldiers fought Germans in the Mediterranean; Mexican airmen flew against Japan. Despite their post-war orientation to the Pacific, both Canada and New Zealand directed their main effort away from the Japanese war. Even neutrality could prove to be a heavily qualified status. Although Ireland remained punctiliously out of the fray, its leader astonishingly even offering formal condolences on the death of Hitler, the Dublin government gave covert support (but not vitally needed anti-submarine bases) to the Allies.
According to Dublin legend, a protest by the German ambassador that the repair of a damaged RAF plane was a violation of international law provoked the response, ‘Ah, but who are we neutral against?’51 Despite these anomalies, the Second World War can be located in time as the last major conflict in which periods of official ‘peace’ alternated with declared phases of ‘war.’ Formally communicated declarations of war, such as that transmitted by Churchill to the Japanese ambassador on 8 December 1941, over the signature ‘Your obedient servant,’ would virtually vanish after 1945.52 None the less, even acknowledging the overlapping conflicts of the Second World War, a number of aspects and landmarks can be labelled ‘significant.’ The first is that the war broke out in 1939 rather than at

some later date and that, in effect, it erupted first on Germany’s western flank, since Poland quickly succumbed to aggression. This is an argument for significance at variance with the verdict produced by the generation whose lives were disrupted by the war. In Britain, at least, there was a tendency to argue that firmer resistance to Hitler would have avoided war altogether, and much blame was placed upon the allegedly pusillanimous Appeasers of the nineteen-thirties. As argued in chapter 5, condemnation of Chamberlain’s tactics endorses one highly doubtful past future, that Hitler could not only have been outfaced but stopped for good. This view ignores the far more likely alternative, that the German-speaking Sudetenland would have remained a flashpoint within an artificial state. Even in 1941, the Czech government in exile, usually grouped among the good guys, protested against the inclusion in the Atlantic Charter of a right of self-determination for minorities. Overall, it seems incontestable that Hitler was engaged in a headlong rush that would have led sooner or later to war. When a German diplomat of the old school remarked after the absorption of Austria in 1938 that Bismarck would have opted for a period of consolidation after such a move, Ribbentrop told him that he had ‘no conception of the dynamics of National Socialism.’53

Hellish though the war of 1939 proved to be, it is no mere counter-factual horror story to suggest that it might have been far more devastating if its outbreak had been staved off until 1942 or 1945.54 The German economy was not fully harnessed for war in 1939. Of course, military technology advances more rapidly in wartime. Still, although a massively expensive project managed to detonate three atomic bombs in 1945, it is not safe to assume that a delayed European war would necessarily have been a nuclear conflict.
Similarly, we cannot make too much of the fact that the Luftwaffe had no heavy bombers when it attacked London in the Blitz, even though the tonnage of German bombs that caused Britain’s ordeal in 1940 was only 3 per cent of the load dropped by the Allies on Europe in 1944. Even if war had been postponed, Hitler’s preference for blitzkrieg might still have diverted resources away from the development of a heavy bomber, a project for which he never showed any enthusiasm. However, it is far more likely that the Nazis might have commanded a functioning terror-weapon against which Dowding’s pencil would have proved no

defence: a workable prototype of the long-range rocket flew 120 miles in October 1942. Perhaps most crucial of all, Hitler would almost certainly have wielded a submarine fleet far larger than the U-boat force that almost brought Britain to its knees in 1941. Thus, to adapt and paraphrase Jenny Wormald, simply to say that war broke out in September 1939 is to fail to appreciate the enormous significance that it came at that moment and not a few years later.

Britain’s survival in 1940 was also enormously significant. We can dismiss the counter-factual speculation that the real significance of 1940 lay in the fact that Churchill was not forced to negotiate an armistice. No belligerent dropped out of the Second World War except under pressure of imminent catastrophe. Democracies, which required the mobilization of public opinion to make war at all, were much less likely to make a compromise peace than dictatorships. In any case, by 1940 only the very naive would have trusted Hitler to abide by the terms of any treaty. An approach to historical analysis that focuses on decisions will note that the formal yes-no question of negotiating an armistice never arose in the British cabinet. Despite the inspiring rhetoric of ‘standing alone’ in 1940, Britain was backed by its Commonwealth allies. However, the Dominions could not have successfully fought Hitler in Europe without the offshore base that was Churchill’s island – any more than the far more powerful United States could subsequently have grappled with Hitler at long-distance, transatlantic range. Arguably, the most significant moment in Canada’s participation in the Second World War may have been those months in 1940 before the United States had become the arsenal of democracy, when Canadian troops stiffened British defences ravaged by Dunkirk and the vital Halifax base kept open a thin supply line across the ocean.
A war between Hitler’s Nazis and Communist Russia was always likely, but the precise date of its outbreak, on 22 June 1941, is significant in several respects. It is easy to assume that Britain’s moment of crisis was confined to 1940. This is to overlook what John Keegan has called ‘perhaps the boldest strategic stroke of the war,’ Churchill’s decision ‘to transfer what remained of British striking power to the Middle East, even while the home islands lay under threat of invasion.’55 In fact, London suffered its worst air raids in April and May 1941, the period in which the Luftwaffe roamed at will by night to

devastate industrial centres as far away as Belfast and Clydebank. It was the diversion of German air power to the east that brought respite to the British. The other side of the coin of significance was that the British intervention in Greece in April 1941, although in itself a disaster, was enough to delay the Nazi attack on Russia until June. The location in time of the German invasion as late as mid-summer proved to be hugely significant, since it limited the effects of the onslaught before winter began to give the Russians their traditional defensive advantage.

The outbreak of war in the Far East in December 1941 was obviously a landmark event, but arguably its significance lies in four very specific aspects. The first, certainly from a British point of view, is that the Japanese chose to launch a double attack, moving eastwards against the Americans at Pearl Harbor as well as southward to attack the British in Malaya. Churchill was acutely aware of the ‘awful danger’ that the Japanese would ‘carefully avoid the United States’ and single out British territories for attack. Second, just as the significance of the outbreak of the European war was that it came as early as September 1939, so the significance of the timing of the Pacific war lay partly in the fact that it was delayed until the end of 1941. Ten months earlier, Churchill had feared the ‘enfeeblement’ of trying to fight both Germany and Japan.56 Had the Japanese attacked before Hitler tied his own hands in Russia, it is hard to see how a total British collapse in both hemispheres could have been avoided. Third, the Japanese attack upon Pearl Harbor was sufficiently devastating to unify a shocked American public, but failed in its aim of destroying the U.S. Pacific fleet, especially the aircraft carriers which soon afterwards were to turn the tide at the battle of Midway.
The fourth strand of significance is perhaps the most important of all, and can be identified in a single human decision. Four days after Pearl Harbor, Hitler declared war on the United States. The American constitution vests the power to declare war in Congress, and in the aftermath of Japanese aggression, American politicians were highly unlikely to initiate formal war with Germany as well. The low-level naval conflict between the United States and Germany on the North Atlantic gave the Nazis a formal casus belli, but the savaging of the Pacific fleet made it likely that the Americans would have to move

warships to protect their west coast. Hitler had ignored his Japanese ally in concluding his pact with Stalin in August 1939, and there was little realistic prospect of large-scale military cooperation between the two Axis powers. It has been argued that since the Americans were ‘virtually making war in defiance of the rules,’ there was little for Hitler to gain from maintaining formal neutrality.57 None the less, even in the perspective of December 1941, there was a world of difference between putting up with an undeclared naval war in the North Atlantic and inviting the full wrath of American outrage to descend upon the Third Reich. As with the accession of a Scottish king to the throne of England in 1603, it is easy to overlook just how remarkable was Hitler’s blunder in so unnecessarily drawing American vengeance upon himself.

To paraphrase K.O. Morgan’s convolution over the founding of Plaid Cymru, the immediate significance of Pearl Harbor perhaps lay a little in the future. Certainly, in the short term, realization of its implications was delayed by the high tide of German and Japanese aggression during 1942. Yet here, too, we may also discern at least two aspects of significance. Their initial successes obscured the fact that neither the German nor the Japanese war machines were fully equipped to achieve the conquests they attempted. The Japanese in particular were technologically under-prepared for modern warfare, a fact disguised by the suicidal bravery of their troops. Both aggressors were seriously hampered by limited access to oil supplies. Also burdensome were the constraints inherent in their ideologies of racial superiority. The Nazis failed to capitalize on their potential role as liberators of Russia’s subject peoples.
Many Ukrainians were willing to collaborate, but Nazi dreams of lebensraum were inconsistent even with the creation of the sort of buffer state that the Kaiser’s regime had attempted to impose at Brest-Litovsk in 1918. In the Far East, the contrast between the rhetoric of ‘co-prosperity’ and the reality of Japanese cruelty was equally destructive. It was probably suspicion of Japanese intentions that helps to account for one of the most notable, and least observed, of all the silent dogs in the darkest hours of the Second World War – the determination of India’s Congress leaders to maintain a predominantly constitutional agitation against the British Raj, even if they sometimes teetered on the verge of a national uprising. The fact that it was a secondary figure, S.C. Bose, who defected to the Japanese, and then not until 1943, points to the significance of Gandhi as an asset to the Allied cause, however little he may have appeared so at the time.

To claim that the Allied invasion of France in June 1944 was one of the pivotal episodes of the twentieth century, it is necessary to confront at least two objections.58 The first stems from the revelation of the secret of ‘Ultra,’ thirty years after the end of the war: that the British had broken the German codes and were reading all their messages. Coming at the time when the war was receding into ‘yesterday,’ this new explanatory element chimed with a generational change in attitude that tended to debunk a story centring upon heroism and sacrifice. If we knew Hitler’s secrets, why did it take so long to defeat him? Pointing out that Ultra was an operational and not a strategic resource, Ralph Bennett notes that its decrypts played ‘no part in the greatest command decision in the west – the choice of the time and place for the invasion of Europe.’ Moreover, historians need to weigh the manner in which the intelligence from Ultra was used. In the summer of 1944, the boffins at Bletchley Park warned the generals that the Germans planned to destroy the port facilities at Antwerp, and had tanks in place to defend the Rhine crossing at Arnhem. This intelligence ought to have cast doubt upon the strategy of a rapid advance into the heart of Germany, but the generals chose to take the gamble, perhaps for the political motive of supporting the Russian advance while limiting the territory Stalin might conquer.
The outcome was the disastrous airborne landing at Arnhem in September 1944.59 The second objection to the centrality of D-Day is that the campaign in the West was secondary to the struggle between the Nazis and the Soviets, and that Hitler had already lost that war in the devastating defeat at Stalingrad during the winter of 1942–3. As Geoffrey Roberts has recently remarked, ‘there was no mistaking the significance of the three days of national mourning that followed the German surrender at Stalingrad in February 1943.’ In his opinion, ‘after Stalingrad the question was when and how the war would be won, not whether it would be won.’ But even if Hitler could not win his war, it was still possible for the Allies to lose it. One of the British commanders involved in planning the Normandy landings, General Ismay, privately warned in March 1944 against those who assumed that the war

was ‘all over bar the shouting.’ At the very least, D-Day is part of the location of the ‘when and how’ of victory.60 ‘We know what happened eventually,’ Roberts writes of Stalingrad, ‘but we also know that things could have turned out very differently.’ Perversely, celebrations of successive D-Day anniversaries, and of the bravery of those who fought on the Normandy beaches, have conveyed the notion that the invasion was the predestined prelude to the triumphal crushing of Hitler. ‘Because the D-Day landings succeeded, there seems a certain inevitability about their success,’ writes A.W. Purdue, who adds the bleak reminder that ‘disaster was always a possibility.’ In fact, the assault upon Normandy came just twenty-two months after the catastrophic raid on German coastal defences at Dieppe – hardly ‘a long time’ by any standard of judgment. As Overy has remarked, the Channel ‘had defied all attempts at invasion from Europe since 1066, a fact so well known that its significance is sometimes overlooked.’61

Among the D-Day planners there was a large element (mainly British) whose anticipated past future led straight to disaster. Just hours before the invasion was due to begin, the British chief of staff, Sir Alan Brooke, confided to his diary that ‘it may well be the most ghastly disaster of the whole war.’ ‘I knew too well all the weak points of the operation,’ he later recalled. It was ‘entirely dependent’ on the weather: what if there was a sudden storm? ‘Then the complexity of an amphibious operation of this kind, when confusion may degenerate into chaos in such a short time.’ At such a distance, there was ‘a lack of elasticity in the handling of reserves,’ so that it would be difficult to control the operation once it had begun.62 Indeed, one of Eisenhower’s tasks as his army crossed the Channel was the drafting of a statement announcing failure. It was luck, as much as destiny, that ensured that the communiqué did not have to be used.
The more D-Day is examined, the more remarkable (and maybe even foolhardy) the whole enterprise appears. It was impossible to turn southern England into a massive invasion base without advertising the general intention to launch a cross-Channel attack. From February 1944 onward, when travel was banned to neutral Ireland, the invasion force was steadily sealed off from the outside world. These successive steps made it harder to draw back from a decision to go. Historians have rightly emphasized the massive campaign of deception to persuade the Germans that the main thrust of the Allied invasion would attack the Pas de Calais.63 It is much less emphasized that nobody had any right to expect that the Germans would not get wind of the planned landings in Normandy, even if they were misled into interpreting them as a secondary feint. As D-Day approached, thousands of people had to be briefed about the tactical objectives of the invasion force. The sketchiest Nazi spy network, any leakage of information through neutrals, even a slight capacity to conduct aerial reconnaissance over southern England, would have been sufficient to alert the German high command to the size, preparedness, and likely destination of the Normandy operation.64 Moreover, the Germans, like the British, could – and did – work out the most probable dates for a landing by moon and tide.

‘One day in June 1944, the Allies invaded Normandy and began to drive the Nazi armies out of France.’ The opening sentence of this chapter is an example of the way in which language can simplify and distort. It is deeply misleading to think of a superior Allied force slipping gracefully, almost instantaneously, across the Channel, taking the Nazis by surprise, and effortlessly punching its way into the heart of France. The reality of the invasion was straggling and long drawn-out, through night hours in which the Allies needed the clock to stand still, but which proved to be a very long time in history indeed. The operational commander, General Montgomery, took for granted that the Germans would realize that a major assault was planned by nightfall the previous day. Minesweepers began to arrive off the Normandy coast, inadvertently in daylight, eleven hours before the invasion ‘H-hour.’ American paratroops dropped on Ste-Mère-Eglise soon after midnight, six hours before the first troops were due to hit the beaches, and local German units quickly reported a major incursion.
Three hours later, the British 6th Airborne Division seized Pegasus Bridge, at the eastern end of the invasion zone, thereby indicating the scale of the planned operation. Some soldiers reached deep into their past to locate the moment in time. In the nerve-racking tension, lines from Shakespeare ran through the mind of the paratroops’ commander, Major-General Richard Gale: ‘And gentlemen in England now a-bed / Shall think themselves accurs’d they were not here.’ Out to sea, as the massive invasion fleet was slowly crossing the Channel at its widest point,

Major ‘Banger’ King too was reciting Shakespeare to help his Tommies cope with fear and sea-sickness.65 The invasion fleet had left from a crowded rendezvous point south of the Isle of Wight, aptly code-named ‘Piccadilly Circus.’ From this point on, it was hugely vulnerable. Any sustained attacks by aircraft and torpedo-boats could have caused chaos. As the armada of hundreds of vessels crept in towards the Normandy coast, it became an open target for shelling from onshore forts. Landing craft took three hours to approach Utah Beach, but even with most of their radar stations destroyed by bombing, the Germans detected the American invasion force when it was still four hours out to sea. In any case, Allied naval bombardment and heavy air raids effectively announced an imminent attack. The significant point is that the Allies got ashore because virtually everything went right for them. Whether they were entitled to act on such an assumption is an ethical, rather than a historical, question.

But those who object to counter-factual speculation may care to ponder the comparative example of the Arnhem operation, just three months later, in which British paratroops were sent to leap-frog the invasion across the Rhine. Like D-Day, Arnhem depended for success upon everything going right. In particular, military planning assumed that the Airborne Division would be able to seize a defensible bridgehead, that it would not encounter resistance from armoured forces, and that it could be reinforced overland within a few days. At Arnhem, each of these assumptions was to collapse. On D-Day and immediately after, each assumption came close to failing. As a matter of historical record, D-Day succeeded and Arnhem failed. The disaster of September 1944 underlines the significance of the success in June. Even so, the Normandy campaign was a close-run thing. The Americans suffered very heavy casualties on Omaha Beach.
By contrast, despite encountering the worst D-Day weather conditions, the Canadians on Juno Beach were successful in gaining a foothold, although they encountered the toughest resistance as they tried to advance. Indeed, the initial success of the Canadians on the landing grounds contributed to their problems of moving inland, as men and equipment became embroiled in traffic jams within too small a bridgehead. The predominantly narrative form of Canadian history textbooks often fails to ram home the fact that the conscription crisis of 1944, the greatest politico-military dispute in the country's history, was the product of its location in an intense moment in time, the weeks in which Canadian troops played a leading part as the Allies fought their way through and out of Normandy.66 For a mercifully short period, combat casualties rose dramatically. It is this brief period of carnage that largely accounts for the fact that the conscription crisis ebbed after November 1944, when more mobile pursuit of the retreating Germans sharply reduced Allied losses.

From the fact that the invaders were poised to cross the Seine within ten weeks of D-Day, it would seem easy to conclude that Allied forces invariably possessed overwhelming superiority. This was not always the case, especially during the crucial first hours of the attack, when the advantage of numbers usually lay with the defenders. Once again, the roll of the historical dice was determined by the vagaries of one man's decision, as Hitler refused to authorize early and massive counter-attacks by nearby Panzer divisions. A particularly terrifying problem was the German Tiger tank, which massively outgunned any armour that the British and Canadians could throw into battle – and it was the British and the Canadians who fought the bloody set-piece battle of Operation Goodwood in July. (An early British attempt at a break-out to capture the strategic road junction at Villers-Bocage was blocked by a single Tiger in a five-minute rampage.) Even more astonishing was the failure of the Allied high command to foresee how heavy armour would be hampered by the high hedgerows and narrow lanes of the bocage which lay to the south of the invasion bridgehead. It required the ingenuity of an American sergeant to convert the Sherman tank into a bulldozer by welding on steel teeth – but a major military campaign really ought not to depend upon battlefield improvisation.
It was in keeping that General Patton's invasion preparations should have included a study of the supply routes used by William the Conqueror in 1066, on the curious assumption that roads that had proved capable of carrying carts in the eleventh century would be sufficiently robust to cope with heavy armour nine centuries later.67 History does not provide exact road maps for changing times.

The key to victory in Normandy was massive Allied superiority in air power. Heavy bombers battered shore defences, while more mobile fighter-bombers were used as aerial artillery to target German reinforcements. Yet this too was a risky strategy. In the crucial early stages of the campaign, air power had to be applied from bases in England, 150 kilometres away.68 Effective bombing also depended upon favourable weather conditions, which on D-Day were lacking. Heavy cloud caused the bombers to miss Omaha Beach altogether, leaving undamaged German shore batteries to pound American troops. In 1944 there were no smart bombs. The Royal Air Force's workhorse bomber, the Lancaster, cruised five kilometres above the ground at a speed of five kilometres a minute. As a result of just a ten-second error in timing, the Lancasters missed the German coastal fortifications at Merville, overlooking Sword Beach, by eight hundred metres. Those guns were silenced only by crash-landing three gliders full of British assault troops directly on top of the emplacement.

The risk from bad weather in Normandy has been consistently underestimated. It is well known that Eisenhower had access to superior meteorological forecasting, so that the Allies expected that the weather would change on D-Day for long enough for them to get ashore (although nobody seems to have realized that storm tides would reduce the size of the beaches and so increase the potential for chaos on the landing grounds). However, predicting the weather is not the same as controlling the elements. In fact, weather forecasters had little warning of the great storm that blew up on 19 June (D plus 13, as the superstitious duly noted), interrupting cross-Channel communication for three days and seriously damaging the prefabricated Mulberry harbours, through which supplies were unloaded. The Allied advance was halted for lack of ammunition, and air cover was almost abandoned for four days, giving the Germans the opportunity to regroup. At this point, Eisenhower believed the success of the whole operation was in the balance.
True, the weather was abnormally bad in June 1944, but it is never safe to depend upon the English summer. D-Day, then, was something more than a dramatic episode along a predestined Road to Victory. So towering has become the historical landmark that it is easy to miss its significance. It is significant that the Allies chose to mount their invasion at a time and a place where they might easily have failed, and it is mightily significant that they did not fail.

Even in outline, this survey of the Second World War may suggest that analysis of significance offers as fruitful an approach to historical discussion as can be found through pursuing the mirage of explanation. The British and French decided to declare war in September 1939 because they saw Hitler as a threat to their interests and security. We can illuminate that decision by tracing the way in which one particular past future, a Hitlerite pattern of unrelenting conquest, had become dominant – but that does not tell us 'why when?' Ultimately, however, the proof that Britain and France had come to believe that Hitler could only be stopped by a declaration of war has to be deduced from the fact that war was indeed declared at that time, evidence which is practically compelling but intellectually unsatisfactory.

Hypothetical past futures make sense of other key decisions. The Japanese attacked Pearl Harbor in December 1941 because they believed confrontation with the United States was inevitable and they might as well swallow their pill: indeed, they expected to run out of oil within a year.69 Hitler discounted the possibility that he would feel the might and determination of the Americans when he formally declared war on them in a pointless act of solidarity. Eisenhower's invasion options, on the other hand, were steadily narrowed. Stalin and Roosevelt had combined to insist on a direct cross-Channel assault, brushing aside Churchill's preference to fight through southern Europe. Thereafter the Normandy project acquired a momentum of its own, and the decision on 5 June 1944 was not so much about a determination to go as a realization that it was well-nigh impossible to stop. There was luck, as well as compulsion, in Eisenhower's muttered resolve to proceed: the next feasible invasion date coincided with the terrible storm of 19 June.
Discussion of significance establishes the importance of an event by comparison with likely alternatives. Had D-Day failed, and in the year of a presidential election, political pressure in the United States for an 'Asia First' strategy would certainly have grown, the more so if pre-invasion tensions with their British allies had erupted into post-catastrophe recriminations. Failure in Normandy in June 1944 might well have left the Western Allies as peripheral spectators in a Europe dominated by Stalin's Russia since, even if successful, a second invasion could hardly have been more than a tailpiece to the grinding Russian westward advance. Even worse, Stalin might have patched up the sort of peace that George Orwell wrote about in Nineteen Eighty-Four, enabling the Nazis to switch their remaining resources against the democracies. Moreover, it is neither difficult nor pleasant to imagine the implications for Canadian national unity had the Régiment de la Chaudière been cut to pieces on Juno Beach in a second and larger Dieppe fiasco.

Analysis of significance operates in another and more profound way, locating events in time in relation one to another. It was and it remains significant that war broke out in Europe before Hitler had amassed an arsenal of submarines and rockets sufficient to destroy the democracies, significant that Britain remained undefeated when Germany attacked Russia, significant that the outbreak of the Pacific war did not coincide with the Battle of Britain. Significance, then, is a device that enables us to locate events in time in relation to each other. Can it be extended to enable us to anchor the present to the past and so locate ourselves in the sweep of time?

To tackle that question, we must decide whether our present (if such a transient concept has any worthwhile independent existence) has disconnected itself from the Second World War in the way that it seems to have divorced itself from the era of the Roman Empire or the Black Death. 'We no longer live in a postwar world,' one historian announced in 1999.70 If periods of time can be consigned to sealed compartments in this way, analysis based on the concept of significance would be as sterile as speculation aimed at establishing ultimate causation. The Second World War undoubtedly overshadowed the twenty-five years after 1945, but half a century later it becomes harder to identify any dominant elements in the contemporary world that are indubitably derived from the war against Hitler.
The immediate legacy included the notion of a 'post-war' world of freedom and justice, tacitly supporting the assumption that the Western world had arrived at a logical culmination of its journey through time. For forty years, the division of Europe from the Baltic to the Adriatic was as taken for granted as the arrival of a Scottish king on the throne of England. Without a successful D-Day, it may be doubted whether the Red Army could have been prevented from advancing at least as far west as Antwerp, where its uncomfortable proximity would have had massive implications for the post-war British political agenda. The entire mental concept of 'Europe' shrank as the western half of the continent moved toward economic and political union based on a core alliance between two countries with a roughly equal population, France and a truncated West Germany. The collapse of the Soviet empire after 1989 posed a major problem for the European Union: how to expand eastwards while accelerating internal centralization. The practical issue was large enough, but it was made all the more puzzling by the sudden explosion to the east of the mental frontiers of the 'Europe' of 1945. It was as if eighteenth-century Britain had suddenly realized that it was no longer Protestant, or seventeenth-century England had discovered that the king imported from Edinburgh was a Hindu raja in disguise.

The fact that the rediscovery of eastern Europe in the nineteen-nineties marked the final eclipse of the officially designated post-war world does not justify an anti-historical relegation of the Second World War to the realm of 'a long time ago.' If anything, the closing years of the twentieth century point to the need to emphasize that numeric adjective: the significance of the Second World War increasingly lies in the fact that it was a sequel to the earlier conflict, the War of 1914. A surprising number of the regional problems of the last two decades of the twentieth century can be related to a seed-bed period from its first two decades. In Europe, the disintegration of Yugoslavia and the problem of Northern Ireland each derive from political agendas set between 1912 and 1922. Both Russia and China have re-established themselves on paths of modernization launched in a period of revolutions between 1905 and 1917. The South African white settler state of 1910 had run its course by 1992. Canada, which rejected free trade with the United States in 1911, accepted continental economic integration in 1988.
Of course, these may be selective examples of accidental similarity, although a pattern that accommodates both China and Russia can claim to be an impressively large coincidence. A more serious objection would be that none of these problems or processes began in 1911. There was nothing new in the tensions between Christians and Muslims in the Balkans, nor between Catholics and Protestants in Ireland, while common sense alone will suggest that it is unlikely that political revolution in countries as vast as Russia and China bubbled up from superficial, short-term causes. White supremacy was already well entrenched in South Africa. Reciprocity had long been a battle cry in the politics of Canada. This is not to dismiss the Second World War as a meaningless tale of sound and fury, but rather to insist on the relationship between significance and location in the long sweep of time.

The significance of any historical episode relates to its own past and to its perceived future. It may also relate to our present, and as that ephemeral moving line advances through time, events that seem recent may recede in importance, while episodes that we have thought of as distant suddenly move close and so acquire huge contemporary significance. As a moral sentiment, the late-twentieth-century orthodoxy that the First World War was an appalling waste of human life cannot be faulted. Merely because a few yards of mud in the Somme valley are of no importance to us, we are not entitled to deny that the battle formed part of a war of enormous significance at the time to the combatants, and to those, like William Coaker in distant Newfoundland, who shouldered the responsibility of supporting the troops in the trenches. By contrast, events forgotten by white people have re-emerged from a long-ago past to dominate the agenda of aboriginal politics in Canada, and in Australia and New Zealand as well. One wonders how many lost pasts are lurking out there, awaiting rediscovery to illuminate, or to haunt, our rapidly changing present.

The concept of significance is one way of reminding ourselves that because the past is inherent in the present, the continuum of time remains a seamless garment. We must also be prepared for the shock that it will prove to be a coat of many colours. Significance offers a way of signposting the past, of inserting a scaffolding into the welter of events that permits us to form an interpretation of history, without which we cannot locate ourselves in time.
It must be firmly distinguished both from the portentous and the trivial, from the hands of the clock that so impressed Mackenzie King and that phantom pencil that perhaps was wielded by Sir Hugh Dowding. Above all, the idea of significance enables us to focus a fresh lens on the past and appreciate that an event can be ‘so well known that its startling and dramatic nature is forgotten.’71


8 Objections, Review, and Tailpiece


As a book nears completion, it behoves an author to review its arguments and to ask how they may be received by critics. The reviewing of books is not the most charitable activity in academe, nor should it be. It does not help when authors refuse to accept some responsibility for the misunderstandings that arise from their own obfuscations.

On the face of it, the present work seems wide open to attack. Here, it will be said, is a professor who insists that historical explanation is impossible, that it is never possible to say with certainty that 'A caused B,' and who dismisses any search for underlying causes as outbreaks of such horrors as the Coldingham Fallacy and the Viking Syndrome. Yet, inconsistently, the work is replete with postulated causal connections and confident (reviewers will probably say 'opinionated') accounts that link those despised factors and allegedly doubtful causes to form explanations every bit as insecure as those he seems to take pleasure in denouncing. Furthermore, the author claims that history cannot even be attempted without in some way defining the ethical values by which we are animated while, at the same time, warning us against imposing the righteous stridency of our transient present upon the different morality of the past.

To escape from his own contradictions, the writer makes a desperate lunge towards the notion of locating events, urging us to establish their significance rather than engage in the impossible task of seeking their explanation. But how can we do the one without the other? If the significance of an event lies in its potential to influence other events, we are simply reversing the horse and the cart, arriving at 'A caused B' by a circuitous route. Is there some truth in the comment of the acerbic historian G.R. Elton that those who sought to switch enquiry from the origins of the First World War to its role in shaping the twentieth century were simply shifting 'from the causes of the war to war as a cause'?
In any case, while an approach that re-inserts lost futures might help us rearrange some major episodes in the more distant past into a neatly tagged pattern, what chance have we of assessing the significance of near-contemporary events when we have no way of knowing what is going to happen in a decade, or a year, or even a week from now? Far better to return to the serious business of archive-crunching, and hope that we may be lucky enough to emulate George B. Stow in his delightful and scholarly discovery that Richard II invented the pocket handkerchief. 'Be good, sweet maid, and let who can be clever,' was Charles Kingsley's advice to Victorian females.1 Be productive, sweet historians, and leave those foolhardy enough to ask what History is all about to tangle themselves in their own contradictions.

It is therefore politic to review the arguments in the light of the probability of scepticism. The first and most basic case against the sufficiency of historical explanation is that we simply do not know, cannot know, will never know, everything that happened in the past. A major contributory reason why this should be so is that the participants themselves usually provided only a partial account of events – 'partial' not just in the sense of often serving their own ends. Combine inadequacies of observation with opacity of recollection plus the uncertainty of transmission, and the challenge of amassing comprehensive evidence becomes virtually insuperable. This inconvenient problem alone is enough to destroy any theory of explanation, whether it postulates complete narrative of antecedent events or assumes an ability to discern the causally pertinent from the mere dross. Of course, from time to time, new evidence does come to light and can dramatically change our perspective on the past. Such discoveries are superficially reassuring – until we remember the vast iceberg of the unknown that remains forever hidden.

Thus we may summarize to this point. We cannot know everything that happened in the past. The discovery of fresh evidence may illuminate a few mysteries and discredit some interpretations, but it will never tell the whole story. If, by definition, the origin of an episode lies in some or all events antecedent to that episode, the obstacle that we cannot reconstruct that totality is sufficient to rob us of any pretension to certainty in causal explanation.
In any case, even if we could recover every single incident from the past, we have no sure way of establishing which, if any, of them exercised the crucial leverage that brought about the events we seek to comprehend. The only 'scientific' way to establish that 'A caused B' would be to replay the events preceding B, omitting A altogether. It is our scholarly misfortune that we cannot recapture the past and submit it to such laboratory examination. Hence, historical explanation is impossible because it can never be complete, either in the recovery of evidence or in the scrutiny of causes.

To say this is not to deny that events have causes. It is rather to recognize that there may be uncountable numbers of contributing elements shaping each and every event. We may dimly discern some of them, but in the last resort we have no way of proving our guesses. Of course, this does not prevent historians, and humanity at large, from engaging in attempts at partial explanation. For practical purposes, some of these incomplete forms are as socially useful as they are unquestionably emotionally necessary – including some of the most basic of 'A caused B' equations. Taylor reminds us that there is a distinction between historical explanation and court-room proof. We are safe in concluding that Princip's bullet almost certainly caused the death of the Archduke Franz Ferdinand on 28 June 1914, but we must draw back from proclaiming that the very same bullet also caused the outbreak of the First World War. Establishing a causal connection in the first instance is a natural human response to the need to make sense of the world around us. Imposing a causal hypothesis in the second reflects rather the desire of historians to present themselves as 'experts' on matters where no one can possibly claim certainty. Better by far to locate an event in time than to pretend to elucidate causation. 'The shooting at Sarajevo can be judged as the beginning of the First World War without also being judged its cause.'2

One historian who came tantalizingly close to appreciating the impossibility of causal explanation was G.R. Elton. Perhaps it was the trenchancy of his own prose, an Eltonian trademark, that disguised that final step from its author. Elton recognized the possibility that because 'no two things in history are ever alike ...
generalization becomes impossible, explanation ceases, and the historian is reduced to a mindless description of a meaningless sequence of events.' In practice, however, even the most cautiously narrative scholar edges towards causal hypotheses that are implicitly derived from a form of statistical observation of loosely related episodes. However, these dimly perceived 'laws' convey multiple possibilities noted from the outcomes of apparently similar examples. ('Historical causation differs completely from the movement of billiard balls,' Elton observes.) The example he gives is a rapid rise in population, which, he says, may variously lead to emigration, increased domestic productivity, or poverty.

Elton's use of a population explosion merits comment because it is reminiscent of the Viking Syndrome, although in his case it is open-ended as to possible results. First, to take the examples he has in mind, from eighteenth- and nineteenth-century British and Irish history, it is by no means clear-cut that these three outcomes were distinct. Both countries experienced emigration and increase in productivity and poverty, if in different areas and certainly in varying combinations. Nor are the outcomes necessarily related to pressure of population: the most dramatic exodus from Ireland occurred as the Great Famine of 1845–9 scythed through the people. To assume that 'poverty causes emigration' is as simplistic as to argue that 'poverty causes crime': historians of migration have inconveniently pointed out that somebody had to find the money for the ticket, and, in an era dominated by private enterprise, that somebody was usually the migrant.

However, perhaps most noteworthy of all is that Elton's example is implicitly counter-factual. A historian may feel entitled to argue that a population explosion predominantly brought about, say, poverty, and come up with reasons why this seems to have been so. In that case, the significance of the decline into poverty lies in the negative comparison: what impeded emigration, what prevented the growth of industry or the improvement of agriculture? Elton accepts that a population increase alone cannot explain anything since it is what he calls a 'situational' cause, a categorization similar to Arthur Lower's analysis of long-term and short-term elements in an explanation. As Elton says, 'situational causes cannot by themselves result in any particular event at all ...
[since] they are capable of producing a very large range of possible results.' Thus, 'the real problem' remains 'not what part do laws play in historical causes, but how may one account for the event on the basis of empirical enquiry.' The historian (assumed here to be male) may plump for one explanation among the many possible options, 'but unless he can demonstrate it from evidence he must describe it for what it is – a guess.' This is where we came in, with Ralph Bennett demanding 'an austere reserve' never to pass off inference as certainty.

How does Elton escape from this blind alley? He argues the need for 'proof,' taking us half-way back to that court-room procedure that A.J.P. Taylor pointed out was so different from historical methodology. What constitutes proof? 'The rigorous answer is that an explanation in history is proved if it can be demonstrated, from historical evidence, that a given cause A exercised influence in producing effect B.' Masterful pronouncement may scout the problem, but it does not solve it. Elton himself promptly admits that 'these standards of rigor need relaxing a trifle.'3 Historians may feel triumphantly certain that their arguments prove their theories, just as in the seventeenth century John Lightfoot was so zestfully sure that Adam and Eve must have run into trouble with their divine landlord during the month of September. But the most confident of causal deductions can turn out to be mistaken. 'We all make inferences daily,' writes the historian Robin Winks, 'and we all collect, sift, evaluate, and then act upon evidence.' Winks himself recalled how, on being appointed to a professorship at Yale, he equipped himself for the New England winter by throwing a snow shovel into the trunk of his automobile. Consequently he was untroubled by the loud rattling noise that accompanied his various journeys around the streets of New Haven. It was not until silence returned at precisely the point when the gas tank fell off that he began to question the confidence of his own inference.4 Even at the gas-tank and snow-shovel level of explanation, we ought to remember that it is just possible that the archduke died of a simultaneous heart attack, even though the odds against such an exact coincidence must be astronomical.

To brand historical explanation as futile is not to deny that it is constantly hazarded, nor even to argue against making the attempt. Indeed, historians might command more public respect if we could adopt the public image of daredevils attempting the impossible. For Elton to claim that Martin Luther's attack on papal power caused the Reformation simply because it was followed by the Reformation is not proof at all, as he so roundly announces, but merely an example (admittedly a highly plausible one) of post hoc, ergo propter hoc.
To go on, as Elton did, and claim that there would have been no Reformation had somebody else mounted the assault, or had Luther delayed until 1520, is to reduce the counter-factual approach to the level of mere assertion. It should be noted that Elton limits himself to claiming that it is possible to prove that 'a given cause A exercised influence in producing effect B' (emphasis added). Elsewhere he states that 'the essence of historical causation is multiplicity,' since 'none of the matters to be explained by the historian ever derives from a single identifiable cause.'5 In other words, historians cannot prove causal connections, but can only offer identifications of probable proximity. What poses as historical explanation is in reality nothing more than the location of events in time. We ought to be satisfied with that.

We can come at the problem from another angle, by recognizing that there is indeed one basic form of historical explanation that tells us everything about human actions while, at the same time, unluckily telling us really nothing at all. It is the general principle that all events involving people come about as the result of decisions taken by somebody, either individually or as part of some formal or accidental collectivity. Poland was invaded in 1939 because Hitler decided that it should be invaded. Britain and France went to war because their governments decided that the invasion of Poland constituted a casus belli. This form of explanation helps to locate events in a pattern of time, but it does not and cannot tell us why Hitler on the one hand and the democracies on the other said 'yes' to propositions involving war when, in bloodless, abstract theory, they might equally well have said 'no.'

A philosopher who examined the debate on the causes of the Second World War between A.J.P. Taylor and his critics came to an interesting conclusion: the participants seemed 'to be working with, not one, but a number of rather different notions of what makes a necessary condition a specifically causal one.'6 In other words, historians who thought they would not have declared war upon Hitler in September 1939 denied that the invasion of Poland was the cause of the Second World War, because the response did not seem justified to them. To overlook the more pertinent point that it did seem so to Neville Chamberlain and, so far as we can tell, most of the British people is, to put it mildly, narcissistic, if not downright arrogant.
The standard historical approach to a decision is to assume that it is a structure, built brick by brick until it is roofed over by a measured conclusion for action. The argument of the present work is that most decisions are not like that at all. They occur to people in a flash. That instantaneous moment may be remembered (although, more often than not, it passes unnoticed) and even rationalized, but it can never be entirely recalled and so genuinely explained. Yet this is not to say that people's decisions are always entirely irrational. True, in a political system that compels mass participation, a minority may indeed make choices on a basis that defies comprehension. Democratic but authoritarian Australia not only operates a system of proportional representation based on the single transferable vote, but compels its citizens to vote, thereby securing a turn-out of well over 90 per cent. In the 1961 general election, the right-wing government of Robert Menzies survived by a hair's breadth when it held on to the Queensland electorate of Moreton – thanks to the second-preference votes of the Communist candidate. It is possible that some Moreton voters decided that they wanted Red Revolution first and Pig-Iron Bob second, but it is a lot more likely that they failed to grasp the complexity of the system and simply listed candidates in the order they appeared on the ballot.7

In Canada, which does not drive its citizens to the polls, a participation rate of 90 per cent is almost unthinkable. It is a simple matter of respect for the country's voluntary democracy to assume that the 93.52 per cent turnout in the 1995 Quebec referendum reflected the considered judgment of five million voters on the future of the province. After twenty years of public debate, including one previous referendum in 1980 at which the four-fifths of Quebecers who are francophones had divided almost exactly on the issue, the arguments for and against could be said to be clear, with incentives and drawbacks on both sides of the case. Yet the margin of the outcome was wafer-thin: a 'No' majority of 54,000 votes, barely 1 per cent of those taking part.8 We must accept that the result represented a collage of millions of individual rational decisions. What eludes us is any real understanding of the mechanism by which each of those five million individuals perceived one set of arguments to be more persuasive than the other. The problem can be personalized: why did Pierre-Marc Johnson campaign for an independent Quebec, while his brother Daniel upheld the federalist cause?
If two intelligent and articulate siblings could reach totally logical but entirely opposed conclusions on such a key question, what chance have historians of deducing, or imposing, any convincing explanation of rational cause on the basis of the choices people make? Once again, the search for explanation gives place to the attempt to locate events in time. The rationality of people’s decisions depends, at least in part, upon their perception of what is to come. One task for historians is to seek to recover those myriad futures that have been trodden underfoot in the historical record by the future that actually materialized. For that exercise, we need to categorize different forms of
perception of the future and identify who subscribed to which of them. Some historical personages, like Cardinal Newman, were content simply to drift with an unseen tide. At the other extreme, terrorists and ideologues believe that they own and thus control the future, however much they be isolated and reviled in the present. In between, the mass of people probably oscillate between optimistic and pessimistic attitudes to their destinies. Most intelligent men and women are capable of expressing the deepest of gloom about the world that we are heading for, twenty or thirty years hence. Yet somehow our doomsday fears do not prevent us from taking out mortgages and subscribing to pension funds. The operative future that helps frame human decisions may loom in the long term at one moment and intimidate by its proximity at the next.

Once we have factored those vanished past futures back into the historical picture, we can start to confront other questions about the subjective quality of time. Are we close to the past, or remote from it? Does the past sometimes double back to catch us unawares, just as Churchill insisted when he denied Lord Linlithgow’s claim that the British were on a one-way road towards withdrawal from India? Two fundamental issues arise from such a discussion. One relates to our ethical standpoint: can we make assessments of the past without imposing our own values upon it? If it is indeed impossible to separate mechanical interpretations of history from moral judgments, should we be more open about the beliefs through which we filter our verdicts? ‘The murderous analysis to which we subject the notions of our ancestors is suitably matched by the complacency with which we accept our own at their portentous face value.’ The complaint was made by Donald Creighton in an address to the Canadian Historical Association in 1950.
His target was Canada’s Liberal and bureaucratic establishment, for whom ‘an idea of the past which is unnoticed or unpopular in the present is regarded, not only as a poor idea, but also as virtually no idea at all.’9 Half a century later, Creighton’s alternative view of his country is certainly invisible and discounted, but the Canada of which he was beginning to despair also seems unacceptably narrow to the modern mind. Abortion was illegal, not a single woman sat in the House of Commons, and the political system still treated Aboriginal people as children. Creighton’s complaint was particularly germane to the study of Canadian history because, in its European
phase, the Canadian past (except in Quebec) seemed then to be almost entirely recent in quality. For historians of the older societies of Europe itself, the problem is to identify that grey zone of perhaps two hundred years, that period of transition towards the modern world, upon which we may legitimately begin to let loose the superiority of our own opinions. There is no point in criticizing Christopher Codrington, who did after all build one of Oxford’s most beautiful libraries, for profiting from West Indian slavery, since when Codrington died in 1710 hardly anybody had queried the morality of owning black people. But when it comes to William Gladstone (whose first parliamentary speech was made during the debates on abolition in 1833), defending his family’s record as slaveowners in Guyana, can the historian avoid at least a suggestion of superiority towards a man who would build his own later career upon the fearful power of his own morality?

The second major issue that arises from posing the question ‘what is a long time in history?’ relates to the nature and definition of the present. In 1994, J.T. Saywell published a brisk overview of Canadian history entitled Pathways to the Present. The title implied the discrete existence of a presumably identifiable band of time called ‘the present,’ and Saywell duly ended his study with a short epilogue called ‘Where Are We?’10 Yet Saywell did not succeed in defining his present. Had it begun with the election of the Chrétien federal ministry in 1993, or with the irruption of the Lévesque separatist government in Quebec in 1976? Was it a continuing present that still exists in some essence and form? More to the point, in an increasingly ‘pastless,’ here-and-now Canada, was there any relationship at all between the present and the country’s history? As the moving line of the Canadian present passed into 1995, Peter C.
Newman was denying that any organic connection stretched back even a decade. Back in 1966, Hayden White had written of ‘the radical dissimilarity of our present to all past situations,’ an attitude reflected in his disparaging allusion to history as an ‘incubus.’11 It is a short step from such a dismissal to the assumption that all pasts were identical, that our forebears lived in a black-and-white world illuminated only by the burning of witches until John F. Kennedy, Pierre Trudeau, and the Beatles brought us enlightenment on technicolour videotape sometime around 1960. Even if we seek to use the past merely as a mirror in which we
may preen ourselves in our own superiority, it would at least help to possess sufficient historical sensitivity to appreciate that there is nothing new about radical dissimilarity between different epochs of time. Forty years on, the notion that the nineteen-sixties had liberated themselves from the past seems faintly comic.

And so from a starting point of recognizing the impossibility of explanation, we come to an alternative conception of the study of history, the location of events in time, the identification of their relationship to each other and to ourselves. For that endeavour to make sense, we must not only abandon any notion of the superiority of the present. Rather, we have to grasp that it is the present that has no independent existence. That present to which Saywell traced pathways as recently as 1994 can now be seen as not even a resting point on the voyage through history, because that journey never pauses and will never reach its end. The argument here is that we can identify those interconnections by interrogating the notion of significance. Precisely because each succeeding present is merely provisional, nothing more than a moving line between past and future, that concept will produce constantly changing perspectives, rediscoveries of aspects of the past that we had forgotten or discounted. There is, in short, a multiple challenge here: significance of what, to what, and to whom?

In 1891, the Melbourne intellectual and politician Alfred Deakin described the formation of Australia’s first Labor parties as ‘more significant ... than the Crusades.’ From Deakin’s point of view, the mobilization of the working classes behind an ostensibly socialist movement was indeed more significant than a medieval confrontation between Christianity and Islam. Indeed, the Australian colonies operated immigration policies that kept out anyone from the part of the world where the Crusades had taken place.
The drama of a local shift in political power so impressed Deakin that he temporarily overlooked the fact that Australia’s link to Europe depended in large measure upon British control over the Suez Canal. He did not foresee that his country’s security would one day depend upon its relationship with millions of Muslims in Indonesia and the Arab world. Carl Becker once wrote that it was inevitable that the dramatic events of the human experience would ‘fade away into pale replicas of the original picture, for each succeeding generation losing, as they recede into a more distant past, some
significance that once was noted in them.’12 Deakin, presumably, was not denying that the Crusades had been a notable episode. He was simply saying that they were less notable for him than the events of 1891. Until we ask ourselves that simple question – what is a ‘long time’ in history? – we shall continue to confuse the portentous with the significant. Too often this means that we may be basing our sense of our location in time upon the superstructure rather than the foundations.

Perspectives can change. As this study was approaching completion, New York and Washington were subjected to terrorist attacks by acts of air piracy on September eleventh 2001. The rapid banishment of the term ‘crusade’ from the American vocabulary of retribution was testimony to the fact that there are people on this planet who do not share Alfred Deakin’s evaluation of the relative significance of cultural conflict in the seemingly distant mediaeval world. On a planet instantaneously linked together by television coverage, the emotional impact of those acts of mass destruction was profound and will surely prove to be enduring. Few will forget the pictures from New York, of the collapse of two massive and proud buildings, the inexorable crumbling of steel and glass into dust and rubble. It seems surprising now to recall that, in early September 2001, the United States had been widely regarded with some irritation among its friends around the world, as its new administration moved towards protectionism and withdrew from the Kyoto environmental agreement. The tragedy touched an underlying sense of kinship and alliance among the democracies.
Some of the images will surely rank among the most deeply moving that any of us will ever witness: in Canada, airports crowded with diverted aircraft and bewildered passengers; in London, the Guards band at Buckingham Palace playing ‘The Star-Spangled Banner’ as a solemn requiem; in Dublin, the slow and silent march of the city’s fire brigade to the American embassy expressing Ireland’s grief for Irish America.

Can the analysis proposed in these pages be applied to an episode that so shocked our complacent world? First, we may note that the assumed superiority of our values was profoundly shaken, largely because they have passed without scrutiny for so long. Not only was there, at best, only a fragile consensus in the Western world on how to respond, but, in the English-speaking world, there was a marked reluctance to adopt terminology redolent of crime and outrage. With a proper concern for definition, we may conclude that there were ‘rebellions’ in Canada in 1837–8, but no more than ‘disturbances’ at the Red River in 1869–70. By unspoken agreement, the corresponding term for the terrorist attacks has become, not the ‘massacre’ or the ‘crime,’ but the ‘events’ of September eleventh. To some extent, this speaks of dignified understatement, but it is also hopelessly inadequate as a vehicle for the outrage that we are entitled to express. There were many ‘events’ around the world on September eleventh 2001, just as there had been millions of ‘events’ on September tenth and would be many more of them on September twelfth. When pressed to definition, the religion of all sensible men and women produced only a muted, falsely neutral categorization.

Can we ‘explain’ those ‘events’ of September eleventh? Why did they happen? The horror of the carnage, the thought of several hundred ordinary airline passengers suddenly finding themselves facing death by fireball at the hands of suicidal fanatics, tends to pose the ‘why’ question in its ‘wherefore’ form. How could such evil exist, how could anybody feel the hatred that drove them to kill on such a scale and in such a way? But ‘wherefore’ questions are not the realm of the historian.

The problem is that September eleventh reminds us that the ethical aspect of historical judgments is inescapable. Suppose we telescope the causal arguments, avoiding the outright pitfall of the Coldingham Fallacy, but emulating the process that leads us to conclude that ‘slavery caused the American Civil War.’ Terrorists attacked the United States because they saw America as the prime supporter of their enemy, the state of Israel. Does it then follow that we are justified in offering the historical verdict that America was attacked because the United States supports Israel?
Many of us will feel, as citizens of the world, that the nature and extent of American support for Israel may not be in the long-term interests of either country. But for historians to pronounce that September eleventh happened because of American support for Israel comes dangerously close to implying that, in their terms at least, the terrorists were right to take the actions that killed so many people. In this case, historical explanation is not so much impossible as unacceptable. It is downright dangerous as well as plainly incorrect. The only acceptable causal explanation lies in the web of human
decision. September eleventh happened because nineteen men decided to take part in a plot that other, more shadowy terrorist godfathers had determined to plan and finance. The widespread belief that there was a twentieth hijacker who changed his mind at the last minute only confirms the optional, yes/no nature of those decisions to check in at airports armed and ready to attack. Of course, to say that September eleventh happened because nineteen men decided that it should happen does not take us very far down the road of explanation. Millions of people resent the United States for one reason or another: why should those nineteen and (let us hope) only those nineteen people decide to act as they did? There are no laws of logic or compulsion that can account for individual human decisions.

Because September eleventh was an act of collective suicide as well as of mass murder, a search for illuminating past futures may seem irrelevant. In the aftermath of the tragedy, there was some tendency to poke grimly contemptuous fun at the hijackers, portraying them as superstitious fools gulled by the promise of a place in Paradise. Historians are probably best advised to leave the controversy over motivation to psychologists and theologians, reflecting only that soldiers engage in suicide missions in all wars, the just as well as the tyrannical. It is likely that those who planned the attacks hoped for a response that would polarize the Islamic world against the United States. Thus far they seem to have been disappointed in seeing the realization of their hoped-for future, but we would do well to recall that we have no agreed measure for a long time in history.

Are we too steeped in our own sense of horror to begin to assess the significance of September eleventh 2001? Even if the judgments may be provisional and preliminary, some points of significance may be discerned.
An approach to history based upon the location of events in time may suggest that a single episode, however dramatic, is rarely a landmark in itself. Such an event is as likely to reveal that changes are taking place in the world as it is to cause them. For half a century, since the bombing of a Quebec Air flight in 1953, it ought to have been obvious that air travel was highly vulnerable to ruthless attack. Giant planes had been hijacked, bombed, and shot out of the sky, but the convenience of easy-access mass transportation had been tacitly allowed to outweigh the risks of an occasional tragedy. Indeed,
the unedifying wrangle in the American Congress over airport security suggested that it would take more than the destruction of the World Trade Center to bring about fundamental change in the prevailing culture of free enterprise and free movement. Therefore, in terms of the positive measure of significance, it may well be that a verdict on September eleventh 2001 in New York and Washington resembles that of the Chinese Communist on 1789 in Paris: it is simply too early to say.

However, if we approach significance in its negative, dog-in-the-night, sense, more striking conclusions may emerge. Airliners loaded with aviation fuel proved to be missiles capable of creating devastating fireballs, but at least September eleventh did not involve a nuclear strike or a germ-warfare attack, as some experts on terrorism had feared. However, we need to keep firmly in mind that we do not dwell at the end of history. We are still travelling, and worse horrors may lurk in the future.

It may also come to seem equally significant that the attack should have come from the Arab and Islamic world, rather than from other parts of the world where despair and hatred are endemic. Islam is fundamentally a religion of peace and resignation. As with all mass ideologies, it is capable of supporting a warrior fringe – the religion of the Sermon on the Mount has a spectacular history of commitment to the God of Battles, and even Mahatma Gandhi envisaged circumstances in which ‘when there is only a choice between cowardice and violence, I would advise violence.’13 September eleventh generated widespread incomprehension and hostility towards Islam across the Western world, yet there is no reason to assume that it is endemic in the religion of the Prophet to breed and incite martyrs. Furthermore, the oil-rich countries of the Middle East have the capacity to create a prosperous secular lifestyle for most of their people.
It is striking that the air pirates were men of some education and training: such people, we should like to think, do not readily sacrifice their hopes of comfort and career. It is otherwise with the desperate of the world, as has been demonstrated by the stream of young suicide bombers from the refugee camps of Palestine. September eleventh briefly focused the attention of the West upon Islam and the countries of southwestern Asia. It may be that the dog that has yet to bark is to be found in sub-Saharan Africa. In that region of the world, not only do whole nations live in
hopeless poverty, but millions of their people are infected with HIV and, literally, have no future to which they can look forward.

September eleventh was followed by a combined Western military campaign in Afghanistan aimed at destroying both the terrorists and the regime that had harboured them. It was a reasonable inference, of an ‘A caused B’ variety, to regard the Afghan operation as the result of the attacks on New York and Washington. Some might argue that September eleventh was rather the pretext, or at least the occasion, for a move against a destabilizing regime; others will question whether the Afghans were the sole source of the problem. However, in its initial stages at least, the operation seemed to be highly successful, even if historians of both the British and Soviet Empires will warn that prolonged military involvement in Afghanistan is a hazardous business. As the moving line of the ephemeral present swept onward, the focus of world concern shifted also, to renewed conflict between Israel and the Palestinians, and to the threat posed by the dictatorship in Iraq.

Reviewing events in March 2002, a leading British commentator, John Simpson, delivered a devastating verdict – perhaps the first attempt to locate in time the murderous assault against America. ‘The only country that now fails to realise that the terrorist attacks of September 11 are now as much a part of history as the blowing up of the King David Hotel in Jerusalem is the United States.’14 The sentiment is almost shocking: can it really be that a tragedy that remains so fresh in our memories is now something that happened ‘a long time ago’? But the contextual allusion is fuzzy: what happened at the King David Hotel in Jerusalem, and when? The full force of Simpson’s point evidently lies in the comparison that he selected.
In the aftermath of the Second World War, the much-battered British retained the mandate over Palestine granted to them by the League of Nations a quarter of a century earlier. Throughout their three-decade occupation of Palestine, the British had tried to hang on to control in the continuing present by pretending to ignore the inevitable future. ‘No one could say what the eventual aim was,’ recalled one official who helped run the country in the nineteen-thirties.15 Insofar as there was a grand plan after 1945, it vaguely foresaw the eventual creation of an independent Palestinian state in which the indigenous Arab population would constitute the majority. The British attempted to balance Zionist
aspirations by providing for limited Jewish immigration, but neither a Zionist state nor a Jewish majority was contemplated. On 22 July 1946, a small Zionist terror group, the Irgun, bombed the British administrative headquarters in Jerusalem, the King David Hotel, killing ninety-two people. The outrage that followed is familiar. ‘The immediate duty is plain,’ a London newspaper declared; ‘– to uproot and destroy terrorism.’ Two years later, the British abandoned Palestine, having found themselves incapable of controlling the territory and, perhaps crucially, unable to agree upon a common policy with the United States. It has been called ‘an act of abdication for which there is no imperial precedent.’

The relationship between the explosion at the King David Hotel, the eventual scuttle of the British, and the bloody emergence of the state of Israel lies beyond this account, but for decades it was generally held that the Irgun was little more than a minor element in a turbulent story. After all, Israel’s first prime minister, David Ben Gurion, had condemned the attack as a ‘dastardly crime committed by a gang of desperadoes.’ The Irgun leader, who publicly acknowledged his responsibility for the mass murder, entered politics and for years failed to win more than fringe support even in Israel’s fragmented multiparty culture.16 The force of Simpson’s comparison comes home when that Irgun leader is named. He was Menachem Begin. In 1977, thirty-one years after the bodies had been pulled from the rubble of that Jerusalem hotel, Begin at last achieved power at the head of a coalition of right-wing parties.
His election as prime minister of Israel coincided with an Egyptian decision to liquidate its feud with the Zionist state – it can hardly be said to have caused the shift in Sadat’s attitude – so that, a year later, Begin found himself shaking hands with the Egyptian leader at Camp David and sharing with him the Nobel Peace Prize. As a citizen, I hope that those behind the atrocities of September eleventh 2001 are never welcomed to the White House. As a historian, I cannot be so confident. I am, reluctantly, sure that, if human civilization endures, a time will come when historians will be as indifferent to the anger directed against that atrocity as historians are patronizing today towards those Ontario Protestants who demanded the execution of Louis Riel in 1885. It is almost forty years since J.H. Plumb bemoaned the proliferation of specialized historical research, ‘making detail livid, but blurring the
outlines of the story of mankind, and rendering it almost impossible for a professional historian to venture with confidence beyond his immediate province.’17 More recently, J.L. Granatstein has lampooned the archetypal research topic in Canadian history as ‘Housemaid’s Knee in Belleville in the 1890s.’ Critics have protested that housemaids are as entitled as kings and queens to have their past explored, and there is much to be said for both camps.18 Perhaps the real worry is the increasing likelihood that every single housemaid will command her own doctoral dissertation. It is not so much specialization in research that is the real threat to a historical overview as the sheer torrent of publication in every field that deters specialists from peering over higher and higher fences. Even if we lack adequate archival recall to recreate the past as a seamless web, we may at least enjoy the discovery of Richard II’s pocket handkerchief. We may not follow Plumb in assuming the male gender of the historian, but we ought to share his regret that history as it is at present conceived tends to confine its practitioners to short periods of time and highly specific areas of expertise. Gaillard Lapsley, teaching at Cambridge a century ago, now seems positively broad in his assertion that he ‘only went down to 1485.’

In 1966, Hayden White suggested that the study of history might itself be something best comprehended by location in time, urging us to consider ‘the notion that history, as currently conceived, is a kind of historical accident, a product of a specific historical situation.’ He foreshadowed the possibility that in different circumstances, the subject might ‘lose its status as an autonomous and self-authenticating mode of thought.’19 Historians have a responsibility to defend their calling and preserve its voice in educated discourse.
In asserting the impossibility of explanation, we need not contest the human desire to strive for at least an incomplete comprehension of cause and effect. Far better than any pretence to omniscience is admission of the inadequacy of our understanding. It is not that we have been asking the wrong questions, but rather that we have yet to pose all the right ones. In the ultimate sense, we cannot explain the past. Rather, we can, and we must, attempt to locate events in the sweep of history, and so assert the impermanence of the present in order to locate ourselves in time. That way, we may replace our craft in its rightful position at the centre of all studies of the world around us.


Notes

Note: Where an edition, such as a paperback, other than the first has been used, this is indicated by the addition of ‘ed.’ after the date of publication.

1. Redefining History at the Centre of Debate

1 R.F. Bennett, Intelligence Investigations: How Ultra Changed History. The Collected Papers of Ralph Bennett (London, 1996), 16.
2 Among the readable texts on historical methodology are E.H. Carr, What Is History? (Harmondsworth, 1964 ed.); David Hackett Fischer, Historians’ Fallacies: Toward a Logic of Historical Thought (New York, 1970); J.H. Hexter, Doing History (London, 1971) and The History Primer (London, 1972 ed.); G.R. Elton, The Practice of History (Sydney, 1967); Arthur Marwick, The Nature of History (London, 1973 ed.); David Lowenthal, The Past Is a Foreign Country (Cambridge, 1995); Barbara Tuchman, Practising History: Selected Essays (London, 1983 ed.); and Fernand Braudel (trans. Sarah Matthews), On History (London, 1980 ed.). Elton elaborated his views on explanation in his Political History: Principles and Practice (London, 1970), esp. 112–55: his interpretation is reviewed in chapter 8. The analogy between History and everyday life was amusingly made by Carl Becker in ‘Everyman His Own Historian,’ American Historical Review 37 (1932): 221–36. Conclusions largely opposed to the arguments of the present work may be found in the 1966 essay by Hayden White, ‘The Burden of History’ in White, Tropics of Discourse: Essays in Cultural Criticism (Baltimore, 1978), esp. 43. This note cannot pretend to offer a complete guide to the literature of historiography, nor does it mention every work that has sparked these reflections. Useful starting points for recent debates include John Tosh, The Pursuit of History (London, 1984); Keith Jenkins, Re-thinking History (London, 1991) and On
‘What Is History?’: From Carr and Elton to Rorty and White (London, 1995); Alan Munslow, Deconstructing History (London, 1997); and Geoffrey Roberts, ed., The History and Narrative Reader (London, 2001).
3 R.G. Collingwood, The Idea of History (Oxford, 1961 ed.), 8.
4 Oliver Lyttelton, The Memoirs of Lord Chandos (London, 1992), 22–3. The historian was Gaillard Lapsley, an American who ‘had become the very personification of a Cambridge don.’

2. History versus the Past

1 J.R. Green, A Short History of England (rev. ed., London, 1929), 122. For the unreliable basis of evidence in this condemnation, see W.L. Warren, King John (Harmondsworth, 1966 ed.), 17–31; and C. Becker, ‘Everyman His Own Historian,’ American Historical Review 37 (1932): 232; and cf. D.H. Fischer, Historians’ Fallacies (New York, 1970), 6.
2 Alan Bullock, Hitler: A Study in Tyranny (Harmondsworth, 1962 ed.), 119–20; Denis Smith, Rogue Tory: The Life and Legend of John G. Diefenbaker (Toronto, 1995), 283.
3 G.R. Quaife, Wanton Wenches and Wayward Wives (London, 1979), 53.
4 Philosophers will blench at this naively confident use of the verb ‘to know,’ since it is their business to dissect the concept of knowledge. I am encouraged by the words of Geoffrey Roberts: ‘Much of the epistemological critique of narrative history misses the point that, for historians, the world is a particular, ontologically given object of inquiry with problems of knowledge which are quite ordinary and everyday’ (G. Roberts, ed., The History and Narrative Reader [London, 2001], 435). I take this to mean that, for practical purposes, historians can ‘know’ that the First World War began in the summer of 1914. An American might point out that the war did not ‘begin’ for the United States for another three years, but it would take a highly blinkered form of isolationism to deny that such a conflict was taking place elsewhere.
An Australian would offer a reservation about the equation between August and summer. Thus, even though so basic a historical ‘fact’ may be open to the perspective of individual observation, Roberts encourages us to believe that we know it to be a fact all the same.
5 Ged Martin, ‘The Canadian Rebellion Losses Bill of 1849 in British Politics,’ Journal of Imperial and Commonwealth History 6 (1977): 61–93.
6 J.A. La Nauze, Alfred Deakin: A Biography (2 vols, Melbourne, 1965), 2: 629, 572.
7 Theodore H. White, The Making of the President 1960 (London, 1964 ed.), 18, 350; White, Breach of Faith: The Fall of Richard Nixon (New York, 1976 ed.), 96.

8 Robert Bothwell and W. Kilbourn, C.D. Howe: A Biography (Toronto, 1979), 9; Peter Calvocoressi, G. Wint, and J. Pritchard, The Penguin History of the Second World War (Harmondsworth, 1999 ed.), xv.
9 Robert Blake, Disraeli (London, 1969 ed.), 237–9; Joseph Pope, Correspondence of Sir John Macdonald (Garden City, NY, 1921 ed.), xxiii; J.L. Granatstein, Canada’s War: The Politics of the Mackenzie King Government, 1939–1945 (Toronto, 1975), 355–6.
10 D.G. Creighton, John A. Macdonald: The Young Politician (Toronto, 1965 ed.), 482; P.B. Waite, The Man from Halifax: Sir John Thompson Prime Minister (Toronto, 1985), vii–viii; Keith Sinclair, Walter Nash (Auckland, 1976), v. Macdonald’s widow doubted whether he ‘made any further reference whatever’ to the papers he had collected. Joseph Pope, Memoirs of the Right Honourable Sir John Alexander Macdonald (2 vols, Ottawa, 1894), vii.
11 J.M.S. Careless, Brown of the Globe: I, The Voice of Upper Canada, 1818–1859 (Toronto, 1959), vii–viii; Donald R. Beer, Sir Allan Napier MacNab (Hamilton, Ont., 1984), 471; G.E. Marindin, ed., Letters of Frederic Lord Blachford Under Secretary of State for the Colonies, 1860–1871 (London, 1896), 5–6; C.P. Stacey, A Very Double Life: The Private World of Mackenzie King (Toronto, 1977 ed.), 10–13; University of Nottingham, Newcastle Papers, NeC 11260, memorandum [1862], and cf. Ged Martin, Britain and the Origins of Canadian Confederation, 1837–1867 (Vancouver, 1995), 110–13.
12 J.P.T. Bury et al., eds, The Cambridge Ancient History, 2 (Cambridge, 1924), 139–40, 152–3.
13 H. Hensley Henson, Retrospect of an Unimportant Life (2 vols, London, 1942); John Willison, Reminiscences Personal and Political (Toronto, 1919), 19.
14 W.A. Harkin, ed., Political Reminiscences of the Rt. Hon. Sir Charles Tupper (London, 1914), followed by E.M. Saunders, ed., The Life and Letters of the Rt. Hon. Sir Charles Tupper (2 vols, London, 1916); Waite, Man from Halifax, viii; John Hamilton Gray, Confederation (Toronto, 1872). Gray was made a judge in British Columbia in 1872, which perhaps explains why his second volume never appeared. The Reminiscences of Sir Richard Cartwright (Toronto, 1912) were in fact a series of question-and-answer interviews with a journalist; Joseph Pope, Confederation: Being a Series of Hitherto Unpublished Documents Bearing on the British North America Act (Toronto, 1895).
15 Willison, Reminiscences, 15; E.W. Watkin, Canada and the States: Recollections, 1851 to 1886 (London, 1887); Arthur R.M. Lower, Colony to Nation: A History of Canada (Don Mills, Ont., 1964 ed.), 324. Among Watkin’s odder distinctions is the fact that he received two separate entries in the Dictionary of National Biography.
16 Alvin Finkel and Margaret Conrad with Veronica Strong-Boag, History of

the Canadian Peoples: 1867 to the Present (Toronto, 1993), xiii, 8, 518–19. See also Margaret Conrad, Alvin Finkel, and Cornelius Jaenen, History of the Canadian Peoples: Beginnings to 1867 (Toronto, 1993), xv, 18, 381. On this theme generally, see the sensible comments of William H. Dray, ‘Some Varieties of Presentism,’ in Dray, On History and Philosophers of History (Leiden, 1989), 165–87.
17 George B. Stow, ‘Richard II and the Invention of the Pocket Handkerchief,’ Albion 27 (1995): 221–35.
18 Ged Martin, ed., The Founding of Australia: The Argument about Australia’s Origins (Sydney, 1981 ed.).
19 Lord Sydney to the Treasury, 18 August 1786, in Martin, ed., Founding of Australia, 22–9.
20 ‘Was Australia Colonised by Accident?’ The National Times (Sydney), 30 January 1978: 8.
21 For Durham’s views on French Canada, see C.P. Lucas, ed., Lord Durham’s Report on the Affairs of British North America (3 vols, Oxford, 1912), 2: 288–96; M-P. Hamel, ed., Le Rapport de Durham (Quebec, 1948). His predictions are discussed in chapter 5.
22 Lucas, ed., Lord Durham’s Report, 1: 318–24; R.S. Neale, ‘Roebuck’s Constitution and the Durham Proposals,’ Historical Studies Australia and New Zealand 25 (1971), 581; Janet Ajzenstat, The Political Thought of Lord Durham (Kingston and Montreal, 1988), 49.
23 Ged Martin, ‘Attacking the Durham Myth: Seventeen Years On,’ Journal of Canadian Studies 25 (1990), 51–2.
24 National Library of Australia, Murray Papers, MS 565/1/15, diary, 29 January 1840; C.B. Sissons, ed., My Dearest Sophie: Letters from Egerton Ryerson to His Daughter (Toronto, 1955), 109; Brian Young, George-Etienne Cartier: Montreal Bourgeois (Kingston and Montreal, 1981), 5.
25 University of Durham, Grey Papers, Grey to Russell, 10 February 1847; Owen Chadwick, The Victorian Church (3rd ed., London, 1971), 1: 140; J.K. Johnson, ed., The Letters of Sir John A. Macdonald 1836–1857 (Ottawa, 1968), 362; J.K. Johnson, ed., Affectionately Yours: The Letters of Sir John A. Macdonald and His Family (Toronto, 1969), 132.
26 Knox published his diatribe from the safety of exile in Geneva. The National Library of Scotland has, until recently, daringly sold a facsimile postcard of the opening page. The vocabulary of moral denunciation is subject to constant devaluation. While Shakespeare used the word ‘naughty’ to mean ‘evil,’ in late-1990s Britain ‘infamous’ came to mean ‘notorious,’ while ‘wicked’ was on the way to becoming a measure of enjoyment.

27 Linda Colley, Britons: Forging a Nation, 1707–1837 (London, 1994 ed.), 11. Thomson’s home town, Jedburgh, is in the Borders.
28 J.W. Wheeler-Bennett, King George VI: His Life and Reign (London, 1958), 467; R.G. Menzies, Afternoon Light: Some Memories of Men and Events (London, 1967), 249.
29 Harry Potter, Hanging in Judgment: Religion and the Death Penalty in England (New York, 1993), 48, 176. Following an enquiry by the Coventry magistrates into the candle-burning incident, the Reverend Richard Chapman was suspended, but unfortunately only from his chaplaincy.
30 Lower, Colony to Nation, 471.
31 Ludovic Kennedy, The Trial of Stephen Ward (Harmondsworth, 1965 ed.), 58.
32 J.M.S. Careless, Brown of the Globe: II, The Statesman of Confederation 1860–1880 (Toronto, 1963), 143.
33 Anthony Howard and Richard West, The Making of the Prime Minister (London, 1965), 122.
34 A.J.P. Taylor, The Origins of the Second World War (Harmondsworth, 1964 ed.), 38; Carl Bernstein and Bob Woodward, All the President’s Men (New York, 1974 ed.), 359. Andrew Roberts suspects that the publicity-seeking Lord Mountbatten may have retrospectively inserted documents into his personal archive to protect his reputation. Roberts, Eminent Churchillians (London, 1994), 128.
35 John Morley, The Life of William Ewart Gladstone (3 vols, London, 1903), 1: 192.
36 Taylor, Origins of the Second World War, 36. But for a brilliant example of the application of forensic analysis to a historical problem, see John Kaplan’s demolition of the conspiracy theories surrounding the assassination of John F. Kennedy: ‘The Case of the Grassy Knoll: The Romance of Conspiracy,’ in Robin W. Winks ed., The Historian as Detective: Essays on Evidence (New York, 1970), 371–419. A whimsical account of an imaginary murder case by Collingwood (Idea of History, 266–8) is worthless but for the point that juries are expected to make up their minds within a finite time while historians can take their time. Unfortunately, the pressure to publish has increased since Collingwood’s day. Philosophers argue over the nature of narrative in History. In this extensive debate, perhaps the most sensible comment is that of Louis O. Mink, who argued that we could only judge whether the narrative of any episode was accurate if we could test it alongside a full account of what actually happened. Unfortunately, this is impossible. Historians will smile indulgently at these engrossing exercises, pointing to the more basic shortcoming that narrative is impossible without evidence, and evidence is invariably incomplete. See Geoffrey Roberts, ‘Introduction:

The History and Narrative Debate 1960–2000,’ in Roberts, ed., History and Narrative Reader, 1–21.

3. the impossibility of explanation

1 John Cannon, Parliamentary Reform, 1640–1832 (Cambridge, 1972), 244; Michael Oakeshott, On History and Other Essays (Totowa, NJ, 1983), 71–2. S.H. Rigby points out that explanation by narration of total antecedence can be traced back to John Stuart Mill. Rigby, ‘Historical Causation: Is One Thing More Important Than Another?’ History (1995): 227–42, esp. 233–4. C. Behan McCullagh puts the point in a slightly different way. If ‘causes are events or states of affairs which are at least contingently necessary for their effects,’ it follows that ‘for any given event there is a large number of causes, indeed a truly infinite number if indirect as well as direct causes are considered.’ McCullagh, Justifying Historical Descriptions (Cambridge, 1984), 194.
2 Michael Brock, The Great Reform Act (London, 1973), 15; Cannon, Parliamentary Reform, xiii.
3 Roland Quinault, ‘The French Revolution of 1830 and Parliamentary Reform,’ History 79 (1994): 377–93.
4 C.K. Sharp, A Historical Account of the Belief in Witchcraft in Scotland (London, 1884, facsimile ed. R.L. Brown, East Ardsley, Yorkshire, 1972), 61–2.
5 J.R. Pitman, ed., The Whole Works of the Reverend John Lightfoot (13 vols, London, 1825), 7: 372–3.
6 Paley’s Evidences of Christianity Epitomised (Cambridge, 1835), 171–3, 183–90; T.M. Healy, Letters and Leaders of My Day (2 vols, London, 1928), 2: 367.
7 D.H. Fischer, Historians’ Fallacies (New York, 1970), 174.
8 A.J.P. Taylor, English History, 1914–1945 (Oxford, 1965), 485.
9 A.J.P. Taylor, Beaverbrook (New York, 1972 ed.), 435–6, 456. The story can be traced to less dramatic origins in Basil Collier, Leader of the Few (London, 1957), 191. For a revisionist view of the crisis of 1940, see Richard Overy, The Battle of Britain: The Myth and the Reality (London, 2001).
10 Martin Gilbert, Winston S. Churchill: III, 1914–1916 (London, 1971), 239.
11 Winston S. Churchill, The Aftermath: A Sequel to The World Crisis (London, 1941 ed.), 386.
12 David Thomson, Europe since Napoleon (2nd ed., London, 1962), 506. The point is expressed diagrammatically in W.H. Dray, On History and Philosophers of History (Leiden, 1989), 84–7.
13 Michael and Eleanor Brock, eds, H.H. Asquith: Letters to Venetia Stanley (Oxford, 1982), 123.

14 A. Bullock, Hitler (Harmondsworth, 1962 ed.), 546.
15 A.R.M. Lower, Colony to Nation (Don Mills, Ont., 1964 ed.), 314–23, esp. 318. For the Canadian press tour of the Maritimes in August 1864, see P.B. Waite, The Life and Times of Confederation 1864–1867: Politics, Newspapers, and the Union of British North America (Toronto, 1963), 65–72.
16 R.F. Bennett, Intelligence Investigations (London, 1996), 2; Keith Robbins, Sir Edward Grey: A Biography of Lord Grey of Falloden (London, 1971), 285.
17 J.L. Granatstein, Canada’s War (Toronto, 1975), 162.
18 Chester Martin, Foundations of Canadian Nationhood (Toronto, 1955), 297–8; J. Bartlet Brebner, Canada: A Modern History (Ann Arbor, 1960), 273; The Canadian Encyclopedia (3 vols, Edmonton, 1985 ed.), 1: 399.
19 D.H. Fischer, Historians’ Fallacies (New York, 1970), xxi–xxii and cf. A. Marwick, Nature of History (London, 1973 ed.), 99; H. White, ‘Burden of History,’ in Tropics of Discourse (Baltimore, 1978), 43; McCullagh, Justifying Historical Descriptions, 202. For a comparison of the two episodes, see D.G. Creighton, ‘The United States and Canadian Confederation,’ Canadian Historical Review 39 (1958), 209–22. ‘That one cannot re-create identical historical situations,’ writes Elton, ‘is so familiar a point that I need not labor it.’ Rather, the point cannot be emphasized enough. G.R. Elton, Political History (London, 1970), 129.
20 G.M. Trevelyan, Clio, A Muse and Other Essays (London, 1913), 147.
21 Gwyn Macfarlane, Alexander Fleming: The Man and the Myth (London, 1984), 246.
22 D.H. Farmer, ed., Bede: Ecclesiastical History of the English People (London, 1990 ed.), 251–4.
23 Newfoundland Royal Commission 1933: Report (London, 1933, command no. 4480), 78.
24 Peter Neary, Newfoundland in the North Atlantic World, 1929–1949 (Kingston and Montreal, 1988), 15–24.
25 Goldwin Smith, The Empire: A Series of Letters (Oxford, 1863), 1; Macmillan’s Magazine 11 (1865): 419.
26 Zara S. Steiner, Britain and the Origins of the First World War (London, 1977), 222–3.
27 Jennifer Smith, ‘Representation and Constitutional Reform in Canada,’ in David E. Smith, P. MacKinnon, and J.C. Courtney, eds, After Meech Lake: Lessons for the Future (Saskatoon, 1991), 71. Cf. Jeremy Webber, Reimagining Canada: Language, Culture, Community, and the Canadian Constitution (Kingston and Montreal, 1994), 152–6.
28 F. Donald Logan, The Vikings in History (London, 1983), 25–6.
29 Bjorn Myhre, ‘The Archaeology of the Early Viking Age in Norway’ and

Donnchadh Ó Corráin, ‘Viking Ireland – Afterthoughts,’ in H.B. Clarke, M. Ní Mhaonaigh, and R. Ó Floinn, eds, Ireland and Scandinavia in the Early Viking Age (Dublin, 1988), 11, 18, 25–6, 434. Cf. G. Jones, A History of the Vikings (Oxford, 1984 ed.), 196–9.
30 D.C. Moore, ‘Concession or Cure? The Sociological Premises of the First Reform Act,’ Historical Journal 9 (1966): 39–59, and see his Politics of Deference: A Study of the Mid-Nineteenth Century English Political System (Hassocks, Sussex, 1976), 176, and the criticism by Derek Beales in Historical Journal 21 (1978): 701–3; see also Donald Creighton, Towards the Discovery of Canada: Selected Essays (Toronto, 1972), 32.
31 Chester Wilmot, The Struggle for Europe (London, 1959 ed.), 804n.
32 Ian MacKay and Suzanne Morton, ‘The Maritimes: Expanding the Circle of Resistance,’ in Craig Heron, ed., The Workers’ Revolt in Canada, 1917–1925 (Toronto, 1998), 43.
33 C.C. Eldridge, Victorian Imperialism (London, 1978), 1; A.P. Thornton, Imperialism in the Twentieth Century (London, 1980 ed.), 1; D.C.M. Platt, ‘Economic Imperialism and the Businessman,’ in Roger Owen and Bob Sutcliffe, eds, Studies in the Theory of Imperialism (London, 1972), 309; Ronald Hyam, Britain’s Imperial Century, 1815–1914: A Study of Empire and Expansion (London, 1976); P.J. Cain and A.G. Hopkins, eds, British Imperialism: Innovation and Expansion 1688–1914 (London, 1993), 42.
34 Thornton, Imperialism in the Twentieth Century, 1; Cain and Hopkins, British Imperialism, 1688–1914, 43.
35 Ronald Robinson, ‘Non-European Foundations of European Imperialism,’ in Owen and Sutcliffe, eds, Studies in the Theory of Imperialism, 118–19.
36 Luke Trainor, British Imperialism and Australian Nationalism: Manipulation, Conflict and Compromise (Cambridge, 1994), 19, and cf. Ged Martin, Australia, New Zealand and Federation, 1883–1901 (London, 2001); C.A. Bayly, Imperial Meridian: The British Empire and the World, 1780–1830 (London, 1989), 73; Cain and Hopkins, British Imperialism, 1688–1914, 46.
37 Alfred Lyall, The Life of the Marquis of Dufferin and Ava (London, 1905). The passage translates as ‘It must either be an elephant or a turtle dove, for these are the only two creatures we do not know by sight.’ I have emended ‘Hillymuir’ in the original, which makes no etymological sense.
38 J.M. Beck, ‘Joseph Howe: Opportunist or Empire-Builder?’ in G.A. Rawlyk, ed., Historical Essays on the Atlantic Provinces (Toronto, 1967), 141–60; Christopher Hill, The English Bible and the Seventeenth-century Revolution (Harmondsworth, 1994 ed.), 35.
39 J. Gallagher and R.E. Robinson, ‘The Imperialism of Free Trade,’ first published in Economic History Review, 2nd ser., 6 (1953), 1–15, often reprinted,

e.g. in A.G.L. Shaw, ed., Great Britain and the Colonies, 1815–1865 (London, 1970), 142–63.
40 Cain and Hopkins, British Imperialism, 1688–1914, 258–73, esp. 258–9.
41 Donald Creighton, John A. Macdonald: The Old Chieftain (Toronto, 1965 ed.), 84–102, 108–10.
42 D.M.L. Farr, The Colonial Office and Canada, 1867–1887 (Toronto, 1955), 64–106.
43 W.K. Hancock, The Wealth of Colonies (Cambridge, 1950), 1.
44 Lord Acton, Lectures in Modern History (ed. H.R. Trevor-Roper, London, 1960 ed.), 17 (lecture given in 1895).
45 Fischer, Historians’ Fallacies, 15.
46 John Bartlett, A Complete Concordance ... of Shakespeare (London, 1953 ed.), 1700.
47 John Cannon and Ralph Griffiths still refer to ‘the conspiracy of Pontiac’ in The Oxford Illustrated History of the British Monarchy (Oxford, 1988), 505.
48 Colin Read, The Rising in Western Upper Canada, 1837–8: The Duncombe Revolt and After (Toronto, 1982), 4.
49 G.R. Elton, The Practice of History (Sydney, 1967), 101.
50 Lower, Colony to Nation, 218. As Reinhart Koselleck splendidly puts it, this type of argument is ‘intended to rise above any historically unique situation’ to offer ‘a level of proof of supertemporal achronic permanence’ (R. Koselleck, The Practice of Conceptual History: Timing History, Spacing Concepts, trans. K. Behnke [Stanford, 2002], 109). More bluntly, it is a form of rent-an-explanation, one that ‘explains’ rebellions that never took place just as easily as those that did.
51 Read, The Rising in Western Upper Canada, 18. There is some evidence that sympathy for the rebel cause was widespread, and it is possible that other districts would have joined in had they not learned of the failure of the insurrection almost immediately after receiving first news of its outbreak. In the light of that assumption, it is possible to see the three distinct outbursts as manifestations of a single attempt at a Canadian revolution. Colin Read and Ronald J. Stagg, eds, The Rebellion of 1837 in Upper Canada (Don Mills, Ont., 1985), xxxiv.
52 D.G. Creighton, ‘The Economic Background to the Rebellions of 1837’ in Creighton, Towards the Discovery of Canada, 103–22.
53 Douglas McCalla, Planting the Province: The Economic History of Upper Canada, 1784–1870 (Toronto, 1993), 189; Read, The Rising in Western Upper Canada, 165–8.
54 For MacNab, see McCalla, Planting the Province, 192; Read, The Rising in Western Upper Canada, 168–9.

55 Read and Stagg, eds, The Rebellion of 1837 in Upper Canada, lxvi, 119.
56 McCalla, Planting the Province, 193; J.M. Bumsted, ‘1763–1783: Resettlement and Rebellion,’ in P.A. Buckner and J.G. Reid, eds, The Atlantic Region to Confederation: A History (Toronto, 1994), 176–7.
57 Allan Greer, The Patriots and the People: The Rebellion of 1837 in Rural Lower Canada (Toronto, 1993), 21–5, 50–1; L.C. Sanders, ed., Lord Melbourne’s Papers (London, 1889), 423.

4. the moment of decision

1 J.K. Chapman, The Career of Arthur Hamilton Gordon: First Lord Stanmore 1829–1912 (Toronto, 1964), 278; Hugh Thomas, Armed Truce: The Beginnings of the Cold War, 1945–46 (London, 1988 ed.), 185, 194; Richard Shannon, Gladstone: I, 1809–1865 (London, 1984 ed.), 293; S.G. Checkland, The Gladstones: A Family Biography, 1764–1851 (Cambridge, 1971), 119.
2 Maurice Mandelbaum, ‘A Note on History as Narrative,’ in G. Roberts, ed., The History and Narrative Reader (London, 2001), 54.
3 H. Smith, History of the Parish of Havering-atte-Bower (Colchester, Essex, 1925), 168–9.
4 W.S. Churchill, The World Crisis, 1911–1914, 1 (London, 1923), 24. R.I. Fitzhenry, ed., The Fitzhenry & Whiteside Book of Quotations (Toronto, 1981), 63, attributes the ‘camel’ story to ‘Anon.’
5 Cameron Hazlehurst, Politicians at War, July 1914 to May 1915 (New York, 1971), 25–117; Churchill, The World Crisis, 1911–1914, 1: 220.
6 Diego Gambetta, Were They Pushed or Did They Jump? (Cambridge, 1987), esp. 187. On the question of responsibility for decisions, see also C.B. McCullagh, Justifying Historical Descriptions (Cambridge, 1984), 218–21.
7 H. Blair Neatby, William Lyon Mackenzie King: III, 1932–1939 The Prism of Unity (Toronto, 1976), 11; Carl Berger, The Writing of Canadian History: Aspects of English-Canadian Historical Writing since 1900 (2nd ed., Toronto, 1986), 132; L.B. Pearson, Mike: The Memoirs of the Right Honourable Lester B.
Pearson: I, 1897–1948 (Scarborough, Ont., 1973 ed.), 15–16.
8 [B. Russell], The Autobiography of Bertrand Russell 1872–1914 (London, 1967), 63; Martin Gilbert, Winston S. Churchill, III: 1914–1916 (London, 1971), 9.
9 Richard Overy, Why the Allies Won (New York, 1996 ed.), 159, 347; C. Wilmot, The Struggle for Europe (London, 1959 ed.), 255. The novelist Anthony Trollope recalled the indecisiveness of the moment when he became engaged in similar terms, N. John Hall, Trollope: A Biography (Oxford, 1983), 88.
10 Carlos d’Este, Decision in Normandy (London, 1994 ed.), 110. Edith Lyttelton, Alfred Lyttelton: An Account of His Life (London, 1917), 237; J.E. Dunleavy and G.W. Dunleavy, Douglas Hyde: A Maker of Modern Ireland (Berkeley, 1991), 187.
11 W.F. Coaker, Past, Present and Future (Port Union, Nfld., 1932), unpaginated, article from Fishermen’s Advocate, 19 October 1932.
12 Ian McDonald (ed. J.K. Hiller), ‘To Each His Own’: William Coaker and the Fishermen’s Protective Union in Newfoundland Politics, 1908–1925 (St John’s, 1987), 70.
13 Coaker, Past, Present and Future.
14 J.M.S. Careless, Brown of the Globe (Toronto, 1963), 2: 360; W.L. Morton, ed., Monck Letters and Journals, 1863–1868: Canada from Government House at Confederation (Toronto, 1970), 170; Agatha Ramm, ed., The Political Correspondence of Mr Gladstone and Lord Granville (2 vols, London, 1952), 1: 19; H.A. Taylor, Jix: Viscount Brentford (London, 1933), 184.
15 Randolph S. Churchill, Winston S. Churchill: II, The Young Statesman, 1901–1914 (London, 1967), 418.
16 Morton, ed., Monck Letters and Journals, 170; Bernard Donoughue and G.W. Jones, Herbert Morrison: Portrait of a Politician (London, 1973), 545, 309.
17 [R.A. Butler], The Art of the Possible: The Memoirs of Lord Butler (Boston, 1972 ed.), 202; Peter C. Newman, Renegade in Power: The Diefenbaker Years (Toronto, 1964 ed.), 93n.
18 Taylor, Jix, 182; [John Simon], Retrospect: Memoirs of the Rt. Hon. Viscount Simon (London, 1952), 209; L. Marks and T. Van Bergh, Ruth Ellis: A Case of Diminished Responsibility (London, 1990). A recent study of the prerogative of mercy in Victorian Canada suggests that the carrying out of the death penalty was something of a lottery: Jonathan Swaingler, The Canadian Department of Justice and the Completion of Confederation, 1867–1878 (Vancouver, 2000), 63–78. As minister of justice, Edward Blake was oppressed by the responsibility of the death penalty. The London Free Press called him ‘the best friend the Canadian murderer ever had.’ Joseph Schull, Edward Blake: The Man of the Other Way, 1833–1881 (Toronto, 1975), 170–1.
19 John Henry Newman, Apologia pro Vita Sua (London, 1959 ed.), 209, 259, 248, 269, 273, 275; Desmond Bowen, The Idea of the Victorian Church: A Study of the Church of England, 1833–1889 (Montreal, 1968), 152; Andrew Cohen, A Deal Undone: The Making and Breaking of the Meech Lake Accord (Vancouver, 1991 ed.), 253.
20 G.E. Marindin, ed., Letters of Lord Blachford (London, 1896), 227–8.
21 National Archives of Canada, Elgin Papers, A-397, Newcastle to Elgin, private, 21 December 1853; Newcastle Papers, A-308, Newcastle to Elgin, private, 4 April 1854; G. Martin, Britain and the Origins of Canadian Confederation (Vancouver, 1995), 228–34.
22 Alan Sked and Chris Cook, Post-war Britain: A Political History (London, 1993 ed.), 587.
23 Ged Martin, The Durham Report and British Policy: A Critical Essay (Cambridge, 1972), 7.
24 J. Pope, Memoirs of Macdonald (Ottawa, 1894), 2: 281; Martin Gilbert, ‘Never Despair’: Winston S. Churchill, 1945–1965 (London, 1990 ed.), 15; Alan Clark, Diaries (London, 1993), 150.
25 Hazlehurst, Politicians at War, 79. Pressed to define his government’s policy on a foreign-affairs issue in 1972, Australian prime minister William McMahon memorably replied: ‘Our position is perfectly clear. We have not yet made up our minds.’ L. Oakes and D. Solomon, The Making of an Australian Prime Minister (Melbourne, 1973), 77.
26 Theodore Walrond, Letters and Journals of James, Eighth Earl of Elgin (London, 1872), 96–7; Helen Irving, ed., The Centenary Companion to Australian Federation (Cambridge, 1999), 77; Gilbert, Churchill, 3: 523, 484.
27 Montreal Gazette, 27 June 1864; James Young, Public Men and Public Life in Canada: The Story of the Canadian Confederacy (2 vols, Toronto, 1912), 1: 211.
28 Ged Martin, ‘The Case against Canadian Confederation,’ in Ged Martin, ed., The Causes of Canadian Confederation (Fredericton, 1990), 26–8.
29 M.C. Cameron in Parliamentary Debates on the Subject of the Confederation of the British North American Provinces (Quebec, 1865), 39.
30 Claire Hoy, Friends in High Places (Toronto, 1988 ed.), 169.
31 Parliamentary Debates on Confederation, 485; J.K. Johnson and C.B. Stelmack, eds, The Letters of Sir John A. Macdonald, 1858–1861 (Ottawa, 1969), 347; Montreal Gazette, 21 May 1864.
32 P.B. Waite, The Life and Times of Confederation, 1864–1867 (Toronto, 1963), 139; Careless, Brown of the Globe, 1: 314–22; Brian Young, The Politics of Codification: The Lower Canadian Civil Code of 1866 (Montreal, 1994); Joseph Cauchon, L’Union des provinces de l’Amérique Britannique du Nord (Quebec, 1865), 46.
33 J. Pope, Confederation (Toronto, 1895), 55.
34 Donald Creighton, The Road to Confederation: The Emergence of Canada, 1864–1867 (Toronto, 1964), 65; Public Record Office, CO 188/141, Gordon to Cardwell, no. 93, 5 December 1864, fols 395–6.
35 W. Ullmann, ‘The Quebec Bishops and Confederation,’ Canadian Historical Review 43 (1962), 218.
36 Paul Knaplund, ed., Gladstone–Gordon Correspondence, 1851–1896 (Philadelphia, 1961), 42–3.
37 Pope, Memoirs of Macdonald, 2: 298; Montreal Gazette, 17 March 1864.

38 Parliamentary Debates on Confederation, 880–2.
39 Bruce W. Hodgins, ‘The Canadian Political Elite’s Attitude toward the Nature of the Plan of Union,’ in B.W. Hodgins, D. Wright, and W.H. Heick, eds, Federalism in Canada and Australia: The Early Years (Waterloo, Ont., 1978), 46.
40 Parliamentary Debates on Confederation, 114.
41 R.G. Haliburton, Intercolonial Trade Our Only Safeguard Against Disunion (Ottawa, 1868), 9.
42 F.W.P. Bolger, Prince Edward Island and Confederation, 1863–1873 (Charlottetown, 1964), 262, 270–2.
43 Creighton, Road to Confederation, 320; William M. Baker, Timothy Warren Anglin 1822–1896: Irish Catholic Canadian (Toronto, 1977), 78.
44 J.M. Beck, Pendulum of Power: Canada’s Federal Elections (Scarborough, Ont., 1968), 72.
45 Beck, Pendulum of Power, 144; R. Starr, Richard Hatfield: The Seventeen Year Saga (Halifax, 1988 ed.), 239–50.
46 Cook and Sked, Post-war Britain, 285–91.
47 J.L. Granatstein et al., Twentieth Century Canada (2nd ed., Toronto, 1986), 206; H. Blair Neatby, William Lyon Mackenzie King: II, 1924–1932: The Lonely Heights (Toronto, 1963), 159–62; Roger Graham, Arthur Meighen: II, And Fortune Fled (Toronto, 1963), 414–36.
48 Beck, Pendulum of Power, 177–90; Neatby, King, 2: 158–9.
49 Bernard Crick, George Orwell: A Life (rev. ed., Harmondsworth, 1992), 468.

5. past futures

1 Stephen B. Oates, With Malice Toward None: The Life of Abraham Lincoln (London, 1978 ed.), 142.
2 M. Oakeshott, On History and Other Essays (Totowa, NJ, 1983), 13. In 1999, a study of 40,000 British women who had used oral contraception for a quarter of a century offered reassurance against ‘a lurking fear that something dreadful might pop out of the woodwork after 15, 20 or even 25 years.’ Awareness of possible risk had not deterred millions of women from taking the Pill. Daily Telegraph (London), 8 January 1999.
3 W.H. Russell, Canada: Its Defences, Condition, and Resources (London, 1865), 147; Christopher Hill, God’s Englishman: Oliver Cromwell and the English Revolution (Harmondsworth, 1972 ed.), 188.
4 Carl Sandburg, Abraham Lincoln (New York, 1954 ed.), 664.
5 Fortnightly Review, 1 April 1877: 431. In a newspaper article in March 1862,

Smith predicted that if Canadians were required to undertake their own defence, they would immediately create a militia force of 200,000 men. When the Canadian parliament refused to pay for a force half that size, Smith republished the article but suppressed the prediction. Compare G. Smith, The Empire (Oxford, 1863), 73–87 with Daily News (London), 12 March 1862.
6 J.R. Seeley, The Expansion of England: Two Courses of Lectures (London, 1914 ed.), 196–7.
7 John Morley, The Life of William Ewart Gladstone (3 vols, London, 1903), 2: 204. Owain Glyndwr, leader of a fifteenth-century Welsh revolt, made similar use of prophecy to rally support. R.R. Davies, The Revolt of Owain Glyndwr (Oxford, 1995), 157–60. For the importance of ancient prophecy in pre-modern England, especially the appeal to a mythic past as a means of shaping the future, see Keith Thomas, Religion and the Decline of Magic (Harmondsworth, 1978 ed.), 416–514.
8 John Gunther, Inside Russia Today (London, 1959 ed.), 134.
9 Elizabeth Longford, Elizabeth R (London, 1984 ed.), 360; J. Cannon and R. Griffiths, Oxford Illustrated History of the British Monarchy (Oxford, 1988), 669.
10 Benjamin Disraeli, Endymion (1880), 2: chap. 4.
11 Martin Gilbert, Winston S. Churchill: V, 1922–1939 (London, 1976), 480.
12 Alex Danchev and Daniel Todman, eds, War Diaries, 1939–1945: Field Marshal Lord Alanbrooke (London, 2001), 445; and, as an exercise in the presentation of evidence, compare the cannibalized version in Arthur Bryant, Triumph in the West 1943–1946 (London, 1960 ed.), 112.
13 Martin Gilbert, Finest Hour: Winston S. Churchill, 1939–1941 (London, 1989 ed.), 1212n.
14 Facsimile of original in B. Martin, John Henry Newman: His Life and Work (London, 1990 ed.), 53; Shannon, Gladstone, 143.
15 Peter C. Newman, The Distemper of Our Times: Canadian Politics in Transition 1963–1968 (Toronto, 1968), 43.
16 Phillip A. Buckner, The Transition to Responsible Government: British Policy in North America, 1815–1850 (Westport, Conn., 1985), 47–8. Cf. Reinhart Koselleck’s view of ‘wishful prognosis,’ in Koselleck, The Practice of Conceptual History (Stanford, 2002), 141.
17 Robert Craig Brown, Robert Laird Borden: I, 1854–1914 (Toronto, 1975), 68.
18 Viscount Grey of Falloden, Twenty-five Years, 1892–1916 (2 vols, London, 1925), 1: 6; ibid., 2: 305–6. A similar confusion can be found in the thinking of the colonization theorist E.G. Wakefield: Ged Martin, Edward Gibbon Wakefield: Abductor and Mystagogue (Edinburgh, 1997), 8–14.

19 Henry Boylan, ed., A Dictionary of Irish Biography (New York, 1978), 311–12; Ian Colvin, The Life of Lord Carson (3 vols, London, 1934), 2: 298.
20 Eric Stokes, The English Utilitarians and India (Oxford, 1959), 45; Walter Bagehot, The English Constitution (Oxford, 1928 ed.), 5.
21 Seeley, Expansion of England, 223; Earl of Ronaldshay, The Life of Lord Curzon (3 vols, London, 1928), 2: 230. Curzon misquoted the hymn, but was corrected by Nicholas Mansergh, The Commonwealth Experience (London, 1969), 5.
22 Ronald Hyam, Elgin and Churchill at the Colonial Office, 1905–1908: The Watershed of the Empire-Commonwealth (London, 1968), 543.
23 Anthony Trollope, Phineas Redux (1874), 2: chap. 33; J.R. Pitman, ed., The Whole Works of John Lightfoot, 4 (London, 1825), 65; Caroline E. Stephen, The Right Honourable Sir James Stephen: Letters with Biographical Notes (Gloucester, 1906), 2.
24 H.C.G. Matthew, Gladstone: 1875–1898 (Oxford, 1995), 7–8; Morley, Life of Gladstone, 2: 548. His rival, Disraeli, more opaquely believed in ‘some form of future existence, though whether accompanied with any consciousness of personal identity with the self of this present life, he thought it impossible to foresee.’ John Vincent, ed., Disraeli, Derby and the Conservative Party: Journals and Memoirs of Edward Henry, Lord Stanley, 1849–1869 (Hassocks, Sussex, 1978), 31.
25 Norman Gash, Sir Robert Peel: The Life of Sir Robert Peel after 1830 (London, 1972), 24.
26 T.C. Smout, A History of the Scottish People, 1560–1830 (London, 1972 ed.), 102.
27 Nancy M. Taylor, The New Zealand People at War: The Home Front (2 vols, Wellington, 1986), 1: 127. John Ritchie drew my attention to the resonance of the words.
28 J.J. Eastwood and F.B. Smith, eds, Historical Studies: Selected Articles First Series (Melbourne, 1967 ed.), 183 (Blainey); A.W. Martin, ed., Essays in Australian Federation (Melbourne, 1969), 154 (Norris).
29 Gilbert, Winston S. Churchill, 3: 694; Ged Martin, ‘Asquith, the Historians and the Maurice Debate,’ Australian Journal of Politics and History 31 (1985), 435–44.
30 Lord Newton, Lord Lansdowne: A Biography (London, 1929), 466.
31 W.F. Reddaway, Frederick the Great and the Rise of Prussia (London, 1925 ed.), 67.
32 Gilbert, Winston S. Churchill, 5: 940, 922; A.J.P. Taylor, The Origins of the Second World War (Harmondsworth, 1964 ed.), 191. An American journalist based in Berlin in 1938 claimed that a firm stand would have stopped Hitler

in his tracks: William L. Shirer, The Rise and Fall of the Third Reich: A History of Nazi Germany (London, 1988 ed.), 364.
33 I add here a personal tribute to the Cambridge historian F.R. Salter, who was still teaching, effectively and memorably, at the age of 79 in 1967. To my undergraduate protest that it was impossible to prevent the Sudeten Germans from joining the Reich, he explosively objected that they were not Germans at all but ‘Teutons.’ In fact, such issues rarely vanish altogether. The claims of the Sudeten Germans were raised in 2002 when the Czech Republic sought membership of the European Union.
34 Oates, With Malice Toward None, 199.
35 C. Headlam, ed., The Milner Papers: South Africa, 1897–1899 (London, 1931), 385.
36 Z.S. Steiner, Britain and the Origins of the First World War (London, 1977), 220. But Lichnowski would say that, wouldn’t he?
37 Parliamentary Debates on the Subject of the Confederation of the British North American Provinces (Quebec, 1865), 497; Donald Creighton, The Passionate Observer: Selected Writings (Toronto, 1980), 19.
38 Dale C. Thomson, Jean Lesage and the Quiet Revolution (Toronto, 1984), 3.
39 Illustrated in Donald Read, Edwardian England (London, 1972), 48.
40 Ramsay Cook, The Politics of John W. Dafoe and the Free Press (Toronto, 1963), 251; A.S.F. Gow, Letters from Cambridge (Cambridge, 1945), 8; C. Sandburg, Abraham Lincoln (New York, 1954 ed.), 138.
41 University of Nottingham, Newcastle Papers, NeC 11260. Formal votes in the British cabinet were unusual. For the division of opinion, see H.C.G. Matthew, ed., The Gladstone Diaries: VII, 1861–1868 (Oxford, 1978), 114.
42 Spectator, 18 April 1857: 417–18, reviewing The Imaginary History of the Next Thirty Years; and cf. P. Appleman, W.A. Madden, and M. Wolff, eds, 1859: Entering an Age of Crisis (Bloomington, Ind., 1959). Lord Derby’s description of his own ministry’s Reform Act was made in the House of Lords on 6 August 1867.
43 Arthur Keppel-Jones, When Smuts Goes: A History of South Africa from 1952 to 2010 (London, 1947). The book was written as a corrective to ‘irresponsible optimism about the future’ (p. 7).
44 G.R. Elton, ed., The Tudor Constitution: Documents and Commentary (Cambridge, 1962), 5–6.
45 D. Creighton, John A. Macdonald: The Old Chieftain (Toronto, 1965 ed.), 202.
46 G.H.L. Le May, British Supremacy in South Africa, 1899–1907 (Oxford, 1965), 14. Kruger was then probably about 73 years of age, but nobody knew for sure. The last British Viceroy of India, Lord Mountbatten, subsequently claimed that he agreed to Partition and the creation of Pakistan in 1947

Notes to pages 129–33 / 279

47 48 49

50 51

52 53 54 55 56

57

because the British could not hang on to power for long enough to wait for the death of the ailing but inflexible Muslim leader, M.A. Jinnah. One historian has invoked the principle of ‘Cleopatra’s Nose’ to argue that ‘Jinnah’s chest may have played a decisive role in Commonwealth history. Had the tuberculosis from which he died in September 1948 cut short his life two years earlier it is doubtful whether the Muslim League could have marshalled the strength required to secure Pakistan.’ A. Roberts, Eminent Churchillians (London, 1994), 82; R.J. Moore, Making the New Commonwealth (Oxford, 1987), 6. Marcus Cunliffe, American Presidents and the Presidency (London, 1972 ed.), 69. Clinton Rossiter, ed., The Federalist Papers (New York, 1961 ed.), 207, 118 (nos. 34, 16). H. Irving, ed., The Centenary Companion to Australian Federation (Cambridge, 1999), 229. J.A. La Nauze, The Making of the Australian Constitution (Melbourne, 1972), 69. Ged Martin, ed., The Causes of Canadian Confederation (Fredericton, 1990), 30. National Archives of Canada, Macdonald Papers, 510, Macdonald to M.C. Cameron [copy], 19 December 1864. Cf. P.B. Waite, The Life and Times of Confederation (Toronto, 1963), 123. Yet when his friend Judge J.R. Gowan wrote to him pleading for a legislative union, Macdonald replied that he thought the Quebec scheme had avoided the ‘rocks’ of a federation. Ibid., Gowan to Macdonald, 8 October 1864; NAC, Gowan Papers, M1898, Macdonald to Gowan, private, 15 November 1864. O.D. Skelton (ed. G. MacLean), Life and Times of Sir Alexander Tilloch Galt (Toronto, 1966 ed.), 192; Creighton, Macdonald: The Old Chieftain, 484–5. Grey of Falloden, Twenty-five Years, 2: 220. Lord Beaverbrook, The Decline and Fall of Lloyd George (London, 1963), 305. Cf. K.O. Morgan, The Age of Lloyd George (London, 1971), 150–5. Maya V. Tucker, ‘Centennial Celebrations 1888,’ in Australia 1888 (Bicentennial History Bulletin) 7 (1981), 21. S.F. Wise and R.C. 
Brown, Canada Views the United States: Nineteenth-Century Political Attitudes (Toronto, 1972); L.B. Shippee, Canadian–American Relations, 1849–1874 (New Haven, 1939): 192–3. Dermot Keogh, Twentieth-century Ireland: Nation and State (Dublin, 1994), 334; M.L.R. Smith, Fighting for Ireland? The Military Strategy of the Irish Republican Movement (London, 1995), 226–7; and cf. Benjamin E. Kline, ‘Northern Ireland: A Prolonged Conflict,’ in Karl P. Mayer and Constantine P. Danopoulos, eds, Prolonged Wars: A Post-nuclear Challenge (Warren, Ala., 1994), 421–48.

280 / Notes to pages 133–7

58 Waite, Life and Times of Confederation, 50; Globe, 3 April 1856.
59 W.L. Morton, ed., The Shield of Achilles: Aspects of Canada in the Victorian Age (Toronto, 1968), 198.
60 W.M. Whitelaw, The Maritimes and Canada before Confederation (Toronto, 1934), pp. 128, 177–8.
61 J.M. Beck, Joseph Howe: II, The Briton Becomes Canadian, 1848–1873 (Kingston and Montreal, 1983), 162.
62 Waite, Life and Times of Confederation, 63.
63 Globe, 28 August 1856.
64 Johnson, ed., Letters of Sir John A. Macdonald, 1836–1857, 339. For other grandiose projects cf. National Archives of Canada, Newcastle Papers, A-307, Newcastle to Monck (copy), 21 March 1863, for serious consideration of an Illinois company that planned to construct a canal from Georgian Bay to the Ottawa River, and Nicolas Landry, ‘Transport et régionalisme en contexte pré-industriel: Le projet du canal de la baie Verte, 1820–1875,’ Acadiensis 24 (1994): 59–87.
65 Morning Chronicle (Halifax), 27 October 1863.
66 Edward Whelan, comp., The Union of the British Provinces (Summerside, PEI, 1949 ed.), 29–30.
67 W.M. Baker, Timothy Warren Anglin (Toronto, 1977), 115.
68 Chester Martin, ‘Sir Edmund Head’s First Project of Federation, 1851,’ Canadian Historical Association Annual Report, 1928: 17.
69 John Darwin, Britain and Decolonisation: The Retreat from Empire in the Post-War World (Basingstoke, 1988), 157, 215; John Gunther, Inside Africa (London, 1955), 801.
70 Philip Mason, The Year of Decision: Rhodesia and Nyasaland in 1960 (Oxford, 1960), 69.
71 J. Barber, Rhodesia: The Road to Rebellion (Oxford, 1967), 44.
72 Colin Leys, European Politics in Southern Rhodesia (Oxford, 1959), 286; Barber, Rhodesia, 100.
73 C.F. Goodfellow, Great Britain and South African Confederation (Cape Town, 1966), 214; Parliamentary Debates on Confederation, 486.
74 Hyam, Elgin and Churchill at the Colonial Office, 543. The Locarno Treaty of 1925, an idealistic attempt to bring permanent peace to Europe, may have been another example of looking too far ahead. The British Foreign Secretary, Austen Chamberlain, explained: ‘I am working not for today or tomorrow but for some date like 1950 or 1960 when German strength will have returned and when the prospect of war will again cloud the horizon.’ Harold Nicolson, King George V: His Life and Reign (London, 1952), 407.
75 R. Sedgwick, ed., Lord Hervey’s Memoirs (Harmondsworth, 1984 ed.), 65.

Notes to pages 137–42 / 281

76 Creighton, Macdonald: The Old Chieftain, 126.
77 G.P. de T. Glazebrook, Sir Charles Bagot in Canada: A Study in British Colonial Government (Oxford, 1929), 60.
78 H.C.G. Matthew, ed., The Gladstone Diaries: X, 1881–1883 (Oxford, 1990), 433.
79 Erik Olssen, A History of Otago (Dunedin, 1984), 73; Erik Olssen, John A. Lee (Dunedin, 1977), 40; Graeme Davison, J.W. McCarty, and A. McLeary, eds, Australians 1888 (Sydney, 1987), 427–8; A.W. Martin, Henry Parkes: A Biography (Melbourne, 1980), 410; Ruth Dudley Edwards, Patrick Pearse: The Triumph of Failure (London, 1979 ed.), 183, 338.
80 P.C. Newman, Renegade in Power (Toronto, 1964 ed.), 72.
81 Locksley Hall was published in 1842, but written some years earlier. (Tennyson had seen his first train in 1830, but unfortunately at nighttime, so that he did not realize that it ran on rails. Hence his famous reference to ‘the ringing grooves of change.’) Locksley Hall Sixty Years After was published in 1887, but the demonization of ‘Celtic Demos’ makes clear the connection with the Irish Home Rule crisis of 1886.
82 Gilbert, Finest Hour, 569–71.
83 W.K. Hancock and Jean van der Poel, eds, Selections from the Smuts Papers: I, 1886–1902 (Cambridge, 1966), 530; D.J. Lee, Japan: A Documentary History (Armonk, NY, 1997), 457–8.
84 Peter C. Newman, The Canadian Revolution, 1985–1995: From Deference to Defiance (Toronto, 1995), 322.
85 G.E. Marindin, ed., Letters of Lord Blachford (London, 1896), 156–7.
86 Frederick Madden with David Fieldhouse, eds, Settler Self-Government, 1840–1890 (Westport, Conn., 1990), 798.
87 James Belich, Making Peoples: A History of New Zealanders from Polynesian Settlement to the End of the Nineteenth Century (Auckland, 1996), 229.
88 James Bryce, Impressions of South Africa (London, 1899 ed.), 432, 465.
89 Le May, British Supremacy in South Africa, 196.
90 W.K. Hancock, Smuts: I, The Sanguine Years, 1879–1919 (Cambridge, 1962), 220.
91 G.B. Pyrah, Imperial Policy and South Africa, 1902–10 (Oxford, 1955), 166.
92 John Barnes and D. Nicholson, eds, The Leo Amery Diaries: I, 1896–1929 (London, 1980), 36–7.
93 Hancock, Smuts, 1: 221.
94 Alexander Hepple, Verwoerd (Harmondsworth, 1967), 159.
95 Anthony Trollope (ed. J.H. Davidson), South Africa (Cape Town, 1973 ed.), 454.
96 Andrew Duminy and Colin Ballard, eds, The Anglo-Zulu War: New Perspectives (Pietermaritzburg, 1981), xix.

282 / Notes to pages 142–6

97 G.P. Browne, ed., Documents on the Confederation of British North America (Toronto, 1969), 129.
98 The origins of the phrase ‘Canada’s century’ are discussed in J.R. Colombo, ed., Colombo’s Canadian Quotations (Edmonton, 1984), 331–4.
99 Bruce Hutchison, Canada: Tomorrow’s Giant (Toronto, 1957).
100 Russell, Canada, 199.
101 A. Lyall, The Life of the Marquis of Dufferin and Ava (London, 1905), 193.
102 R.A. Davies, ed., The Letters of Thomas Chandler Haliburton (Toronto, 1988), 16.
103 Fortnightly Review, 1 April 1877: 158.
104 A. Shortt and A.G. Doughty, eds, Documents Relating to the Constitutional History of Canada, 1759–1791 (Ottawa, 1907), 198.
105 C.P. Lucas, ed., Lord Durham’s Report (Oxford, 1912), 2: 289–91.
106 Arthur G. Doughty, ed., The Elgin-Grey Papers, 1846–1852 (4 vols, Ottawa, 1937), 1: 150–1, echoing E.-P. Taché in Jacques Monet, The Last Cannon Shot: A Study in French-Canadian Nationalism, 1837–1850 (Toronto, 1969), 3. Cf. J.L. Sturgis, ‘Anglicisation as a Theme in Lower Canadian History, 1807–1843,’ British Journal of Canadian Studies 3 (1988): 210–29.
107 Johnson, ed., Letters of Sir John A. Macdonald, 1836–1857, 339.
108 Nineteenth Century 20 (1886): 14–15.
109 Robert Sellar (ed. R. Hill), The Tragedy of Quebec: The Expulsion of the Protestant Farmers (Toronto, 1974 ed.), 326; Cook, The Politics of John W. Dafoe and the Free Press, 13.
110 Réal Bélanger, ‘Le Nationalisme ultramontain: Le cas de Jules-Paul Tardivel,’ in N. Voisine and J. Hamelin, eds, Les Ultramontains canadiens-français (Montreal, 1985), 267–303, esp. 291, and Pierre Savard in Dictionary of Canadian Biography 13: 1009–13.
111 Robert Craig Brown and Ramsay Cook, Canada, 1896–1921: A Nation Transformed (Toronto, 1974), 267; C. Berger, The Writing of Canadian History (Toronto, 1986), 130–1.
112 W.D. Coleman, The Independence Movement in Quebec, 1945–1980 (Toronto, 1984), 146–8; John A. Dickinson and Brian Young, A Short History of Quebec, 2nd ed. (Toronto, 1993), 294–9; Marc V. Levine, The Reconquest of Montreal: Language Policy and Social Change in a Bilingual City (Philadelphia, 1990). Some of the more pessimistic estimates were subsequently revised.
113 J. Levitt, A Vision beyond Reach: A Century of Images of Canadian Destiny (Ottawa, 1982), 161.
114 Pierre Fournier (trans. S. Fischman), A Meech Lake Post-mortem: Is Quebec Sovereignty Inevitable? (Montreal, 1991), 85.
115 Newman, The Canadian Revolution, 93.

Notes to pages 151–61 / 283

6. A Long Time in History

1 Alvin Toffler, Future Shock (London, 1981 ed.), 18.
2 Oliver St John Gogarty, As I Was Walking Down Sackville Street: A Phantasy in Fact (London, 1968 ed.), 45; H. White, ‘The Burden of History,’ in Tropics of Discourse (Baltimore, 1978), 41.
3 David Steele, Lord Salisbury: A Political Biography (London, 1999), 42; B. Crick, Orwell (1992 ed.), 40. In 1964, Geoffrey Barraclough proposed a new concept of ‘contemporary history.’ This effectively extended the present back to about 1890, and aimed at identifying ‘the element of discontinuity’ with the past. Barraclough did not claim any moral superiority for this extended present, but he did decline to speculate about its unknowable future. G. Barraclough, Introduction to Contemporary History (Harmondsworth, 1967 ed.), esp. 9–42.
4 J. Darwin, Britain and Decolonisation (Basingstoke, 1988), 335.
5 Spectator (London), 22 April 1995: 16–17.
6 J. Seeley, Expansion of England (London, 1914 ed.), 254.
7 C. Hollis, The Oxford Union (London, 1965), 197.
8 W.C. Sellar and R.J. Yeatman, 1066 and All That (Harmondsworth, 1960 ed.), 123; Seeley, Expansion of England, 1.
9 Francis Fukuyama, The End of History and the Last Man (New York, 1992), xii, 46, 8; H. Thomas, Armed Truce (London, 1988 ed.), 31.
10 Compare J.-P. Derriennic, Nationalisme et démocratie (Quebec, 1995) with Fortnightly Review, 1 April 1877: 455.
11 Fukuyama, The End of History, 46–51.
12 David Ogg, Europe of the Ancien Régime, 1715–1783 (London, 1965), 217. Another estimate pointed to 1789 jurisdictions, an oddly portentous number. Golo Mann (trans. M. Jackson), A History of Germany since 1789 (Harmondsworth ed., 1984), 21.
13 M.B. Jansen, ed., The Cambridge History of Japan: IV, The Nineteenth Century (Cambridge, 1989), 98.
14 Paul Langford, A Polite and Commercial People: England, 1727–1783 (Oxford, 1989), 306.
15 Spike Milligan, Puckoon (Harmondsworth, 1965 ed.), 54; Jonathan Bardon, A History of Ulster (Belfast, 1984 ed.), 146, 257.
16 Ged Martin, ‘The Origins of Partition,’ in Malcolm Anderson and Eberhard Bort, eds, The Irish Border: History, Politics, Culture (Liverpool, 1999), 75–9.
17 T.C. Smout, History of the Scottish People (London, 1972 ed.), 403–6; John Benson, British Coalminers in the Nineteenth Century: A Social History (Dublin, 1980), 81.

284 / Notes to pages 162–6

18 F.D. Logan, Vikings in History (London, 1983), 39–40.
19 Olive P. Dickason, Canada’s First Nations: A History of the Founding Peoples from Earliest Times (Toronto, 1992), 161; J.A. Chisholm, ed., The Speeches and Public Letters of Joseph Howe (2 vols, Halifax, 1909), 2: 445.
20 I owe this point to Dr Barbara C. Murison, who makes it to history students at the University of Western Ontario.
21 Mann, History of Germany since 1789, 30; J.R. Saul, Reflections of a Siamese Twin: Canada at the End of the Twentieth Century (Toronto, 1998 ed.), 13; J. Bradley, ed., Viking Dublin Exposed: The Wood Quay Saga (Dublin, 1985).
22 Michael Brock, The Great Reform Act (London, 1973), 146; M.H. Keen, England in the Later Middle Ages (London, 1973), 296; M. Gilbert, ‘Never Despair’ (London, 1990 ed.), 12; P.C. Newman, The Canadian Revolution (Toronto, 1995), xi. The Oxford Dictionary of Quotations is unable to establish a precise source for the Wilson quotation, thereby confirming the point about transience.
23 [Bertrand Russell], The Autobiography of Bertrand Russell: II, 1914–1944 (London, 1968), 16.
24 F. Braudel, On History (London, 1980 ed.), 3–4. ‘We might speak, not of one historical time, but of many that overlie one another.’ R. Koselleck, The Practice of Conceptual History (Stanford, 2002), 110. The way time is measured contributes to the manner in which it is conceptualized. On this, see the fascinating work by G.J. Whitrow, Time in History: Views of Time from Prehistory to the Present Day (Oxford, 1989 ed.).
25 E.D. Steele, Irish Land and British Politics: Tenant Right and Nationality, 1865–1870 (Cambridge, 1974), 19.
26 A.W. Martin, Robert Menzies: A Life: I, 1894–1943 (Melbourne, 1993), 179. Buildings can be used as a focus for preserving memory. In 1851, an Irish Protestant recalled being shown a fortified house in the town of Youghal by his grandfather in 1788, and being made to promise to pass on to his descendants the story of a forebear’s imprisonment there for supporting William III in 1688–9. Institutions similarly foster long memories. An American visitor to a Cambridge college ca. 1870 sought the secret of its beautiful lawns: she was advised to try 200 years of cultivation. In the 1890s, the venerable Master of the same college talked of a political crisis in 1839 as if it were ‘an affair of yesterday.’ [W.G. Field], The Hand-Book for Youghal (Youghal, Ireland, 1896), 55; Harry de Windt, My Restless Life (London, 1909), 70; A.F.R. Wollaston, Life of Alfred Newton (London, 1921), 242.
27 Falklands Island Briefing (Falkland Islands Government Office, London, 1997), 7. The petition of Maria Angelica Vernet was submitted to a United Nations committee on colonial issues on 6 July 1998.

Notes to pages 167–76 / 285

28 Fortnightly Review, 1 April 1877: 436; Alice Brown, D. McCrone, and L. Paterson, Politics and Society in Scotland (Basingstoke, 1996), 24.
29 Sunday Telegraph (London), 11 August 1996.
30 Compare Christopher Moore, Louisbourg Portraits: Life in an Eighteenth-century Garrison Town (Toronto, 1983 ed.), 38–41 with Stephen Hornsby and Graeme Wynn, ‘Walking through the Past,’ Acadiensis 10 (1981): 152–9.
31 The quotation comes from L.P. Hartley’s novel, The Go-Between (1953), and provides the title of a reflective book on history, David Lowenthal, The Past Is a Foreign Country (Cambridge, 1985).
32 S.J. Houston, James I (London, 1973), 116.
33 A. Lyall, The Life of the Marquis of Dufferin and Ava (London, 1905), 44.
34 Alison L. Prentice and Susan E. Houston, eds, Family, Society, and School in Nineteenth-century Canada (Toronto, 1975), 96.
35 Ged Martin, ed., The Causes of Canadian Confederation (Fredericton, 1990), 24; J.K. Johnson, ed., Affectionately Yours: The Letters of Sir John A. Macdonald and His Family (Toronto, 1969), 104; M.V. Brett, ed., Journals and Letters of Reginald Viscount Esher: I, 1870–1903 (London, 1934), 270; Mary Rubio and Elizabeth Waterston, eds, The Selected Journals of L.M. Montgomery: II, 1889–1910 (Toronto, 1985), 255.
36 L.B. Pearson, Mike (Scarborough, Ont., 1973 ed.), 1: 9; Sidney Lee, King Edward VII: A Biography: II, The Reign (London, 1927), 522.
37 Andrew Lownie, John Buchan: The Presbyterian Cavalier (London, 1995), 274; J. Adam Smith, John Buchan: A Biography (Oxford, 1985 ed.), 455–7.
38 Carl Ballstadt, E. Hopkins, and M. Peterman, eds, Susanna Moodie: Letters of a Lifetime (Toronto, 1985), 171; Globe, 5 November 1856.
39 R.C. Brown and R. Cook, Canada, 1896–1921 (Toronto, 1974), 267.
40 Raymond B. Blake, Canadians at Last: Canada Integrates Newfoundland as a Province (Toronto, 1994), 19–24, drawing upon Jeff A. Webb in Newfoundland Studies 5 (1989): 203–30; Richard Gwyn, Smallwood: The Unlikely Revolutionary (Toronto, 1968 ed.), 110. A colloquial phrase in Scotland, ‘as lonely as a Coatbridge Protestant,’ similarly draws upon an undefined but widely known belief about the religious affiliations of the people of that town.
41 H. Butterfield, History and Human Relations (London, 1951), 127; C.B. McCullagh, Justifying Historical Descriptions (Cambridge, 1984), 225.
42 C.E. Kingsley, Charles Kingsley: His Letters and Memories of His Life (London, 1904 ed.), 247, 240.
43 R. Hyam, Britain’s Imperial Century (London, 1976), 80, 349–50; Kingsley, Charles Kingsley, 236.
44 W.L. Shirer, Rise and Fall of the Third Reich (London, 1988 ed.), esp. 39, 50.
45 J. Cannon and R. Griffiths, Oxford Illustrated History of the British Monarchy (Oxford, 1988), 389 (but see the comments of a distinguished American

286 / Notes to pages 00–00

scholar on writing the history of this period: Mark Kishlansky, A Monarchy Transformed: Britain, 1603–1714 [London, 1996], ix–xi).
46 Benjamin Disraeli, Endymion (1881), chap. 81. Disraeli was something of a plagiarist: the story is also attributed to the Earl of Shaftesbury in the time of Charles II.
47 A.T.Q. Stewart, The Narrow Ground: The Roots of Conflict in Ulster (London, 1989 ed.); Paul Bew, Ideology and the Irish Question: Ulster Unionism and Irish Nationalism, 1912–1916 (Oxford, 1994). Both Stewart and Bew write with insight that makes them far more than mere apologists.
48 I. Colvin, Life of Lord Carson, 3 (London, 1936), 386.
49 J.J. Lee, Ireland 1912–1985: Politics and Society (Cambridge, 1989), 11; J. Bardon, History of Ulster (Belfast, 1994 ed.), 406.
50 A.T.Q. Stewart, The Ulster Crisis (London, 1967), 237–43; Bew, Ideology and the Irish Question, 159–60.
51 Ronald Hyam, Empire and Sexuality: The British Experience (Manchester, 1990), 193.
52 Mark T. Berger, ‘Imperialism and Sexual Exploitation ...,’ in Journal of Imperial and Commonwealth History 17 (1988): 82–9 and Ronald Hyam, ‘... A Reply,’ ibid., 90–8, esp. 93. Cf. Hyam, Empire and Sexuality, 7, 17.
53 Hyam, Empire and Sexuality, 215.
54 Ibid., 160–9. The Marlowe quotation comes from The Jew of Malta (ca. 1588).
55 A.J.P. Taylor, English History, 1914–1945 (Oxford, 1965), 431. Similarly, relations between Britain and Ireland are still overshadowed by a sense of outrage that government failed the Irish people during the Great Famine of 1845–9. The death rate in an earlier famine, that of 1740–1, was at least as great, but nobody now expects that an eighteenth-century government would have even tried to relieve distress, and the episode is almost forgotten. Cf. David Dickson, Arctic Ireland (Belfast, 1997).
56 A.J.P. Taylor, Origins of the Second World War (Harmondsworth, 1964 ed.), 27; George Radwanski, Trudeau (Scarborough, Ont., 1979 ed.), 55–6 and cf. R. Overy, Why the Allies Won (New York, 1996 ed.), 282–313.
57 Rosalind Mitchison, A History of Scotland (London, 1970), 175.
58 Taylor, Origins of the Second World War, 29. Cf. Theodore H. White, ‘The End of the Postwar World,’ in White, The Making of the President 1972 (New York, 1973), prologue.

7. Significance

1 Jenny Wormald, ‘James VI and I: Two Kings or One?’ History 68 (1983): 187.
2 Michael Oakeshott, Rationalism in Politics and Other Essays (London, 1962), 154; Robert Grant, Thinkers of Our Time: Oakeshott (London, 1990), 100–1; E.H. Carr, What Is History? (Harmondsworth, 1964 ed.), 12 and cf. K. Jenkins, On ‘What Is History?’ (London, 1995), 45–7; Michael Oakeshott, Experience and Its Modes (Cambridge, 1933), 42–3; J.L. Hammond, C.P. Scott of the Manchester Guardian (London, 1934), 96.

Notes to pages 190–5 / 287

3 G.M. Trevelyan, Clio: A Muse and Other Essays (London, 1913); J.R. Seeley, Expansion of England (London, 1914 ed.), 189; Niall Ferguson, ed., Virtual History: Alternatives and Counterfactuals (London, 1998 ed.), 2. The great danger with the counter-factual is that it may slide into the merely declamatory. Thus, in a schoolboy essay in 1906, Rupert Brooke proclaimed that if William III had not become king in 1688, ‘Canada, the most satisfactory part of our Empire[,] would certainly now be French’; Christopher Hassall, Rupert Brooke: A Biography (London, 1972 ed.), 91. I owe this quotation to Ann Barry.
4 For an alternative perspective on ‘relevance,’ see M.M. Postan, Fact and Relevance: Essays on Historical Method (Cambridge, 1971), esp. 48–64. Robin W. Winks, The Relevance of Canadian History (Toronto, 1979), offers examples, but no definition.
5 Sunday Tribune (Dublin), 2 August 1998; W.L. Shirer, Rise and Fall of the Third Reich (London, 1988 ed.), 1036.
6 D.H. Akenson, The Irish in Ontario: A Study in Rural History (Kingston and Montreal, 1984), 331. For other comments on ‘significance,’ see C.B. McCullagh, Justifying Historical Descriptions (Cambridge, 1984), 198–9 and Arthur Danto, Analytical Philosophy of History (Cambridge, 1965), 11. William H. Dray offers a parallel discussion of ‘importance,’ in On History and Philosophers of History (Leiden, 1989), 73–91.
7 Kenneth D. Brown, Britain and Japan: A Comparative Economic and Social History (Manchester, 1998), 12; Robert Bothwell, I. Drummond, and J. English, Canada, 1900–1945 (Toronto, 1987), 154.
8 D.H. Fischer, Historians’ Fallacies (New York, 1970), 100, 69.
9 K.O. Morgan, Wales 1880–1980: Rebirth of a Nation (Oxford, 1982 ed.), 208; A.J.P. Taylor, English History (Oxford, 1965), 485. The Shakespeare quotation is from Macbeth; and cf. E.H. Horne, The Significance of Air War (London, 1937), 90–9.
10 R. MacGregor Dawson, William Lyon Mackenzie King: A Political Biography: I, 1874–1923 (Toronto, 1958), 252; C.P. Stacey, A Very Double Life: The Private World of Mackenzie King (Toronto, 1977 ed.), 161; Ged Martin, ‘Mackenzie King, the Medium and the Messages,’ British Journal of Canadian Studies 4 (1989): 125; and cf. R.M. Dawson, The Conscription Crisis of 1944 (Toronto, 1961), 84. I owe the story of the typing error to Dr James A. Gibson, who kindly permitted me to cite it.

288 / Notes to pages 195–206

11 William M. Baker, ‘“So What?”: The Importance of the Lethbridge Strike of 1906,’ Prairie Forum 12 (1997): 295–300; H.S. Ferns and Bernard Ostry, The Age of Mackenzie King (Toronto, 1976), 73.
12 G.R. Elton, The Practice of History (Sydney, 1967), 101.
13 William Ferguson, Scotland: 1689 to the Present (Edinburgh, 1978 ed.), 51–3.
14 A.J. Crosbie, ‘Time and Geography,’ Shadow 4 (1987): 59–65; L. Colley, Britons: Forging the Nation (London, 1994 ed.), 18.
15 D. Creighton, John A. Macdonald: The Old Chieftain (Toronto, 1965 ed.), 484; J.M. Beck, The History of Maritime Union: A Study in Frustration (Fredericton, 1989) and Ernest R. Forbes and D.A. Muise, eds, The Atlantic Provinces in Confederation (Toronto, 1993), 461–3; W.P. Morrell, The Provincial System in New Zealand, 1852–1876 (Christchurch, 1974 ed.), 107–15, 221–85; O.D. Skelton (ed. G. MacLean), Alexander Tilloch Galt (Toronto, 1966 ed.), 283; Norman L. Nicholson, The Boundaries of the Canadian Confederation (Toronto, 1979), 129–38. Alvin Finkel, The Social Credit Phenomenon in Alberta (Toronto, 1989) argues that Social Credit thinking initially contained a substantial left-wing element. But Alberta, unlike Saskatchewan, did not prove fertile soil for such ideas.
16 M.B. Jansen, ed., The Cambridge History of Japan: IV, The Nineteenth Century (Cambridge, 1989), 571.
17 I.C. Campbell, A History of the Pacific Islands (Berkeley, 1989), 152–4.
18 C.A. Bayly, Imperial Meridian (London, 1989), 23.
19 D.A. Low, Lion Rampant: Essays in the Study of British Imperialism (London, 1973), 8; J. Darwin, Britain and Decolonisation (Basingstoke, 1988), 102.
20 J.R. Miller, Skyscrapers Hide the Heavens: A History of Indian–White Relations in Canada (Toronto, 1989), 48; O. Dickason, Canada’s First Nations (Toronto, 1992), 63. For the general problem, see Bruce G. Trigger, Natives and Newcomers: Canada’s ‘Heroic Age’ Reconsidered (Kingston and Montreal, 1985), 231–51.
21 Bruce G. Trigger, The Children of Aataentsic: A History of the Huron People to 1660 (Kingston and Montreal, 1987 ed.), 31–2; L.F.S. Upton, ‘The Extermination of the Beothucks of Newfoundland,’ Canadian Historical Review 58 (1977): 134; Ralph Pastore, ‘The Collapse of the Beothuck World,’ Acadiensis 19 (1989): 55.
22 John S. Milloy, The Plains Cree: Trade, Diplomacy and War, 1790 to 1870 (Winnipeg, 1988), 72–3; Gerald Friesen, The Canadian Prairies: A History (Toronto, 1984), 15.
23 C.W. de Kiewiet and F.H. Underhill, eds, Dufferin–Carnarvon Correspondence, 1874–1878 (Toronto, 1955), 125.
24 M. Kishlansky, A Monarchy Transformed (London, 1996), 144.

Notes to pages 207–15 / 289

25 Worcester Herald, 10 November 1849; Nicholas C. Edsall, Richard Cobden: Independent Radical (Cambridge, Mass., 1986), 12; C. Ballstadt et al., eds, Susanna Moodie: Letters of a Lifetime (Toronto, 1985), 170; Sandra Gwyn, The Private Capital: Ambition and Love in the Age of Macdonald and Laurier (Toronto, 1989 ed.), 235–59; J. Gathorne-Hardy, The Public School Phenomenon, 597–1977 (London, 1977), 109; Dale C. Thomson, Alexander Mackenzie: Clear Grit (Toronto, 1960), 90.
26 Brian Brivati, Hugh Gaitskell (London, 1997), 432–4.
27 J.A. Spender and C. Asquith, Life of Herbert Henry Asquith, First Lord Oxford and Asquith (2 vols, London, 1932), 1: 66.
28 A.L. Rowse, The Use of History (London, 1946), 127.
29 J.E. Neale, Queen Elizabeth I (Harmondsworth, 1960 ed.), 57. The strength of traditional Catholicism in Reformation England has been emphasized by Eamon Duffy in The Stripping of the Altars: Traditional Religion in England, 1400–1580 (New Haven, 1992). The uprisings of 1536, 1549, and 1569 all seem to have been inspired partly by adherence to the old faith. Only Wyatt’s rebellion of 1554 could be said to have been driven by Protestantism: it was very localized and also directed against Mary’s Spanish marriage. In Dutch history, the death of Stadtholder William II in November 1650, just four months after he had seized control of the United Provinces, may be regarded as a similar landmark. Jonathan Israel, The Dutch Republic: Its Rise, Greatness, and Fall, 1477–1806 (Oxford, 1998 ed.), 595–609.
30 McCullagh makes the point in closely similar terms: ‘[O]ne has to imagine what would have happened if the world had been without the event or state of affairs whose causal significance is being assessed. There is no need to imagine a substitute for that event or state of affairs. Its actual significance is found by comparing what would have happened, as far as one can imagine, without it and what actually did happen’ (McCullagh, Justifying Historical Descriptions, 199). But McCullagh is writing of a controversy among economic historians over the impact of railroads on the American economy, where the imagined difference is between railroads and no railroads. In most cases, the choice must lie among positive alternatives; e.g., if James VI had not become king of England, some other monarch would have been found.
31 Arthur Conan Doyle (ed. S.C. Roberts), Sherlock Holmes: Selected Stories (London, 1985 ed.), 25; Oliver MacDonagh, ‘Ireland as Precursor,’ in John Eddy and Deryck Schreuder, eds, The Rise of Colonial Nationalism (Sydney, 1988), 88.
32 A.T.Q. Stewart, The Narrow Ground (London, 1989 ed.), 11.
33 V.E. Durkacz, The Decline of the Celtic Languages (Edinburgh, 1983); Reg

290 / Notes to pages 215–21

Hindley, The Death of the Irish Language: A Qualified Obituary (London, 1990); Brian Ó Cuiv, ed., A View of the Irish Language (Dublin, 1969); Peter Thomas, Strangers from a Secret Land (Toronto, 1986), 216; G.M. Story, W.J. Kirwin, and J.D.A. Widdowson, eds, Dictionary of Newfoundland English (Toronto, 1992), 7–8, 239.
34 Margaret Prang, ‘Clerics, Politicians and the Bilingual Schools Issue in Ontario,’ Canadian Historical Review 41 (1960): 281–307; Marilyn Barber, ‘The Ontario Bilingual Schools Issue: Sources of Conflict,’ Canadian Historical Review 47 (1966): 227–48.
35 Charles Townshend, The British Campaign in Ireland, 1919–1921: The Development of Political and Military Policies (Oxford, 1978 ed.), 170–1; David Omissi, Air Power and Colonial Control: The Royal Air Force, 1919–1939 (Manchester, 1990), 14–17.
36 R.F. Foster, Modern Ireland, 1600–1972 (London, 1988), 569.
37 David McCrone, Understanding Scotland: The Sociology of a Stateless Nation (London, 1992), 1; Graeme Morton, ‘What If? The Significance of Scotland’s Missing Nationalism in the Nineteenth Century,’ in D. Brown, R.J. Finlay, and M. Lynch, eds, Image and Identity: The Making and Re-making of Scotland through the Ages (Edinburgh, 1998), 157–76, esp. 163. Morton develops his interpretation further in Unionist Scotland: Governing Urban Scotland, 1830–1860 (Edinburgh, 1999), esp. 48–63.
38 Robert Rhodes James, The British Revolution: British Politics, 1880–1939 (2 vols, London, 1977); H.M. Hyndman, The Record of an Adventurous Life (London, 1911), 20. Chesterton’s line comes from his poem, ‘The Secret People.’
39 R.R. Davies, The Revolt of Owain Glyn Dwr, 6; Glanmor Williams, Renewal and Reformation: Wales c. 1415–1642 (Oxford, 1993 ed.), 68. The pre-Contact Amerindian settlement of Cahokia in Illinois has also been likened in size to medieval London. Dickason, Canada’s First Nations, 49.
40 George Rudé, Paris and London in the Eighteenth Century: Studies in Popular Protest (New York, 1971 ed.), 35–8; Roy Porter, London: A Social History (London, 2000 ed.), 248–9.
41 Daniel Defoe, A Tour Through the Whole Island of Great Britain (Harmondsworth, 1971 ed.), 54; E.A. Wrigley, ‘A Simple Model of London’s Importance in Changing English Society and Economy 1650–1750,’ Past and Present 37 (1967): 44–70, esp. 59.
42 G.A. Wilkes, A Dictionary of Australian Colloquialisms (Sydney, 1985 ed.), 257; Russel Ward, The History of Australia: The Twentieth Century, 1901–1975 (London, 1978), 360.
43 Joseph Hamburger, James Mill and the Art of Revolution (New Haven, 1963).

Notes to pages 221–32 / 291

44 F.B. Smith, The Making of the Second Reform Bill (Cambridge, 1966), 225.
45 Paul Smith, ed., Lord Salisbury on Politics (Cambridge, 1972), 96.
46 Norman McCord, The Anti-Corn Law League, 1838–1846 (London, 1968 ed.), 75–7, 178–9.
47 L.M. Cullen, The Emergence of Modern Ireland, 1600–1900 (Dublin, 1993 ed.), 213.
48 P. Calvocoressi et al., Penguin History of the Second World War (Harmondsworth, 1999 ed.), 1243–71. As will be clear from the notes to this section, this attempt to assess significance is based upon generally available texts and makes no claim to specialist ‘research.’
49 A.W. Purdue, The Second World War (Basingstoke, 1999), 7–12.
50 G. Barraclough, Introduction to Contemporary History (Harmondsworth, 1967 ed.), 28 offers the mildly eccentric claim that the Japanese began the Second World War in July 1937, but without realizing that they were doing so.
51 This story was related to me by the late W.K. Hancock.
52 W.S. Churchill, The Second World War, III: The Grand Alliance (London, 1950), 543. (‘... after all when you have to kill a man it costs nothing to be polite.’)
53 A. Bullock, Hitler (Harmondsworth, 1962 ed.), 490. John Lukacs invokes a double past future, arguing that by 1937 Hitler believed that his own health was failing and that British and French rearmament would tilt the balance against Germany after 1943. J. Lukacs, The Hitler of History: Hitler’s Biographers on Trial (London, 2000 ed.), 142–8.
54 Richard Overy, Why the Allies Won (New York, 1996 ed.), 208–44.
55 John Keegan, The Battle for History: Re-fighting World War Two (London, 1995), 18 and cf. 70.
56 Churchill, Second World War, III: The Grand Alliance, 157–8, 172.
57 Calvocoressi et al., Penguin History of the Second World War, 224. The point is also emphasized by Lukacs, The Hitler of History, 153–4, but he is unconvincing in his dismissal of ‘the somewhat jejune speculation about the formalities of the German declaration of war on the United States.’ The suggestion that Hitler ‘could hardly betray his Japanese ally by welshing on the principal item in their alliance’ attributes an unduly idealistic quality to the Führer’s attitude towards diplomatic agreements.
58 Again, this assessment of the significance of the Normandy landings draws heavily upon a standard source, Chester Wilmot, The Struggle for Europe (London, 1959 ed.), an account recently praised by Keegan, Battle for History, 77. Also useful are Carlo d’Este, Decision in Normandy (London, 1994 ed.) and Overy, Why the Allies Won, 134–79.
59 R.F. Bennett, Intelligence Investigations (London, 1996), 23, 26–9.

60 Geoffrey Roberts, Victory at Stalingrad: The Battle That Changed History (Harlow, 2002), 5; d’Este, Decision in Normandy, 108.
61 Roberts, Stalingrad, 14; Purdue, Second World War, 146; Overy, Why the Allies Won, 136–7.
62 A. Danchev and D. Todman, eds, War Diaries, 1939–1945: Field Marshal Lord Alanbrooke (London, 2001), 554.
63 Anthony Cave Brown, Bodyguard of Lies (London, 1986 ed.), 409–678.
64 One gap in the blanket of secrecy was noted by a British observer who reported that officers of the Royal Canadian Air Force, although ‘brave, friendly and resourceful,’ were entirely anglophone, and incapable of censoring the letters home written by volunteers from Quebec. Even this educated Englishman noted that ‘their peculiar French patois was a little difficult to understand,’ but he feared that the poor relations ‘between them and their English-speaking officers boded ill for the tranquillity of post-war Canada.’ John Colville, The Fringes of Power: Downing Street Diaries, 1939–1955, II: October 1941–1955 (London, 1987 ed.), 115–16.
65 Wilmot, Struggle for Europe, 271; Rowse, Use of History, 163–4. Shakespeare was also mobilized in support of the Normandy landings in the form of the celebrated Olivier movie of Henry V. Presumably in support of national unity, this version omitted a subplot dealing with internal subversion aimed at the king. It was largely filmed in neutral Ireland, and a prominent County Wicklow landmark, the Sugarloaf, can be seen in the Agincourt scenes. Montgomery’s dramatic challenge to his commanders at the pre-invasion briefing of 7 April 1944 (‘If anyone has any doubts in his mind, let him stay behind’) was probably an invocation of the king’s speech before Agincourt (‘He that hath no stomach for this fight / Let him depart’).
66 R. MacGregor Dawson, The Conscription Crisis of 1944 (Toronto, 1961) remains a key source, even though it was written as a fragment of a larger and incomplete biography of Mackenzie King. See also J.L. Granatstein, Canada’s War (Toronto, 1975), 333–81.
67 D.H. Fischer, Historians’ Fallacies (New York, 1970), 158–9. Brooke, often condemned for his pessimism, did foresee problems about close-range fighting in the bocage; d’Este, Decision in Normandy, 87.
68 Remarkably, in the first 24 hours of the invasion, only 36 German aircraft were sighted over the beachheads. But the subsequent campaign showed that the Luftwaffe had not gone away. Allied aircraft were also at risk from what is now called ‘friendly fire.’
69 Conrad Totman, A History of Japan (Oxford, 2000), 433.
70 Purdue, Second World War, 183.

71 See Jenny Wormald’s comment in the first paragraph of this chapter, and note 1 above.

8. Objections, Review, and Tailpiece

1 G.R. Elton, Political History: Principles and Practice (London, 1970), 120. Kingsley’s lines were addressed to his future wife who, fortunately, proved clever enough to write his biography. A posthumous collected edition of his verse changed the stanza to the better-known version, ‘let who will be clever.’
2 W.H. Dray, ‘On the Nature and Role of Narrative in History,’ in G. Roberts, ed., The History and Narrative Reader (London, 2001), 30.
3 Elton, Political History, 131–2, 138–9, 135, 143, 145.
4 R.W. Winks, ed., The Historian as Detective (New York, 1970), xvi.
5 Elton, Political History, 124.
6 William Dray, Perspectives on History (London, 1980), 93.
7 L.F. Crisp, Australian National Government (3rd ed., Hawthorn, Vic., 1973), 208.
8 Kenneth McRoberts, Misconceiving Canada: The Struggle for National Unity (Toronto, 1997), 230.
9 D.G. Creighton, ‘Sir John Macdonald and Kingston,’ Canadian Historical Association Annual Report, 1950: 74.
10 J.T. Saywell, Canada: Pathways to the Present (Toronto, 1994), esp. 158–62.
11 H. White, ed., Tropics of Discourse (Baltimore, 1978), 44, 33. Similar thinking underlined Barraclough’s definition of ‘contemporary history.’ G. Barraclough, Introduction to Contemporary History (Harmondsworth, 1967 ed.), 9–42, esp. 36.
12 J.A. La Nauze, Deakin, 1: 144; Becker, ‘Everyman His Own Historian,’ American Historical Review 37 (1932): 236.
13 J. Nehru, An Autobiography (London, 1942 ed.), 83.
14 Sunday Telegraph (London), 31 March 2002.
15 Hugh Foot, A Start in Freedom (London, 1964), 40.
16 David Leitch, ‘Explosion at the King David Hotel,’ in Michael Sissons and P. French, eds, Age of Austerity, 1945–51 (Harmondsworth ed., 1964), 58–85, esp. 62; J. Darwin, Britain and Decolonisation (Basingstoke, 1988), 115–20.
17 J.H. Plumb, ‘Introduction’ to C.R. Boxer, The Dutch Seaborne Empire (London, 1965), xv.
18 J.L. Granatstein, Who Killed Canadian History? (Toronto, 1998), 73. For a thoughtful rebuttal, see A.B. MacKillop, ‘Who Killed Canadian History? A View from the Trenches,’ Canadian Historical Review 80 (1999), 269–99.
19 White, ed., Tropics of Discourse, 29.

Index

Acton, Lord, 64 Adamnan, 50–2, 54 Afghanistan, 259 Africa, 64, 135, 151, 155, 180, 216, 258–9 Air power, 39–41, 127, 193–4, 217, 228–30, 235–7, 257–8, 292 Akenson, D.H., 192 Alberta, 195–6, 198–9, 225, 288 Alcuin of York, 162 Alline, Henry, 72 Amery, L.S., 141 Amulree, Lord, 51–2 Anne, Queen, 210 Anti-Corn Law League, 222 Argentina, 166 Asquith, H.H., 42, 44, 86, 89, 94, 119, 129, 184, 210 Australia, 15, 22–5, 28, 41, 61, 94, 113, 118–19, 129, 131–2, 138, 159, 201, 220, 241, 250–1, 254, 274 Austria, 42–4, 120, 123–4, 228 Bagehot, Walter, 116 Bagot, Sir Charles, 137–8 Baker, William M., 134, 190, 195–6 Barraclough, Geoffrey, 283, 291, 293

Bayly, C.A., 60–1, 199–200 Beaverbrook, Lord, 40, 131 Becker, Carl, 254–5, 263 Bede, Venerable, 50–2 Begin, Menachem, 260 Belfast, 143, 160, 214, 230 Belgium, 42, 82, 115, 124, 184, 218, 232, 239 Ben-Gurion, David, 260 Bennett, Ralph, 7, 45, 74, 232, 248 Benson, John, 161 Bew, Paul, 177, 286 Birmingham, 159, 219, 222 Blainey, Geoffrey, 118 Blake, Edward, 273 Blake, Raymond B., 173 Bloomfield, Janet, 154 Borden, Robert, 114, 203 Borneo, 176 Bose, S.C., 232 Bosnia, 42, 44, 155 Boston, 55, 159 Bourget, Ignace, 128 Bowell, Mackenzie, 209 Braudel, Fernand, 164–5, 196, 212 Brazil, 227 Brebner, J.B., 47–8

British Columbia, 125, 134, 143, 202 Brooke, Sir Alan, 113, 233, 292 Brooke, Rupert, 287 Brown, Alice, 167 Brown, George, 19, 30, 89, 95–7, 99, 102, 134, 137 Brown, R. Craig, 173 Bryce, James, 140 Bulgaria, 117, 209 Burma, 200 Buthelezi, M.G., 142 Butler, R.A., 90 Butterfield, Herbert, 174, 202 Byng, Lord, 105 Cain, P.J., and A.G. Hopkins, 60–4 Calvocoressi, Peter, 17–18, 231 Cambridge, 38, 58, 86, 124, 175, 191, 218, 264, 278, 284 Cameron, M.C., 130 Campbell, Kim, 173 Canada: aboriginal people, 73, 161–2, 184, 201, 241, 252; Charlottetown Accord, 54, 102, 139–40; Charter of Rights, 156–7, 178; clergy reserves, 67, 69; Confederation, 1858 initiative, 45, 48, 97, 133, 198, 269; Confederation (1867), 19, 21, 30–1, 45–9, 79–80, 92–3, 95–103, 123, 125–6, 130, 134–6, 142, 173, 203, 205, 279; conscription crisis (1944), 19, 235–6; Durham Report, 24–5, 144; elections (1896), 103–4, 209; — (1917), 104, 129, 209; — (1926), 105; — (1988), 157; First World War, 42; franchise, 161; Free Trade Agreement 1988, 156–7; French language in, 24, 143–5, 202–3, 223; future of, 125–6, 138, 142–6, 282; Georgian Bay canal, 134, 280; Hudson’s Bay

Company, 66, 184; immigration, 138, 171, 205; Intercolonial Railway, 20, 49, 63, 101, 125, 133; Meech Lake Accord, 42, 54, 92, 139, 198; October crisis, 214; Pacific railways, 64, 112, 114, 125–6, 133, 199, 210; population, 200–3; rebellions 1837, 15, 41, 66–73, 202–3, 205, 256, 271; Red River troubles, 66, 71, 177, 202, 256; Rowell-Sirois commission, 46; St Lawrence Seaway, 63; Second World War, 226–7, 229, 235, 239; theories of imperialism, 63; uprising 1885, 210; Winnipeg General Strike, 58, 195 Canning, George, 206 Cannon, John, 35, 176–7, 271 capital punishment, 28–9, 71, 89–91, 267, 273 Carleton, Guy, 143–4, 146 Carnarvon, Lord, 136 Carr, E.H., 8, 190 Carson, Sir Edward, 116, 178 Cartier, G.-E., 26, 30, 97, 142 Cartwright, Richard, 265 Castlereagh, Lord, 206 Catherine of Aragon, 211 Cauchon, Joseph, 98–9 Ceylon, 200 Chamberlain, Austen, 280 Chamberlain, Joseph, 128 Chamberlain, Neville, 121–2, 184, 228, 250 Charles I, 168, 206 Charles II, 286 Chatham, 30 Chesterton, G.K., 218, 290 China, 56, 159, 163, 226, 240 Chown, Dwight, 173 Chrétien, Jean, 253

Churchill, Clementine, 119 Churchill, Winston, 25, 40–1, 82, 86, 93–4, 113, 121, 139, 146, 163, 227, 229–30, 238, 252, 291 Coaker, William, 87–8, 92, 94–5, 106, 241 Cobbett, William, 219 Cobden, Richard, 207, 212 Codrington, Christopher, 253 cold war, 23, 153–4, 164, 192, 194, 239 Colley, Linda, 27–8, 198, 202 Collingwood, R.G., 8, 267 Collins, Michael, 216 Connolly, James, 179 Cook, Ramsay, 173 Craig, Sir James, 68 Creighton, Donald, 57, 123, 136, 146, 252–3 Crewe, Lord, 116–17 Cromwell, Oliver, 62, 112, 166–8 Cullen, L.M., 224 Curzon, Lord, 116, 277 Cyprus, 135 Czechoslovakia, 121–2, 124, 183, 184, 216, 228, 278 Dafoe, J.W., 103–4, 124, 145 Daladier, E., 121 Dalhousie, Lord, 41, 68 Dandurand, Raoul, 172 Darwin, Charles, 126, 175 Darwin, John, 153, 200 Deakin, Alfred, 15–16, 254, 255 decisions, 3–5, 8, 32, 43, 73–6, 79–107, 191, 198, 204, 216, 220, 229–30, 232–3, 236, 238, 250, 257, 272; decision-making process, 81, 84, 92, 96 de Cosmos, Amor, 202 de Coulanges, F., 13 Defoe, Daniel, 220

democracy, 24, 64, 102–3, 153–8, 161–2, 173, 177–8, 185, 226, 229 demography. See population Denmark, 115, 124, 184 Derby, Lord, 126, 278 Derriennic, Jean-Pierre, 157 de Staël, Madame, 180 de Valera, Eamon, 227 Dickson, James, 101 Diefenbaker, John, 14, 90, 138–9 Dillon, John Blake, 165–6, 184 Disraeli, Benjamin, 18, 113, 177, 184–5, 210, 277, 286 Dorion, A.-A., 97–8 Dowding, Sir Hugh, 39–41, 193 Doyle, Arthur Conan, 212, 222 Dray, W.H., 247, 250, 266, 287 Dufferin, Lord, 143, 170, 202 Duncombe, Charles, 66 Dunkin, Christopher, 97, 123, 136 Duplessis, Maurice, 123, 154 Edward VI, 205–6 Edward VII, 45, 172 Egypt, 260 Eisenhower, Dwight, 86–7, 233, 237–8 Eldridge, Colin, 60 Elgin, 8th earl of, 15, 19, 92, 94, 144 Elgin, 9th earl of, 137 Elizabeth I, 167, 189, 205, 210–12 Ellis, Ruth, 90–1 Elton, G.R., 67, 196, 245, 247–9, 263, 269 English Civil War, 61–2, 167–8, 176, 206, 221 Essex, 81; earl of, 206 ethical values, 6, 13, 61, 67, 74, 174–86, 202, 226–7, 241, 245, 252–3, 255–6

evidence, 3, 4, 13–32, 35, 40, 70–1, 72–3, 83, 85, 120–1, 246, 264, 267, 276 Falkland Islands, 27, 166, 284 Fallon, Michael, 215 Ferguson, Niall, 191 Fergusson Blair, A.J., 207–8 Fiji, 79 Finland, 215, 227 First World War, 40, 193; and Canada, 42, 85, 104, 124, 145, 173, 203–5; Gallipoli, 23; and Ireland, 179–80; and Newfoundland, 87–8; outbreak, 42–3, 46, 53, 82, 86, 94, 115, 119, 122, 124, 131, 163–4, 184, 245, 247, 264; outcome, 155, 160, 185, 192, 217, 226, 231, 240, 245 Fischer, D.H., 8, 39, 48, 65–6, 193 Fisher, Charles, 103 Fisher, Geoffrey, 28 Fisher, Jacky, 40 Fleming, Alexander, 49 Foreign Office, British, 93, 115 Fournier, Pierre, 146 France, 27, 36, 43, 46, 124, 126, 158, 203, 219, 240; First World War, 42, 82, 115, 119–20; revolutions, 36, 197, 218, 222, 258; Second World War, 39–40, 189, 226–7, 250 Franz Ferdinand, 42–4, 53, 247 Frederick the Great, 120, 123 French Canadians. See Quebec Fukuyama, Francis, 155–8 futures, 4–6; continuation of past, 123–4, 127, 147; catastrophic, 139–40; inspirational, 138–9; past futures, 5–6, 107, 111–48, 151–2, 179, 191, 220, 228, 233, 238, 252, 257, 259, 275, 280, 291

Gaitskell, Hugh, 207–9 Gallagher, J., 62–4 Galt, Alexander, 130, 198 Gambetta, Diego, 84 Gandhi, Mahatma, 113, 232, 258 Gardiner, S.R., 176–7 George V, 171 George VI, 28, 170, 172, 194 Germany, 13, 30, 42, 64, 79, 93, 120–1, 158, 163, 176, 184–5, 215, 218, 226–40, 283; First World War, 42, 53, 82, 124; Second World War, 226–40 Gilbert, Martin, 40 Gilbert, W.S., 190–1, 193 Gladstone, W.E., 15, 20, 31, 80–1, 112, 114, 117, 128, 138, 143, 207, 209–10, 253 Glyndwr, Owain, 276 Goderich, Lord, 206 Goebbels, Josef, 58 Gogarty, Oliver St John, 151 Goodfellow, C.F., 136–7 Gordon, Arthur, 79–80, 99–100 Granatstein, J.L., 261 Granville, Lord, 89 Gray, J.H., 21, 265 Greece, 41, 163, 230 Green, J.R., 13 Grey, Earl, 26 Grey, Sir Edward, 46, 89, 115, 124, 131 Grey, George, 140 Griffith, Samuel, 129 Guyana, 253 Haig, Sir Douglas, 120, 169 Haliburton, R.G., 102 Haliburton, T.C., 143 Hamilton, Alexander, 128–9 Hamilton, P.S., 133 Hancock, W.K., 64, 291

Handel, G.F., 66 Harper, Elijah, 42 Hartley, L.P., 169 Hatfield, Richard, 104–5 Hazlehurst, Cameron, 82 Head, Sir Edmund, 19, 89, 92, 135 Head, Sir Francis B., 73 Heath, Edward, 105 Henripin, Jacques, 145–6 Henry IV, 163 Henry VII, 127 Henry VIII, 127, 170, 206, 211 Henson, Hensley, 20 Herbert, Sidney, 207 Hetton-le-Hole, 161 Hexter, J.H., 8 Hill, Christopher, 62 historical devices: bourgeois hegemony, 58, 64, 71; Cashel cathedral, 55, 67; Cleopatra’s Nose, 39, 42–3, 279; Coldingham Fallacy, 50–4, 64–5, 74, 83, 103, 245, 256; contemporary history, 283, 293; counterfactual, 191–2, 196, 208, 210–13, 217, 223, 228–9, 235, 248–9, 287; Dowding’s pencil, 39–41, 86, 228, 241, 268; factors, 67, 74, 196; gunpowder and match, 42–4, 53, 59, 73; judgment of history, 13–14, 121; lobster of Kirriemuir, 61–2, 64–5, 74, 270; long- and short-term causes, 44–5; Mandy’s riposte, 29–30, 38, 46, 73, 95, 102, 195, 278; oxymoron, 62; post hoc, ergo propter hoc, 45–9, 106, 164, 249; Ramses II, 20; reification, 4, 58–61, 64; relevance, 191; scientific analogy, 47–9, 74, 195, 246; smoking gun, 48, 67; statistical correlations, 69, 70; turning points, 190; Viking Syndrome, 54–8,

70, 74, 83, 245, 247–8; Wilson’s heckler, 30; witches of North Berwick, 37, 64–5 history: antecedent events, 35–6, 44, 46, 49, 73, 111, 189, 246; explanation in, 7, 13, 31–2, 85, 189, 197, 246–9; impossibility of explanation in, 4, 6, 13–14, 35–76, 102, 147, 196, 238, 245, 248–50; knowledge in, 264; narrative in, 7, 35, 45, 71, 267 Hitler, Adolf, 14, 44, 120–1, 124, 165, 172, 176, 184–5, 190, 192, 196, 225–39, 250, 277, 291 Hobhouse, J.C., 118 Holmes, Sherlock, 212–13, 220–1, 223 homosexuality, 22, 176, 182 Horne, Donald, 113 Horne, E.H., 193–4 Howe, C.D., 17 Howe, Joseph, 61–2, 99, 162, 171 Hungary, 122 Huskisson, William, 206 Hutchison, Bruce, 142 Hyam, Ronald, 60, 181–3 Hyde, Douglas, 87 Hyndman, H.M., 218 Illinois, 16 ‘imperialism,’ 60–4 India, 56, 63, 113, 116, 126, 137–8, 159, 199–200, 206, 216, 231–2, 252 Iran, 156 Iraq, 259 Ireland, 36, 87, 115–16, 132–3, 138, 143, 151, 159–60, 163, 165–6, 176, 177–80, 200, 209–10, 213–19, 247, 255, 284, 286, 292; famine, 55, 247, 286; Home Rule, 24, 116, 139, 217, 222, 281; Northern Ireland, 143,

160, 163, 177–9, 213–14, 223, 240; Partition, 132, 160, 223–4; Second World War, 41, 227, 233 Islam, 38–9, 116, 254, 257–8 Ismay, H., 232–3 Israel, 256, 259–60 Italy, 43–4, 84, 227 James VI and I, 36–7, 169–70, 172, 183–4, 189, 191–2, 197, 212, 289 Japan, 23, 112, 139, 156, 159, 192–3, 199, 226–7, 230–2, 291 Jinnah, M.A., 279 Joad, C.E.M., 155 John, King, 13, 164 Johnson, Andrew, 208 Johnson, Daniel, 251 Johnson, Lyndon B., 208 Johnson, Pierre-Marc, 251 Joynson-Hicks, Sir William, 89–90 Kaplan, John, 267 Kean, Maurice, 163 Keegan, John, 229 Kennedy, John F., 16, 208, 253, 267 Kenya, 180–3 Kenyatta, Jomo, 180 Keppel-Jones, Arthur, 126–7, 192 Kerr, Sir Robert, 118 Khrushchev, Nikita, 112–13, 156 King, W.L. Mackenzie, 17, 18–20, 85, 93, 105–6, 159, 172, 194–5, 241, 287, 292 Kingsley, Charles, 175–7, 246, 293 Kipling, Rudyard, 66 Kishlansky, Mark, 286 Kitchener, Lord, 94, 131 Knatchbull-Hugessen, Edward, 64 Knox, John, 27, 266 Koselleck, Reinhart, 271, 284

Kruger, Paul, 128, 278 Labaree, Benjamin, 65 Labour Party, British, 30, 89, 113, 209, 219–20 Laflèche, Louis, 100–1 LaFontaine, L.-H., 138 language: English, 158–9, 170, 215–16, 220; Irish, 214–16; meaning in, 25–9, 58, 65–6, 180, 186, 189, 234, 266; translation and, 180, 186 Lansdowne, Lord, 119 Lapsley, Gaillard, 9, 261, 264 Laurier, Wilfrid, 19, 103–4, 114, 142 Leacock, Stephen, 112 Lee, J.J., 178–9, 192, 222–3 Lennox-Boyd, Alan, 121 Lévesque, René, 253 Lichnowski, Prince, 123, 278 Lightfoot, John, 38, 74, 117, 249 Lincoln, Abraham, 59–60, 111–12, 122, 124–5, 208 Lindisfarne, 162–3 Linlithgow, Lord, 113, 252 Liverpool, 55, 81, 222 Liverpool, Lord, 114, 206 Lloyd George, D., 119–20, 131 Locarno Treaty, 280 Logan, F. Donald, 56 London, 26–8, 63, 117, 130, 159, 169, 190, 196, 219–22, 228–9, 255, 290 Low, Anthony, 200 Lower, Arthur, 21, 29, 44–5, 68, 85, 145, 204, 248 Lukacs, John, 291 Luther, Martin, 249 Lyttelton, Edith, 87 Macaulay, T.B., 116, 176 McCalla, Douglas, 70, 72

McCrone, David, 167, 217 McCullagh, C. Behan, 175, 268, 289 MacCurtain, Tomas, 216 MacDonagh, Oliver, 213 MacDonald, Donald, 96, 146 McDonald, Ian, 88, 94, 106 Macdonald, John A., 18–19, 21, 26, 30–1, 64, 89, 93, 97–9, 101–2, 104, 128, 130, 134, 136–7, 143–4, 146, 171, 198, 208–10, 265, 279 McGee, D’Arcy, 136 McGrath, Charles A., 52 MacKay, Ian, 58 Mackenzie, Alexander, 21 Mackenzie, W.L., 66, 68–9, 72–3 McMahon, William, 274 Macmillan, Harold, 19 MacNab, Sir Allan, 19, 70 Malaya, 135 Manchester, 53, 159, 190, 219, 222 Mandelbaum, Maurice, 80 Manitoba, 23, 42, 54, 100, 103–5, 199, 209, 223 Mann, Golo, 163 Marlowe, Christopher, 183, 286 Martin, Chester, 47–8 Martineau, Harriet, 92 Mary, Queen (d. 1953), 172 Mary, Queen of Scots, 27 Mary I, 210–12, 289 Mary II, 211 Massingham, H.W., 164 Maurice debate (1918), 119–20 Meighen, Arthur, 105–6 Melbourne, Lord, 26, 73 Menzies, Robert, 28, 93, 166, 220, 251 Meredith, Edmund, 207 Merriman, J.X., 141 Mexico, 46, 166, 227 Mikardo, Ian, 113

Mill, J.S., 268 Milligan, Spike, 160 Milloy, John, 201 Milner, Alfred, 122–3 Mink, Louis O., 267 Mitchison, Rosalind, 185 Monck, Lord, 19, 207 Montgomery, Bernard, 234, 292 Montgomery, L.M. 171 Montreal, 64, 66, 68, 72, 144, 146, 198–9, 223; Bank of, 26; riots (1849), 15, 94 Moodie, Susanna, 172, 207 Moore, D.C., 57–8 Morgan, K.O., 193, 231 Morris, Grace, 193 Morrison, Herbert, 89–90 mortality, 205–12 Morton, Graeme, 217–18, 290 Morton, Suzanne, 58 Moss, H.W., 207 Mountbatten, Lord, 267, 278–9 Mulroney, Brian, 54, 96 Mulroney, Mila, 43 Munich crisis (1938), 120–2, 124, 184, 228, 277 Murison, Barbara C., 284 Myhre, B., 56 Nash, Walter, 19 naval power, 17–18, 30, 52, 82, 86–7, 115, 227, 229–31, 235 Neale, Sir John, 211 Neatby, H.B., 85, 105–6, 194 Netherlands, 55, 115, 124, 155, 232, 289 New Brunswick, 45, 58, 79, 97–105, 125, 133, 135, 198, 201, 215, 269 Newcastle, duke of, 20, 45, 49, 92–3, 125, 133, 207

Newfoundland, 51–2, 55, 87–8, 92, 94–5, 97, 123–4, 135, 139, 143, 162, 173–4, 201, 205, 215, 241 Newman, John Henry, 91–2, 113–14, 123, 252 Newman, Peter C., 139, 163, 253 New Zealand, 19, 118, 138, 140, 159, 198, 227, 241 Nixon, Richard, 14, 16, 30 Norfolk, duke of, 206 Normanby, Lord, 93 Norris, R.R., 118–19, 123, 145 Norway, 55–6 Nova Scotia, 45, 58, 61, 72, 97–100, 102, 125, 133–5, 143, 159, 168, 170–1, 198, 205, 215, 229, 269, 280 Oakeshott, Michael, 35, 111, 189–90 O’Connell, Daniel, 36 O’Farrell, Patrick, 165 Ontario, 23, 198 Ontario Hydro, 52 Ontario schools, 215 Orwell, George, 107, 153, 165, 239 Overy, Richard, 233 Oxford, 52, 91, 114, 170, 190, 253 Pakistan, 80, 224, 278–9 Paley, William, 38–9 Palmerston, Lord, 126, 184 Palestine, 258–60 Papineau, L.-J., 41, 66, 68, 72 Parkes, Henry, 138 Parnell, Charles Stewart, 39, 209–10 Paterson, Lindsay, 167 Patton, George, 236 Pearse, Padraig, 138 Pearson, Lester B., 85, 114, 147, 171–2, 204 Peel, Robert, 18, 31, 117

Philip of Spain, 210–12 Platt, D.C.M., 60 Plumb, J.H., 260–1 Poland, 41, 44, 120, 122, 228, 250 Pole, Reginald, 211 Pontiac, 66, 271 Pope, Joseph, 21, 93 population, 55–6, 71, 199–205, 219–20, 247–8; Quebec, 145, 202–3, 282 Porter, Roy, 219 Portugal, 43 Pouliot, J.B., 101 present. See time Prince Edward Island, 89, 101–2 Princip, Gavrilo, 43, 247 Profumo Affair, 29 Purdue, A.W., 233, 239 Quebec, 24, 29, 41, 61, 68, 72–3, 95–8, 104, 123–4, 128, 130, 137–8, 143–6, 154, 157, 161, 172–3, 185–6, 198, 202–4, 209, 214–15, 217, 223, 251, 253, 292 Quebec City, 72, 98, 144, 146 questions, 3–5, 15, 22–3, 65–6, 68–9, 73; what? 66; what if? 191; wherefore? 65–7, 180, 193, 256; why? 3, 15, 22–3, 65–7, 74, 85, 180, 256; why what? 3, 5, 85, 93–6; why when? 3, 5, 68, 70, 74, 85, 93–6, 133, 238; why where? 68–9, 74; why who? 3, 69–70, 74 Ralston, J.L., 18–19 Read, Colin, 67, 69, 71 Reddaway, W.F., 120 Reform Act: 1832, 35–6, 57–8, 118, 207, 221; 1867, 112, 126, 221, 278; 1884, 161, 221 religion, 28–9, 65–7, 69–70, 72, 91–2,

99, 103–4, 114, 117, 126, 128, 132–3, 145, 153, 160, 169–74, 177–9, 183, 193–4, 198, 209–12, 215–16, 223–4, 240, 258, 277, 285, 289 Revel, Jean-François, 156 Rhodesia, 24, 127, 135–7 Ribbentrop, Joachim von, 228 Richard II, 22, 163, 210, 245, 261 Richmond, duke of, 41 Riel, Louis, 177, 202, 260 Rigby, S.H., 268 Ritchie, John, 277 Roberts, Andrew, 267, 279 Roberts, Geoffrey, 232–3, 264 Robinson, R.E., 60, 62–4 Roche, Sir Boyle, 115–16 Rogers, Frederic, 19, 92 Roosevelt, Franklin D., 80, 238 Rosebery, Lord, 222 Routledge, W.S., 182–3 Rowse, A.L., 210 ‘Rule Britannia,’ 27–8, 159 Rupert, Prince, 184 Russell, Bertrand, 86, 164 Russell, Lord John, 26, 93 Russell, W.H., 111–12, 133, 142–3 Russia, 42–3, 55, 112–13, 153–6, 194, 207, 215, 218, 226–7, 229–32, 238–40, 259 Ryerson, Egerton, 26 Sadat, Anwar, 260 Salisbury, Lord, 153, 221–2 Salter, F.R., 278 Samoa, 79–80 Sampson, Agnes, 37 Sarajevo, 42, 53, 247 Saskatchewan, 198–9, 225, 288 Saul, John Ralston, 163 Saywell, J.T., 253–4

Schreiner, W.P., 141 Scotland, 19, 27–8, 36–7, 50, 61, 64–5, 69, 118, 159–61, 166–9, 189, 192, 197, 215, 217–18, 230, 266, 270, 285; Jedburgh, 166, 267 Scott, C.P., 190 Scott, F.R., 93 Scott, Thomas, 177 Second World War, 28, 165, 184–5, 190, 197, 225–40, 250; Arnhem, 232, 235; Battle of the Atlantic, 17–18; Battle of Britain, 39–41, 139, 228–9, 239; and Canada, 17–19, 41, 203–4, 226, 233, 235–6, 292; Dieppe, 233, 239; Dunkirk, 229; intelligence, 18, 232; Normandy invasion, 17, 86–7, 189, 195, 204, 225, 227, 232–9, 291–2; outbreak, 44, 124, 226, 291; outcome, 192 Seeley, J.R., 112, 116, 123, 155, 191 Selborne, Lord, 140–1 Sellar, Robert, 145 Sellar, W.C., 155 September eleventh 2001, 255–60 Serbia, 42–4, 53 Shakespeare, William, 27–8, 65, 136, 193, 234–5, 266, 292 Shaw, G.B., 170 Sheridan, R.B., 117 Shirer, William L., 176, 277–8 significance, 7, 8, 151, 192–241, 245, 257–8, 287; negative, 196, 212–23, 231, 259 Silberrad, Hubert, 182 Simon, Sir John, 90 Simpson, John, 259 Smallwood, J.R., 174 Smith, Goldwin, 52–3, 112, 143, 145, 157, 166–7, 197, 276 Smith, Sydney, 163

Smuts, J.C., 126, 139, 141, 192 sociology, 57–8, 84 Somaliland, 217 Somerset, 14–15 South Africa, 87, 113, 122–3, 126–8, 136–7, 139–42, 159, 163, 171, 192, 201, 203, 223, 240–1 South Carolina, 59 Spain, 27, 62, 226 Stagg, Ronald, 69, 71 Stalin, Joseph, 192, 225–7, 231–2, 238 Stalingrad, 232–3 Stephen, James, 117 Stewart, A.T.Q., 177, 213, 286 Steyn, M.T., 113 Stow, George B., 22, 245 Strafford, earl of, 206 Sweden, 155 Switzerland, 158, 266 Sydney, Lord, 23 Taché, Etienne, 99 Tahiti, 199 Tardivel, J.-P., 145 Taylor, A.J.P., 30, 32, 39–40, 43, 122, 184–5, 193, 247–8, 250 Temple, Sir William, 55 Tennyson, Alfred, 139, 281 terrorism, 133, 157, 177, 180, 214, 252, 255–60 Texas, 16, 166 Thatcher, Margaret, 93 Thomas, Hugh, 156 Thompson, Sir John, 19, 209–10 Thomson, Dale C., 207 Thomson, David, 43 Thomson, James, 27–8, 267 Thornton, A.P., 60 Tilley, S.L., 100 time, 3, 7–8, 164–5, 284; ‘long time,’ 6,

152–69, 234, 240, 253, 255, 257; past, totality of, 13; present, 3, 4, 6, 8, 239, 245, 253–4; present as endpoint of, 13, 152–8, 168, 172, 178, 217, 239, 254 Toffler, Alvin, 151 Toronto, 26, 66, 68–70, 72, 89, 133–4, 159, 172–3 Toynbee, Arnold, 8 Trainor, Luke, 60–1 Trent incident (1861), 52–3 Trevelyan, G.M., 49, 70, 101, 190–1 Trimble, David, 180 Trollope, Anthony, 117, 142, 272 Trudeau, Pierre, 24, 54, 173, 185, 253 Truman, Harry S., 80 Tupper, Charles, 21, 104 Turkey, 41, 117, 199–200 Tweedsmuir, Lord, 172 Ukraine, 231 United States: aboriginal people, 66, 176, 224, 290; and Britain, 155, 170; and Canada, 14, 23, 46–7, 52–3, 64, 69, 71, 80, 96–7, 132, 143, 146, 156–7, 204–5, 240, 276; Civil War, 46–9, 52–3, 59, 111–12, 122, 125–6, 175, 224, 276; colonial period, 158, 224; Constitution, 16, 125, 128–9, 132, 224–5; elections (1860), 59; (1944), 238; (1960), 16–17; Franco-Americans, 145; immigration, 202; and Japan, 159, 192; and the Middle East, 255–60; Pacific War 1941–5, 226, 230, 238–9; presidents, 43, 80; Reconstruction, 208; and Samoa, 79; Second World War, 41, 190, 192, 196, 226, 229–30, 235–6, 238, 291; slavery in, 59–60, 83, 112, 124–5, 175; Vietnam War, 124, 156; War

of Independence, 64–5, 72, 114, 213; Washington Treaty (1871), 64; Watergate, 14, 30 Verwoerd, Hendrik, 141–2 Victoria, Queen, 26, 171, 209 Vikings, 54–6, 71, 162–3 Waite, Peter B., 21, 48 Wakefield, E.G., 276 Wales, 193, 203, 215, 219, 276 Walpole, Sir Robert, 137, 159 Warham, William, 211 Waterloo, battle of, 9, 41, 191, 203 Watkin, Edward, 21, 265 welfare state, 158, 161–2, 185 Wells, Clyde, 92 West Indies, 253 White, Hayden, 48, 151, 253, 261, 263

White, Theodore H., 16–17 William the Conqueror, 236 William II, 289 William III, 197, 211, 284, 287 Williams, Sir Fenwick, 89, 92 Willison, J.S., 20–1 Wilmot, Chester, 58 Wilson, Harold, 30, 163, 284 Winks, Robin, 249 Wise, B.R., 94 women, 27, 29, 37, 55, 89–90, 161, 169, 179–83, 193, 245–6, 252, 275 Wormald, Jenny, 189, 229, 241, 293 Wrigley, E.A., 220 Yeatman, R.J., 155 Young, Brian, 26 Yugoslavia, 240


THE JOANNE GOODMAN LECTURES
1976 C.P. Stacey, Mackenzie King and the Atlantic Triangle (Toronto: Macmillan of Canada 1976)
1977 Robin W. Winks, The Relevance of Canadian History: U.S. and Imperial Perspectives (Toronto: Macmillan 1979)
1978 Robert Rhodes James, ‘Britain in Transition’
1979 Charles Ritchie, ‘Diplomacy: The Changing Scene’
1980 Kenneth A. Lockridge, Settlement and Unsettlement in Early America: The Crisis of Political Legitimacy before the Revolution (New York: Cambridge University Press 1981)
1981 Geoffrey Best, Honour among Men and Nations: Transformations of an Idea (Toronto: University of Toronto Press 1982)
1982 Carl Berger, Science, God, and Nature in Victorian Canada (Toronto: University of Toronto Press 1983)
1983 Alistair Horne, The French Army and Politics, 1870–1970 (London: Macmillan 1984)
1984 William Freehling, ‘Crisis United States Style: A Comparison of the American Revolutionary and Civil Wars’

1985 Desmond Morton, Winning the Second Battle: Canadian Veterans and the Return to Civilian Life, 1915–1930 (published with Glenn Wright as joint author, Toronto: University of Toronto Press 1987)
1986 J.R. Lander, The Limitations of the English Monarchy in the Later Middle Ages (Toronto: University of Toronto Press 1989)
1987 Elizabeth Fox-Genovese, ‘The Female Self in the Age of Bourgeois Individualism’
1988 J.L. Granatstein, How Britain’s Weakness Forced Canada into the Arms of the United States (Toronto: University of Toronto Press 1989)
1989 Rosalind Mitchison, Coping with Destitution: Poverty and Relief in Western Europe (Toronto: University of Toronto Press 1991)
1990 Jill Ker Conway, ‘The Woman Citizen: Transatlantic Variations on a Nineteenth-Century Feminist Theme’
1991 P.B. Waite, The Loner: Three Sketches of the Personal Life and Ideas of R.B. Bennett, 1870–1947 (Toronto: University of Toronto Press 1992)
1992 Christopher Andrew, ‘The Secret Cold War: Intelligence Communities and the East-West Conflict’
1993 Daniel Kevles, ‘Nature and Civilization: Environmentalism in the Frame of Time’

1994 Flora MacDonald, ‘An Insider’s Look at Canadian Foreign Policy Initiatives since 1957’
1995 Rodney Davenport, Birth of the ‘New’ South Africa (Toronto: University of Toronto Press 1997)
1996 Ged Martin, Past Futures: The Impossible Necessity of History (Toronto: University of Toronto Press 2004)
1997 Donald Akenson, If The Irish Ran the World: Montserrat, from Slavery Onwards (Montreal: McGill-Queen’s University Press 1997)
1998 Terry Copp, Fields of Fire: The Canadians in Normandy (Toronto: University of Toronto Press 2003)
1999 T.C. Smout, ‘The Scots at Home and Abroad 1600–1750’
2000 Jack P. Greene, ‘Speaking of Empire: Celebration and Disquiet in Metropolitan Analyses of the Eighteenth-Century British Empire’
2001 Jane E. Lewis, Should We Worry about Family Change? (Toronto: University of Toronto Press 2003)
2002 Jacalyn Duffin, ‘Lovers and Livers: Disease Concepts in History’
