THE LIFE OF LEARNING
THE CHARLES HOMER HASKINS LECTURES OF THE AMERICAN COUNCIL OF LEARNED SOCIETIES
edited by
Douglas Greenberg
Stanley N. Katz
with the assistance of
Candace Frede
ACLS
New York  Oxford
Oxford University Press
1994
Oxford University Press
Oxford  New York  Toronto  Delhi  Bombay  Calcutta  Madras  Karachi  Kuala Lumpur  Singapore  Hong Kong  Tokyo  Nairobi  Dar es Salaam  Cape Town  Melbourne  Auckland  Madrid
and associated companies in Berlin  Ibadan
Copyright © 1994 by Oxford University Press, Inc. Published by Oxford University Press, Inc. 198 Madison Avenue, New York, New York 10016-4314 Oxford is a registered trademark of Oxford University Press All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of Oxford University Press. Library of Congress Cataloging-in-Publication Data The Life of learning : Haskins lectures sponsored by the American Council of Learned Societies / edited by Douglas Greenberg, Stanley N. Katz.
p. cm. ISBN 0-19-508339-3
I. Greenberg, Douglas. II. Katz, Stanley Nider. III. American Council of Learned Societies.
AC5.L718 1994 001.3—dc20 93-37336
2 4 6 8 9 7 5 3
Printed in the United States of America on acid-free paper
This book is published in honor of
Frederick Burkhardt and dedicated to the memory of
R. M. Lumiansky and John William Ward
PREFACE
The American Council of Learned Societies (ACLS) was founded in 1919 to support humanistic learning in the United States and to represent American scholarship abroad. Originally composed of twelve learned societies, ACLS now includes fifty-three member societies, a long list of college and university associate members, and a group of seven affiliates devoted to other areas of humanistic endeavor. Over the course of a distinguished history, the Council has supported the scholarship of thousands of researchers through its fellowship programs, thereby assisting in the publication of innumerable books and monographs, and participated effectively in virtually every aspect of teaching and learning in the humanities.
When John William Ward became President of the ACLS in 1982, he concluded that the ACLS tradition of involvement in scholarship and teaching of the highest caliber should be commemorated through an annual lecture delivered by a distinguished humanist on the "life of learning." He further proposed that the lecture be named for Charles Homer Haskins, a scholar and teacher of great accomplishment, who was instrumental in the founding of ACLS and served as the first chairman of its board of directors. Ward's conception of the Haskins Lecture was that it should not be a work of scholarship in the usual sense, but should represent an opportunity for self-reflection by a senior humanist still active in his or her field. Such a scholar would be in a position to offer a meditation of sorts on the body of work he or she had thus far produced and the forces, personal as well as intellectual, that had shaped it. Thus, in his invitation to the first Haskins Lecturer, Maynard Mack, Ward wrote: Our intention is to ask each year's honoree to reflect on a lifetime of work as a scholar, on the motives, the chance determinations, the satisfactions (and the dissatisfactions) of the life of learning, to explore through one's own life the larger, institutional life of scholarship. We do not wish the speaker to present the products of one's own scholarly research, but rather to share with other scholars the personal process of a particular lifetime of learning.
Bill Ward's formulation of his idea was so graceful that these words have appeared in every letter of invitation sent to Haskins Lecturers since 1983. The honor the Haskins Lectureship confers, as well as Ward's elegant description
of it, have proved sufficiently attractive to ensure that, to date, only one person has declined the invitation to deliver the lecture. The difficult problem of choosing the annual Haskins Lecturer was given to the Executive Committee of the Delegates of the ACLS. The Delegates are
the formal governing body of the Council, each representing one of the member societies. Annually, they choose from among their number an Executive Committee that serves as a committee on admissions for new members of the Council, advises the staff on the program of the Council's Annual Meet-
ing, and selects the Haskins Lecturers. From year to year, the committee collects suggestions from members of the learned societies and each year, in a discussion that resembles an attempt to determine the relative merits of many great works of art, selects one scholar to be invited to deliver the next year’s
lecture. Indeed, a list of those scholars who have been considered for the lectureship, but have not yet been chosen, reads like a "Who's Who" of a very rarefied kind. In any case, once the decision is made, the President of ACLS
dutifully dispatches to the honoree virtually the same letter that Bill Ward drafted in 1983. While the responses to the invitation have been invariably affirmative, their pungency may suggest the high seriousness with which the Haskins Lecturers
have approached their task. Maynard Mack, who inaugurated the series, replied that although he was pleased to accept, “I now know what Kierkegaard meant by fear and trembling.” Several weeks later, in the process of handling some logistical details with Ward, he wrote: “It gets scarier and scarier.” Mack’s superb lecture set the precedent for those who followed, and many of them have asked for copies of previous lectures as a guide. When Paul Oskar Kristeller accepted the invitation in 1988, he requested copies of all the previous lectures because, he said, “they may help me to understand and follow the format and content expected in these lectures.” With characteristic propriety, he also wrote that he felt “greatly honored by this invitation and [I] gladly accept it. I hope that I shall be able to do justice to the task, in spite of my advancing age.” When Milton Babbitt was invited to deliver the lecture, he also commented
on the honor it represented, but he warned that “I charge extra to play cocktail piano at the reception.” Milton Anastos was effusive in his acceptance, writing that “It carries me to the highest point in my career. . . .” Carl Schorske, who had previously served as a member of the Executive Commit-
tee of the Delegates and so had reason to know what sort of competition he had bested, was blunt: "What a surprise your kind letter contained. Having thought about the assignment for the past forty-eight hours, I realize all too well some of its difficulties—much more vividly than when I was on the
Council’s Executive Committee weighing candidates. . . . In any case, I shall do my best to do justice to so signal an honor from an organization that lies so close to my heart.”
All the responses reflected a modesty that the accomplishments of the invitees hardly seemed to require; some, however, confessed outright astonishment. Annemarie Schimmel wrote: “I was highly surprised when I read [the letter of invitation] and could not believe that you had chosen me to give
the distinguished ACLS lecture... . I am very very grateful to you although I feel somewhat hesitant to speak in the same series in which a number of great scholars have spoken until now. . . .” No Haskins lecturer put it better than Donald Meinig who, upon receiving the invitation, wrote: “Your astonishing letter arrived yesterday. I took it home to share it with my wife and to leave it on the table to see if it were there—and real—the next morning.” The lectures themselves provide ample evidence of both the wisdom of the committee’s choices and the gravity with which the lecturers have approached the task. Occasionally, ACLS had the opportunity to communicate with the lecturers after the lecture as well as before. Carl Schorske, on reading Donald Meinig’s lecture, wrote to comment on its graceful prose and to observe that it had given him a new appreciation for Meinig’s discipline of geography.
Meinig himself confessed that “in the euphoria of having actually gotten through the Haskins Lecture” he thought he had failed to thank ACLS properly for the invitation. But John Hope Franklin described the experience in a way that perhaps all the Haskins Lecturers would have understood when he observed that he hoped “the Delegates, Board of Directors, and the audience . . . enjoyed the proceedings half as much as I did. Although I felt just a bit as if I were undressing in public, the talk gave me the opportunity to
reflect on some aspects of my life. . . . It was . . . a memorable experience."
The experience of hearing the lectures each year and then ushering them into print has also been memorable. As the collective wisdom and understanding that the lectures represent has accumulated over the years, Bill Ward's prediction that they would eventually make a splendid little book has proved only too true. Thus, the idea of publishing this volume originated in Ward's first
plans for the lecture series. Those plans have been reinforced each year, however, as each lecturer has brought a new perspective to a consideration of the
meaning of a life of learning. Those of us fortunate enough to hear these lectures when they are first presented recognize, moreover, that they provide not only perspectives on individual lives and careers, but also on the development of scholarship in the humanities in the broadest sense. Indeed, they are important primary sources for the writing of American intellectual history in the postwar period. And this has served only to strengthen our conviction that the entire series to date deserves publication.
Although the temptation to a fuller analysis and commentary on these collected lectures is great, we have decided to foreswear an extended interpretative essay in the knowledge that one of the great pleasures in reading these texts is the surprises, great and small, that they contain. To introduce them too elaborately would be to reveal some of those surprises and thereby to reduce the great delight that awaits readers of this book. The Haskins Lectures appear here precisely as they were delivered, with the exception of the deletion of passing remarks specific to the occasion. The lecturers are identified, where appropriate, by the institution with which they were affiliated at the time of the lecture. The 1984 lecture, delivered by Mary Rosamond Haas, is not included in this volume. Professor Haas spoke from notes on the occasion itself and was not able subsequently to provide ACLS with a text for publication. We regret the absence of her lecture. The editors record their gratitude to each of the lecturers for their unfailing cooperation in bringing this book to publication so expeditiously. In addition, Barbara Henning has provided essential administrative support to the lecture series over the entire period of its history and under the direction of three ACLS presidents. The editors record their deepest gratitude to her for this support and her attention to the many other tasks essential to the operation of ACLS. Since 1987, Candace Frede, Managing Editor of ACLS Publications, has expertly supervised the production of our series of Occasional Papers and has guided the last seven Haskins Lectures through their original publication. We thank her for these and her continuing efforts on behalf of the ACLS publications program. The dedication of this book in honor of Frederick Burkhardt and to the memory of R. M. Lumiansky and John William Ward is an inadequate and
belated gesture of gratitude to three men whose devotion to ACLS as the Council's President and, more important, to the support of many lives of
learning was an anonymous gift to countless scholars for many years. Fred Burkhardt continues to inspire us with his devotion to the life of learning as the Editor of the Correspondence of Charles Darwin and as a person of uncom-
mon wisdom and modesty whose occasional visits to the ACLS offices are eagerly anticipated by the editors of this volume, who revere him for his past accomplishments and admire him for what he continues to accomplish. Bob Lumiansky served ACLS for many years as the Chair of its Board of Directors and then as its President. Ward served the Council informally for almost three
decades before becoming its President in 1982. Fittingly, when Ward’s untimely death left the Council without a president, Lumiansky returned for a year as President pro tempore. No one who knows Fred Burkhardt or who knew Bob Lumiansky and Bill Ward would doubt that each of them belongs in this volume with the Haskins Lecturers, a trio of distinctive exemplars of the life of learning.
New York, September 1993
D. G.
S. N. K.
THE CHARLES HOMER HASKINS LECTURES OF THE AMERICAN COUNCIL OF LEARNED SOCIETIES
Charles Homer Haskins (1870-1937), for whom the ACLS lecture series is named, was the first Chairman of the American Council of Learned Societies, 1920-26. He began his teaching career at the Johns Hopkins University, where he received the B.A. degree in 1887, and the Ph.D. in 1890. He later taught at the University of Wisconsin and at Harvard, where he was Henry Charles Lea Professor of Medieval History at the time of his retirement in 1931, and Dean of the Graduate School of Arts and Sciences from 1908 to 1924. He served as President of the American Historical Association, 1922, and was a founder and the second President of the Medieval Academy of America, 1926.
A great American teacher, Charles Homer Haskins also did much to establish the reputation of American scholarship abroad. His distinction was recognized in honorary degrees from Strasbourg, Padua, Manchester, Paris, Louvain, Caen, Harvard, Wisconsin, and Allegheny College, where in 1883 he had begun his higher education at the age of thirteen. In 1983, to recognize Haskins' signal contributions to the world of learning in the United States, the ACLS inaugurated a series of lectures entitled "The Life of Learning" in his honor. Designed to pay tribute to a life of scholarly achievement,
the Haskins Lecture is delivered at the Annual Meeting of the Council by an eminent humanist. The lecturer is asked to reflect and to reminisce upon a lifetime of work as a scholar, on the motives, the chance determinations, the satisfactions and the dissatisfactions of the life of learning.
CONTENTS
1983 / Maynard Mack
1985 / Lawrence Stone
1986 / Milton V. Anastos
1987 / Carl E. Schorske
1988 / John Hope Franklin
1989 / Judith N. Shklar
1990 / Paul Oskar Kristeller
1991 / Milton Babbitt
1992 / D. W. Meinig
1993 / Annemarie Schimmel
THE LIFE OF LEARNING
1983
MAYNARD MACK
Sterling Professor of English
Yale University
I

I am reminded by Professor [Georges] May's generous introduction of a story about Winston Churchill. After World War II and his stint as prime minister,
he was invited back to his old school, Harrow, to give the commencement address and decided he ought probably to oblige. So he went, weathered an introduction almost as laudatory as the one you've just listened to (except in his case deserved) then got to his feet and said to the graduating class, “Nevah give up!” and sat down.
I think you will agree that this is the most memorable commencement address you have ever heard as well as perhaps the wisest possible comment
on the life that all of us here are engaged in fostering, and that I, alas, on grounds that will be no more apparent to you than they are to me, have been singled out (“fingered” is, I believe, the underworld term) to address: the life
of learning. . . . Although I stand here before my betters, I do not stand here before very many of my elders. I have already drawn down from that mysterious fund with which we all begin three and a half years beyond my Biblical allowance, with the result that on any reasonably quiet afternoon I can hear my brain cells dying so fast they sound like popcorn. And that, I came to realize, is precisely what ACLS had in mind: they wanted to exhibit me, the way the Egyptians used to exhibit a skeleton at the beginning of their feasts. "Nothing like a mouldy old professor," I could hear the Executive Board whispering, "to energize an audience of other professors into taking thought—before they get to be like him." So do take thought, ladies and
gentlemen, golden lads and girls; and as an old gravestone in Exeter churchyard says, "The faults you saw in me, Pray strive to shun; And look at home: There's something to be done."
My instructions for this talk urged me to be somewhat personal, even to reminisce. And to tell you the unvarnished truth, I did sit down at first and produce a simply elegant piece of autobiography for this occasion. Very moving, I thought: parts of it would have brought tears to your eyes. But my wife refused to let me give it. She went over it very thoroughly, as she does everything I do. It took her a while, because she paused, dutifully, at all the places where I had written PAUSE FOR APPLAUSE. But when she was
finished, she looked up and said, "It won't go down." "What won't go down?" I said, having a certain talent for repartee. "That part about your
reading Charles Homer Haskins at 17," she said. "Nobody'll believe that. They'll think you laid it on just for this occasion." "But I did read Haskins at 17," I said. "I remember that book of his on the Renaissance of the twelfth century had just come out, and there was a copy of it lying around the house, and I read it—it was my last year in high school. And furthermore, two years later, I even got hold of his Rise of Universities and read that. If it hadn't been
for those two books, I might not be in the profession I'm in." She was not convinced. "Just the same, they won't believe it," she said; "and, besides, I don't see the point of that long digression you've got here on deconstructive criticism and how much it has liberated you and how you owe
it all to Swift. You know perfectly well it's nothing to do with Swift. How many times have you told me it all goes back to Saussure and Lévi-Strauss!" "Oh but that's the theory," I said. "Swift was the first practitioner. . . . You remember the Three Brothers in his Tale of a Tub and how they handle their father's will? Well—they deconstruct it, and out of that very same text came the Roman Catholic, the Anglican, and the Evangelical churches. If that isn't liberating, I don't know what is." "OK," she said, with a withering glance, "but a lot of people are going to be shocked to find you've gone over to that side! Still, I suppose there's no fool like an old fool. Do as you like; but don't forget that what you've already got here is at least an hour and ten minutes too long." There, I had to admit, was a point. Maybe I should try for something a little
more astringent, syncopated, compact. Something in couplets maybe, like those of Mr. Pope, of whom I am presently trying to write a biography. A sort of modern-day Dunciad perhaps. Alas, it was a dream of glory soon shattered. For almost immediately there leapt to my mind an experience I had had in the far past. I was a sophomore at Yale then, taking a course called English 25, and
our instructor assigned us the task of writing a witty couplet in the manner of Mr. Pope, whose poems we were studying. "Just one couplet," he said, for he had given us a great deal of reading to do. "One couplet, and bring it to class next time." So I wrote my couplet and took it to class and showed it to him. He looked at it a very long time—sort of the way I imagine Balboa must have looked at the Pacific. "Well," I finally asked, getting nervous: "How is it?" He didn't answer right away, and when he did, his voice was all choked up. "It's great," he said; "it's absolutely great—if only you could get rid of those first two lines." At this point, I really began to panic. The 14th of April was closing in, and I had nothing in hand. "Now there's only one thing left you can do," I said to myself: "Look in thy heart and write!" I had always been a notable phrasemaker. "Yes," I said, "look in the place 'where all the ladders start, In the foul
rag-and-bone shop of the heart'"—coining another phrase. So in the wee small hours of yesterday morning, I had a look around down there, in the rag-and-bone shop, I mean. Very depressing! All I could see was a lifetime's accumulation of clichés, stacked up like garment bags in an attic. I stared at them glumly. Of the three bags nearest me, one, I noticed, was exceptionally small and had a tag on it reading: HUMANITIES: PROSPECTS OF. The next
bag was somewhat larger and its tag read: TEACHING: WHAT EVER HAPPENED TO IT? The third bag seemed to be made of some sort of latex. It could obviously expand or shrink. On its tag all I could find was a question mark followed by the words: THE ACADEMY: VERDICT NOT IN. “Well, it'll all have to come from those three,” I said to myself. “I haven’t got time for
any more. And even if they are clichés, maybe my audience will remember what an eminent living classicist once said: that it is in the very nature of clichés to be both profoundly true and perpetually forgotten, to the peril of everyone."
II

So here is a sampling of the contents of those bags, just as they tumbled out.
And, first, on the prospects of the humanities. Not reassuring, the bag said. . . . Still, it is incumbent on at least the humanists among us to take the long view. Have the prospects of the humanities ever been entirely reassuring? I suspect not. We all remember, I dare say, the schoolboy who was asked to write a theme on Socrates, whom many humanists of the past, at least, used to claim as the founder of their sect. (This was in the dear dead days when
teachers still asked for themes and read them.) "Socrates was a Greek," the boy wrote. "He went around giving people advice. They poisoned him." From the Middle Ages comes a not much more sanguine report. That twelfth-century John of Salisbury, on whom Charles Haskins has so many fascinating pages in the two books I have mentioned, found himself confronted in his own time by attitudes very like those voiced so vociferously during the 1960s and 1970s in this country, and now not so much voiced as practiced in the nationwide stampede to vocationalism. "What is the old fool after?" John imagines himself being asked by the vocationalists of his day. "Why does he quote the sayings and doings of the ancients to us? We draw knowledge from ourselves. We, the young, do not recognize the ancients." It is positively uncanny how history repeats itself. Here is Jefferson writing in his older years to John Adams, former presidents both, in one of the great correspondences of all time: Our post-revolutionary youth are born under happier stars than you and I were. They acquire all learning in their mother's womb, and bring it into the world ready-made. The information of books is no longer necessary; and all knowledge which is not innate is in contempt, or neglect at least. . . . When sobered by experience, I hope our successors will turn their attention to the advantages of education.
Nor is any of this altogether different from that battered sign James Reston
once reported he found on the study door of a tired old professor at Coe College, Iowa: “We the willing, led by the unknown, are doing the impossible for the ungrateful. We have done so much for so long with so little, we are
now qualified to do anything with nothing.” Obviously, then, we are not the first followers of the life of learning to feel unloved, or to have to realize that we live in a time and place that doesn’t quite know what to make of us. Nor can I think that we will be the last. There is a fundamental ambiguity about the work of the humanities and social sciences
that can easily persuade any society to regard them as a threat. Much of the time we appear to be harmless drudges, tunneling about in archives and libraries or among exotic cultures and landscapes on errands as alien to the interests of the corner pharmacist as the Beach Boys to James Watt. As such we are bound to be regarded by the doers and shakers of the nation with impatient forbearance or at best with affectionate contempt. It is an attitude I dare say many of you have met with in testifying before Congressional committees. Trouble is, however, that every now and then there
comes fountaining up out of our tunnels and far voyages, not necessarily an Origin of Species or a Das Kapital or a Psychopathology of Everyday Life, but at least some significantly altered perspective, after which nothing ever looks quite the same. Blake, you will recall, registers one such shift unforgettably: What, it will be questioned, when the sun rises, do you not see a round disk of fire somewhat like a guinea? O no, no, I see an innumerable company of the Heavenly Host crying, "Holy, holy, holy is the Lord God Almighty": I question not my corporeal or vegetative eye any more than I would question a window concerning a sight. I look through it and not with it.
As Blake’s example suggests, disturbances of the status quo are by no means
confined to the social sciences. Changes wrought by achievements in the humanities and arts may be more leisurely and indirect, but few, I think, would care to argue that the great geniuses in philosophy, history, literature, and in artistic endeavor generally have toiled in vain. They too clarify and challenge, inform and ask questions. And this double function of the disciplines we severally represent here tonight is surely our badge of honor, even if sometimes, when the chips are down, we have also to wear it as our badge—not, I hope, our red badge—of courage.
III

When I shook out the second bag entitled TEACHING: WHAT EVER HAPPENED TO IT? I had to be considerably more selective. It was full of little
notes to myself; for the life of learning, in my case, has been largely synonymous with teaching. My father was a teacher, a teacher in Oberlin, where I grew up, and even a very great teacher, I was eventually forced to conclude, from the dozens upon dozens of former students who bothered to write notes of appreciation after his death. (Most of the rewards of the life of learning, I have noticed, come after death if they come at all.) I could not hope to match
his magic, but I knew I had not taught for 42 years without learning something. What had I learned? One thing I remember learning very early, and sure enough it was recorded on the first three-by-five that fell out on my lap. "Never forget," it began—and instantly I realized I was simply confirming from my own experience what Leo Strauss used to tell every aspiring young teacher:
Never forget that in your classroom there will always be at least one student altogether your superior both in mind and in heart. Never forget, either, that there is, or should be, in that classroom a second teacher far more important than you are: a great text. Try not to get in the way of that traffic. Remember Milton's admonition in Areopagitica: A good book is the precious lifeblood of a master spirit, embalmed and treasured up on purpose to a life beyond life. Be careful not to say anything so egregiously silly as that a great artist—or a great historian or a great philosopher or any other creator of a great text—endured all that toil, and often all that suffering, to make a work that refers to nothing but itself, that is about itself and not about the follies, grandeurs, and miseries of the human lot. Be even more careful not to show off—either your learning, if you have any, or your latest critical panacea. Remember that the more luminous the work, the deeper the darkness an intervening opaque body casts. Try not to be that body. If you actually believe, cross your heart, that Proust's Remembrance of
Things Past is really “about” metaphor and metonymy, or that Poe’s “The Purloined Letter” celebrates a phallus, try to get off by yourself somewhere and lie down.
What else had I learned? Many things, of course, too obvious to be repeated were set down in those little three-by-five messages to myself, and
some rather saddening to repeat. One of the latter is the remarkable shrinkage, as it seems to me, if not sometimes the total erosion, of the awe, reverence, wonder, and, yes, love with which the best teachers of my youth approached the books they taught. Doubtless they were less sophisticated than many of their successors. Certainly they were less sophistic. They would have laughed out loud at the idea that it was some kind of status symbol to be called a "research" or—even more ineptly—a "distinguished" professor, if it meant being without students to profess to, or teaching only when and what the whim moved. They would have laughed yet louder at the self-delusion of
those who imagined they were "above" teaching undergraduates. "Be prepared for the coming of the Stranger," cries a voice in one of Eliot's choruses from The Rock. "Be prepared for him who knows how to ask questions." In the life of learning—institutionalized learning, at any rate—it is the undergraduate who fills the role of the Stranger. It is he or she who will ask you, What good is this stuff anyway? or, Why is the king wearing no clothes? and who are all those people pretending they don't notice? Graduate students never ask such questions. They have already settled down in the village. All they want to know is where is the nearest photocopier or how much is the sewage tax.
So I suppose it is fair to say that the teachers who taught my generation
were unsophisticated. Yet they did have a firm hold on something that nowa-
days appears to be in short supply. One of my earliest memories is of my father coming out of his study, where he had been reading Wordsworth in preparation for a class, with tears in his eyes—a startling experience for a child. And I remember, later on, at Yale, how C. B. Tinker would bring very rare books or manuscripts from his own library to class, and urge us to touch them as if we were in a holy place—as indeed we were: the “holy Republic of Letters,” as W. H. Auden would call it later on. And one thinks, too, of Keats
discovering Chapman's translation of the Odyssey and feeling like some watcher of the skies when a new planet swims into his ken; and of Pope at his friend Lord Bathurst's reading aloud in Greek the scene of Priam's meeting with Achilles, and being unable to go on; and of Flaubert writing to a friend: The most beautiful works . . . are serene in aspect, unfathomable. . . . They are motionless as cliffs, green and murmurous as forests, forlorn as the desert,
blue as the sky. . . . Through small apertures we glimpse abysses whose sombre depths turn us faint. And yet over the whole there hovers an extraordi-
nary tenderness . . . like the smile of the sun. It is calm, calm and strong. It is easy to make fun of the metaphors and manners of a former age. But I cannot help wondering if Flaubert’s “extraordinary tenderness” and Pope’s unsteadied voice and Keats’s conquistadors staring at each other with a wild surmise, and C. B. Tinker’s trembling fingers and my father’s tears do not all add up to an openness, and a kind of humility, of mind and heart from which we could learn. How rarely now, at least in my discipline, are we willing to let
the unfathomable come to us on its own terms. How busily we insist, like Rosencrantz and Guildenstern, on plucking out the heart of mysteries of which we cannot even finger the first stops. How often we seem to value the great works primarily as trapezes to show our agility on the upward swing to tenure. And how very often we reduce them to cadavers by our skills in anesthesia.
A gifted poet of our day, himself a professor of literature, attended not long since a session of papers on Swift, delivered at a typical annual meeting of a typical professional association. After it, he wrote a poem. The commentaries it reacts to may have been good, bad, or indifferent—no matter. For the poet, there had been something terribly missing in that room, which his poem
undertakes to restore. Later, he sent the poem to a friend named Wayne Burns, also an English professor who had been at the meeting, inscribed in a collection of Swift’s poems. It reads as follows:
I promised once if I got hold of
This book I'd send it on to you
These are the songs that Roethke told of,
The curious music loved by few.
I think of lanes in Laracor
Where Brinsley MacNamara wrote
His lovely elegy, before
The Yahoos got the Dean by rote.

Only, when Swift-men are all gone
Back to their chosen fields by train
And the drunk Chairman snores alone,
Swift is alive in secret, Wayne:
Singing for Stella's happiest day,
Charming a charming man, John Gay,
And greeting, now their bones are lost,
Pope's beautiful, electric ghost.

Here are some songs he lived in, kept
Secret from almost everyone
And laid away, while Stella slept,
Before he slept, and died, alone.
Gently, listen, the great shade passes,
Magnificent, who still can bear,
Beyond the range of horses' asses,
Nobilities, light, light and air.
IV

From the pathology of some—not all, but all too many—of our professional meetings, it is an easy step to the pathology of the academy as a whole. (I have reached Bag No. 3.) How long can we safely assume that Congresses, legislatures, foundations, and generous men and women generally will continue to support what we do? Into eternity, we like to think, or at least into its nuclear equivalent, whichever comes first. Yet surely the signs of trouble are multiplying, at least for the humanities and literature, and, so far as I am competent to judge, for some of the social sciences as well. When one reads thoughtfully in
the works by Darwin, Marx, and Freud cited earlier, or any of their other works, what one finds most impressive is not the competence they show in the studies with which we associate them, though that is of course impressive, but
the range of what they knew, the staggering breadth of the reading which they had made their own and without which, one comes to understand, they
could never have achieved the insights in their own areas that we honor them for. Today, it seems to me, we are still moving mostly in the opposite direction, despite here and there a reassuring revolt. We are narrowing, not enlarging our horizons. We are shucking, not assuming our responsibilities. And we communicate with fewer and fewer because it is easier to jabber in a jargon than to explain a complicated matter in the real language of men. How long can a democratic nation afford to support a narcissistic minority so transfixed by its own image?
A splendid report by John Gerber, published a few years back but as pertinent today as it was then, exposes this flank of our irresponsibility unforgettably. In the word “our” I include only his field and mine, English litera-
ture; but I have the strong impression that the shoe fits others too. Though some of you will have seen the essay, I hope you will forgive me if I summarize it briefly for the rest. This time an English professor encounters Socrates in a Greek pub and introduces himself. Socrates makes the predictable comment that now he’ll have to watch his English grammar. The professor makes the predictable reply that English grammar is not his concern—he’s a humanist, he says. “As a matter of fact,” he adds, “we like to think of our English departments as bastions of the humanities.” Upon that cue, Socrates begins.
He inquires about the great religious thinkers. “Well, no. We don’t teach them nowadays: the trustees might object.” What about the great philosophers? “Alas, a little too hard for our students to follow.” The rhetoricians then: Cicero, Longinus, Boileau, Burke? “No, the speech departments have them.” The ancients? “No, Classics has them.” What about the moderns—the Europeans, the Asians, the Africans? “To tell you the honest truth, Socrates,
the people interested in those things have gone off and established departments of Comp. Lit." "You must have stoutly resisted that move," says Socrates innocently. "Not at all," says the professor, "we encouraged it. They wanted their students to read texts in the original!" "But surely," says Socrates, "you're interested in language: what do you do about linguistics?" "Not very much, I'm afraid. The study of language has got so scientific we don't understand it any more." After the conversation has run through several further areas of humanistic learning—all of which, it turns out, those bastions of the humanities exclude—Socrates interrupts. SOCRATES: Do I understand that you English teachers, after giving up the great works of religion, philosophy, and rhetoric, after limiting your literary interests substantially to writings in English, and after jettisoning linguistics, com-
parative literature, creative writing, speech, journalism, American Studies, the
theatre, and oral performance—that after eliminating all these valuable studies from your English curriculum, you are now in the process of eliminating training in writing as well? PROFESSOR: You put it too sweepingly, Socrates, but I guess what you say is more or less true. SOCRATES: My only conclusion 1s that you English teachers have developed the
most oversized death-wish that I have seen in the last twenty-four centuries!—-Will you join me in a drink? PROFESSOR: Why, yes, Socrates. That’s very nice of you. What shall we drink? SOCRATES: Hemlock.
Whatever qualifications we might wish to make in Gerber’s account, and
plainly we are entitled to make a few, his main thesis stands. During my lifetime, we have very considerably lowered our sights on what the life of learning in almost any field entails; and not only our sights but our standards.
During the recent unpleasantness, we permitted, or abetted, on most campuses the dissolution of almost every aspect of educational structure—not only distributional requirements, but philosophical, historical, mathematical, scientific, and literary requirements as well—all the while pretending to ourselves and the public that this was some sort of triumph of academic statesmanship rather than the penalty of having so lost our own bearings that we could not agree on what a liberal education should contain. One can only remark with sorrow the similar paralysis in the current Rockefeller Report of the Commussion on the Humanities. As the New York Times reviewer points out,
nowhere in that report 1s it made clear what, exactly, the humanities are, or why, explicitly, they are worth pursuing; and though we all know that in a scattering of institutions some portion of this lost consensus is in process of being regained, much remains to do.
We have likewise lost face, in my opinion, by fostering a vast deal of solemn nonsense about “published scholarship.” To study, to keep learning, to read widely and reflectively: these pursuits are essential to our profession, it goes without saying. I believe, too, that it is crucial to one’s intellectual vitality to try to write (when one has something to write worth writing), partly for the same reason that hard physical exercise is important for physical health, partly because there is truth in the old chestnut: how do I know what I mean till I see what I say? But the qualifying clause is essential: one is to write when one has something to write worth writing. As a contemporary poet once happily put it: “In poctry everything is permitted. With this one condition, of course: you have
MAYNARD MACK | 13 to improve on the blank page.” For much too long, I think, we have committed the lives of our young people to an idol of our own manufacture which
rewards those who can persuade themselves to go numbly or cynically through the humbug it requires; penalizes those, often our very best, who are
unable or unwilling to measure out their minds in three-year book-length sections; and encourages even the most seasoned scholars to blow up into mediocre treatises what might have made acceptable essays. Meantime, how many of us take the pains to share our learning, and the delight and wonder of the work to which we devote our lives, with a wider public? Surely that is an activity that would peculiarly become us as “humanists?” And yet when we practice it, how many of our colleagues can be counted on not to turn up their noses at us for “popularizing?” I omit the extenuations for lack of time. Obviously, we are not the sole begetters of this unedifying scene. Administrators, legislators, students, the state of the economy, and a certain easy-going dislike of hard choices embedded deep in the American grain have all collaborated with us. So has a rancid fag end romanticism currently infecting our lives, which identifies the self with tender shapeless breathings from some sort of psychic flower
within and so recoils from all tasks "not personally fulfilling"—a phrase which frequently turns out to mean any task that is hard. We are not notably to blame either, I think, for what in some ways may be the most ominous
single feature of our present environments. I mean the mistrust of people by people that turns almost every aspect of decision-making—institutional, professional, even pedagogical—into an adversary situation. An educational
world that for all its flaws was once intensely humanistic because it was intensely personal and interpersonal on the order of the family is in the process of being depersonalized and routinized, on the model of the corporation, into contract hours, grievance committees, employer regulations, union work rules, and systematic resort to litigation by teachers, students, and administrations alike. The old situation may have been patriarchal or matriarchal, and sometimes, as in families, injustice was done. The new sit-
uation is an institutionalized cold war, in which mutual trust, without which no educational endeavor can thrive long, is already becoming on some campuses the first casualty. Though I would be the first to admit that there
is no idyllic solution for the problem of governance in colleges and universities, as there is not for the governance of peoples, it cannot be beyond human ingenuity, one thinks, to moderate the confusions of authority into which, in the academy, we have allowed ourselves to drift since World War II.
V
It is the habit of old men, as Horace says, to be praisers of time past: laudatores
temporis acti. You will forgive that, I hope. It is similarly their habit to go about giving people advice. Still, I cannot help thinking that during the next ten years all of us face a rather pressing agenda if we are to be again what we have always claimed to be and if we are to be perceived by either the public or the private sectors as worthy of the privileges we receive. Let me, in conclu-
sion, put the points that I would place at the top of that agenda, if it were mine to make, in the form of three questions: 1. Can we not discover ways to communicate with a far larger public than most of us do now? I was gratified to discover in a current newsletter a plea from Professor Edmund Morgan, whose credentials as a scholar are impeccable, for his colleagues in history to give over addressing minute groups of their peers, while meantime there exists a huge American public that hungers
and thirsts after, and buys, hundreds of books by the so-called “popular” historians, who may often be less well-informed but do know how to combine instruction with pleasure. “We have all but surrendered,” Professor Morgan writes, “the custody of the past to outsiders” (a term I secretly wish he
had forborne to use). "It is time now to take it back, to stop sneering at popular historians and become, like them, professional writers." This is, in my view, so important an objective for all our disciplines at this time that if I were a graduate dean I would insist that every student be able to produce not only a
dissertation acceptable to his instructors but an essay on the same subject of sufficient popular interest to find its way into a journal of general circulation. And I would try to make an arrangement with the local radio or television station to carry one such half-hour, live, each week. This has been done by a few in our profession. It should be done by many. The Edmund Wilsons and Malcolm Cowleys of this world do not spring full-armed from the head of Zeus. It takes much harder work to write for the public than to write for one’s colleagues. And it has a cleansing effect on one’s vocabulary. Pompous gibberish rapidly disappears. 2. How can we best lend a hand to our fellow-workers in once the vineyard, now the ravaged landscape of the secondary school? They, as you don’t need me to tell you, are in desperate straits. Even in abstract intellectual terms,
the situation is bleak enough—I borrow now from the commission report mentioned earlier: "The rate of illiteracy in this age group has been estimated at over ten percent and as high as twenty." This is at the end of high school!
Consider the social loss in those figures and the future costs. Consider also that the abler teachers, those who can leave, are leaving, leaving now in the hundreds, and if you wonder why, visit their turf, smell the fear in the washrooms, watch grown men and women look helplessly on as overgrown boys, having checked in for roll-call, spit on the teacher's desk and check out for the day. It is almost as if Yeats had an American inner-city high school in mind when he wrote The Second Coming. For there, indeed, in these fallen days, the falcon does not hear the falconer, mere anarchy is loosed upon the world, the ceremony of innocence is drowned, the best lack all conviction, and the worst are full of passionate intensity.
About some of this there is nothing we can do except as private citizens. But as groups of faculty on our respective campuses, there is much we can do for local secondary-school morale and for the improvement at that crucial level of the subject-matters we teach, by creating opportunities for interchange. We need to provide occasions when those teachers can meet with us and we with them in a common intellectual setting for a common intellectual purpose. Though I don't wish to puff my own university, I do believe it is now supporting the best program I know of for accomplishing this end. It is called the Yale/New Haven Teachers Institute and has lately gained sufficient national attention to bring presidents, deans, and school superintendents to New Haven to learn about it. I won't bore you with the details. Suffice it to say that it annually attracts fifty to sixty-five teachers from the New Haven area, gives them the stimulus that comes from small-group and person-to-person discussion, and guides the development by each participant of a teaching "unit" of about a semester's length in his or her chosen field. The teachers choose the unit, choose with the help of an advisory group the professors with whom they wish to work, and the learning process, as I have been told by more than one participating faculty member, soon becomes a two-way street. An important dividend for the faculty participant has proved to be the necessity of thinking very hard about ways of treating a complex problem with clarity and force while avoiding vulgarization. An equally important dividend for the schoolteacher, if I may judge from those to whom I have talked, is a deep sense of renewal, partly from the intellectual excitement of wrestling with a professional problem in the company of other adults, partly from the reassurance that they and their work matter, that somebody cares about what they are trying to do. They are as proud of their stack-cards to Sterling Library as any young lawyer of the announcement that he has made the firm. 3. My third question, though brief, is the one that matters most, and, if answered with the self-knowledge and the generosity that we sometimes
show elsewhere, would take care of all the rest. Can we not be less self-important? posture less? swagger less? strut less? Can we not recognize that the three most dangerous giants-in-residence in every scholar/teacher's House
of Pride are the temptation to view a younger colleague's excellence as a threat, the temptation to feed on the adulation that attends a cult, and the temptation to prostitute one's independence to some Establishment, academic, governmental, corporate, or—and now I will use the word for the first and last time in my life—hermeneutical? Above all, can we not try harder
to remember that what happens when we see that unmistakable moment of excited revelation in a pair of eyes in the classroom where we teach—the moment that makes hours of drudgery fall away like the Ancient Mariner’s albatross—has only a very little to do with us. We are catalysts, at best. We can
sometimes help, with tact, to dissipate the clouds that obstruct a vision. But we do not make that vision. It comes from elsewhere—through the text, through the student's DNA perhaps, and perhaps through mysteries of which we still know little. On the morning when William Blake died, we know from a friend's testimony, "he composed songs to his Maker, so sweetly to the ear of his Catherine [that was his wife's name] that, when she stood to hear him, he, looking upon her most affectionately, said: 'My beloved! they are not mine. No! They are not mine.'" When we remember, as we often have reason to do, those wry amusing words on the professor's door at Coe College, must we not also be careful to remember these?
1985
LAWRENCE STONE
Dodge Professor of History
Princeton University
I am, as you may well imagine, extremely flattered to have been invited by the
ACLS to give the third Charles Homer Haskins Lecture. I feel particularly honored—or I think I do—to find that I am the first of the three speakers so far to be a working scholar and teacher, and not already emeritus. Why the Board saw fit to invite a relative juvenile like myself for this occasion I do not
know. Further consideration of this question tempers my feeling of pride at
being chosen at so youthful an age. Since the topic for the evening was described to me by President Ward as "reflections and reminiscences on a lifetime of work as a scholar," his presumption presumably was that for me that life had come to an end—that I had run out of gas—which of course may well be the case. To prepare for this lecture I asked to read what my two predecessors had said on this occasion, and was sent the first by Professor Maynard Mack. As I read it, my heart sank. There was no way that I could be as wise and as witty, as erudite and as amusing, all at the same time. It just wasn't possible. At a
later stage, after I had written a first draft, I was particularly upset by his report of the acerbic remarks of his wife when he showed her the first draft of his intellectual autobiography. "I suppose there is no fool like an old fool. Do as you like: but don't forget that what you've got here is at least an hour and ten minutes too long." I frantically counted the pages of my text, which came to 51. Reeling from this withering blast from Mrs. Mack, I very nearly threw in
the sponge and told President Ward that I couldn't and wouldn't do it. But then I gritted my teeth and set to work, and here is the result. This is the story of my intellectual odyssey over the last fifty-some years through an ocean full of storms, whirlpools and hidden rocks. We can, I
think, safely omit the first eight years of life, which are anyway only of interest
to dedicated Freudians convinced that the personality is fixed in concrete at this period—and sexual concrete at that. All that might be significant during these years is that this is when I first became a fanatical collector—a collector of anything: postage stamps, butterflies, fossils, cigarette cards. There is obviously a relationship between this early but unfocused collecting instinct, and the adult pursuit by the scholar in libraries and archives of facts and yet more facts to buttress his hypotheses and illustrate and give plausibility—it would be impertinent to say proof—to his arguments. At eight I went off to an English prep school and there began to serve what was to prove an eight-year term as a slave in the intellectual salt-mines of intensive training in the classics. As a form of instruction, what I got was whimsically known at the time as "a liberal education." In reality it was an intensely
narrow program, a perverted derivative from the educational curriculum worked out 400 years earlier by Vives and Erasmus. By the 1930s it consisted of a mechanical and dreary memorization of the vocabulary and grammar of
two long-dead languages. The pronunciation of one of them—Latin—was then taught in England in a manner altogether unintelligible either to the ancient Romans or to the twentieth-century natives of any other country. Thus, whereas my French father-in-law when in a concentration camp during
the Second World War found it possible to communicate in Latin with Hungarian aristocrats and Polish intellectuals, the Latin I learnt—the so-called "old pronunciation"—could have served no such practical purpose. What I did learn that was useful—though I learnt it the hard way, with blows as a punishment for error—were the rules of Latin grammar, which may perhaps have been helpful in improving my style in English later on. But I am even doubtful about that. Trained in this manner, it is easy to fall into the stately Ciceronian prose of rolling periods perfectly balanced one against the other. Much as I admire the prose of Gibbon and Lord Chesterfield, the style
just does not suit me, for by nature I am most at ease in a free-wheeling atmosphere. Let me make the point by analogy. Once, at the age of twelve, I was thought to be a promising cricketer—a batsman. So my school hired a kindly but unimaginative elderly ex-professional cricketer to teach me to hold a straight bat. He succeeded only too well. My bat was forever straight but I never played a successful game again, for he had managed to kill my natural instinct to swipe the ball in a thoroughly unorthodox but effective manner. There is perhaps something to be learnt about teaching in general from this
sad little story—a tragedy for me since I used to dream about playing for England one day.
What I acquired—and let me stress that during those eight years from 8 to 16 I was taught little else—was a facility in translating a London Times editorial from English prose into Latin prose, from Latin prose into Latin verse,
from Latin verse into Greek prose, and from Greek prose back into English prose. You will have guessed that I was not very good at it, partly from natural ineptitude, partly from lack of will. I could not for the life of me see the point of it at all, and I still don’t. Even the Latin books we read were dull. Virgil and Livy were, to my perhaps philistine sensibility, a bore. We were never intro-
duced to books that would both stimulate our interest and provide useful information about adult life, such as Tacitus on the court politics of tyranny, or Ovid on the art of heterosexual love. Like most people, I would imagine, I was eventually taught to love scholarship by a handful of gifted teachers. I will not dwell upon my experiences at an English public school—in my case Charterhouse—because this is a topic that novelists and autobiographers have already made something of a bore. If at
the time I had known anything about social anthropology or the political theory of totalitarianism, I could have understood a great deal more. It would have helped, for example, if I had realized that what I was experiencing was merely an extended male puberty rite, very similar to those of many other, more primitive, societies in the world: total segregation from the other sex; regular beatings to be endured in stoical silence; humiliation rituals; a complex formal hierarchy symbolized by elaborate dress codes; inadequate food; sexual initiation by older males; and the learning of a secret language, in this case Latin. I obtained my freedom from enslavement to the classics thanks to the direct intervention of a new headmaster, Sir Robert Birley, who single-handedly changed my life. He took me under his personal tuition, and in one-and-a-half
years of intensive coaching enabled me to obtain an open scholarship in history at Oxford. What made him so dazzlingly successful as a history teacher was his endless fund of enthusiasm for whatever topic happened to be upper-
most in his mind. Birley did not merely pull off the remarkable coup of training me in 18 months to get a history scholarship to Oxford. He also changed the course of my intellectual development a second time: immediately after the examination, he dispatched me to Paris for six months’ exposure to another European culture. There I first encountered (though not in the flesh) that remarkable phenomenon, the Paris mandarin intelligentsia, as well as the great Annales school of historians, then represented by Marc Bloch and Lucien Febvre. It was the beginning of a lifelong admiring but critical relation-
ship with French intellectual culture which has deeply influenced my life of learning.
Let me return for a moment to Sir Robert Birley. He was an eccentric figure—half loyal member of England’s ruling elite and of the Church of England, and half reforming rebel and idealistic visionary. As a young master
at Eton, for example, he openly expressed his sympathy with the strikers during the General Strike of 1926, a position for which some people never forgave him. He was headmaster first of Charterhouse, then of Eton, sandwiching in between a stint as Educational Adviser to the Deputy Military Governor of the British Zone of Germany after the war. Later he was Professor of Education at the University of Witwatersrand in Johannesburg. He was a conservative radical, whose nickname among the backwoods Etonian Tories was “Red Robert.” He was not only a great teacher, but a great moral reformer. Before the war he had fought Nazism, and spent hours trying to argue me out of my incipient pacifist tendencies. After the war he worked to bring a new generation of liberal Germans back into a federation of Europe. Later still, in the 1960s, he fought to bring education to the blacks of South Africa, personally conducting classes in Soweto; and finally he did his best to humanize and civilize those great barbarian institutions, the English public schools of Charterhouse and Eton.
If Sir Robert Birley provided the inspiration for my scholarly interests, and deeply affected my moral and political attitudes, the second great influence upon me was an Oxford medieval history tutor, John Prestwich by name. He was—indeed, still is—one of those all too common Oxford figures, with a towering local reputation but no international visibility for lack of publications. I studied the Third Crusade under him, as a special subject. At first, I would read my weekly essay, which he would then systematically demolish, leaving me with little but a pile of rubble. I finally decided that my only hope of self-defense was to overwhelm him with data. Since the prescribed texts were entirely taken from the writings of the Christian crusaders, I sought out little-known chronicles by Moslem Arabs, of which I found a fair number in French translation. Artfully, and with studied casualness, I inserted into my essays some recondite facts from these obscure and dubious sources, as a result of which I at least got Prestwich momentarily rattled. I never won the battle, my arguments were always effectively demolished, but even the minor victories improved my self-esteem. The experience taught me the importance of sheer factual information—erudition, if you like—in the cut-throat struggle for survival in the life of learning. I discovered that knowledge is power. It
was the experience of that term with John Prestwich which made me decide to be an historian, and an archive-based historian at that.
The third great influence upon my development as an historian was R. H. Tawney. Everyone knows about Tawney, the Christian socialist, at once the eminence grise and the conscience of the English labor movement in the first
half of the twentieth century, the eloquent preacher of equality, the stern denunciator of the evils of unbridled capitalism, the re-interpreter to the Anglo-Saxon world of Weber’s ideas about the relation of Protestantism to Capitalism, and the great historian of “Tawney’s Century,” the period 1540-1640 in England. He was a saintly, if not altogether practical, figure, the only person I have ever met who had a genuine dislike of money. He simply hated
the stuff, and tried, as far as humanly possible, to do without it. It was his impassioned book about the sufferings of the sixteenth century English peasantry from the enclosure of the land by ruthless capitalist landlords, and his
equally impassioned denunciations of the evil and corrupt machinations of early modern merchants, entrepreneurs and money-lenders, which drew me to the sixteenth century, and stimulated two of my first forays into print. I first met Tawney during the war, and I eagerly cultivated his company whenever I came back to London on leave from my ship. Although I was only an ignorant undergraduate from Oxford, and a sailor, he nevertheless always greeted me warmly. By then he had been bombed out of his house and was living in indescribable squalor in a leaky mews in Bloomsbury, surrounded by a chaos of books, papers, cats and leftover plates of food. Draft blueprints for
the Labor Party’s program for a more egalitarian post-war Britain were jumbled up with notes on early seventeenth century English social history and tattered yellowing fragments of jottings about the Chinese peasantry. I had many long talks with Tawney, bundled up in overcoats in these unappetising surroundings, and I listened carefully to what he had to say, both about the
state of the world and how it could be put right, and about seventeenth century England. Listening carefully, I should add, was not easy, since one had constantly to be on guard lest he set himself on fire. This often happened when the long stalks of wild herbs, which he stuffed loosely into his pipe, caught fire and fell out on to his jacket or trousers, which as a result were always full of black burn holes.
What I learnt from Tawney was that the documents for early modern history were preserved in sufficient quantity to make it possible to enter into the very minds of the actors. This single fact converted me from a medievalist into an early modernist. Second, I learnt that in this period there had taken
place in England nearly all the greatest transformations in the history of the West: the shifts from feudalism to capitalism, and from monolithic Catholicism to Christian pluralism, and later to secularism; the rise and fall of Puritanism; the aborted evolution of the all-powerful nation-state; the first radical revolution in Western history; the first large-scale establishment of a relatively
liberal polity with diffused power, religious toleration, and a bill of rights; and the creation of a society ruled by a landed elite unique in Europe for its entrepreneurship, paternalism, and near-monopoly of political power. Finally I learnt from Tawney, as I had from Birley, that history can be a moral as well as a scholarly enterprise, and that it ought not, and indeed cannot, be disas-
sociated from a vision of the contemporary world and how it should be ordered.
The fourth important teacher to influence my thinking was yet another eccentric, Sir Keith Hancock, whom I did not meet until immediately after the war. It was his scholarly career and conversation that first proved to me that there was something to be said for an interdisciplinary and trans-cultural approach to history. For in his person and his writings he demonstrated that it was possible, and indeed fruitful, to know about such apparently diverse matters as land tenure in Tuscany, the career in South Africa of General Smuts, the economic development of Australia, and the history of modern
warfare.
I was very lucky to have been brought at an early stage into contact with four such remarkable men. As a result of them, I survive today as something of a dinosaur, the last of the Whigs, and in many ways still a child of the Enlightenment. I emerged from their tutelage with an abiding faith in reason, in the possibility of limited material and moral progress, in paternalist responsible leadership, and in the rule of laws, not men. It is a fading, tattered faith these days, a survival from that older liberal world of the Victorian professional class from which both Tawney and Birley sprang, and the ethos of which the Australian Hancock had absorbed during his long stay at All Souls College, Oxford.
During World War II, I spent five years at sea with the Royal Navy. As anyone who has experienced it knows, war is 99.9% boredom and discomfort and 0.1% sheer terror. In my case the discomfort was substantially mitigated by occupation of a cabin—admittedly shared—and plentiful and regular sup-
plies of food and, above all, alcohol. If there has ever been a just war in history, then this was it, and I do not regret my five-year diversion from the life of learning. In fact the diversion was not complete, for I wrote my first historical article
while navigator of a destroyer patrolling the South Atlantic. I may not have
been a very good navigator of that destroyer—I confess I ran it aground twice—but at least I began my life of learning while on board. The subject of
the article was the shameful treatment by the government of the English sailors who had taken part in the Armada Campaign of 1588. The topic was obviously related to my immediate experience, but what is of more interest is whence I got my data. The answer is from that characteristic mid-Victorian
institution, the London Library, which right through World War II cheerfully and efficiently dispatched rare and valuable research books to the furthest ends of the earth, which often arrived some three to six months after they had been ordered. The contribution of that private library to the life of learning in Britain, especially during the war, can hardly be overestimated.
The end of the war found me attached to the American Seventh Fleet off Japan. Immediately after the armistice, I was flown home from the Pacific, since for some unknown bureaucratic reason top priority for demobilization
in Britain had been decreed for three classes of persons: coalminers, clergymen—and students. The flight was one of my most dangerous experiences during the war, since the pilot was a psychological and physical wreck, with trembling hands, as a result of flying fifty missions over Germany. But it got me back to Oxford in early November 1945, just in time to enroll for the year, and so to take the final examination and graduate as a B.A. in June 1946.
By paying an extra five pounds I also got an M.A. degree on the same occasion, so that I was a B.A. for only about ten minutes, just time to change my gown and hood. I submit that this may be something of a record. I must also be one of the few people alive today to have bought a degree from a major university for hard cash and no work at all. I did not proceed to embark on a doctoral dissertation, since in those days this was still something that a graduate of Oxford or Cambridge felt to be beneath his dignity—a peculiar academic rite de passage that foreigners went in for, like Germans, or French, or Americans. Instead, I settled down with a research grant, and began, all by myself, to write a book, quite unaware of any foolhardiness in so doing. It was, of course, a terrible mistake, for I badly needed the close discipline and advice that only a conscientious official supervisor can provide. As a result, I had to learn from my own mistakes—and I made plenty.
I chose to write a biography of a late sixteenth-century entrepreneur, a financier of governments, an espionage agent, a diplomat engaged in the recruitment of mercenary armies, a world monopolist of alum (an essential raw material for the dyeing of cloth), and a business tycoon with a finger in
many, usually unsavory, pies. This bizarre figure began life as a member of a
distinguished Genoese merchant family and ended it as a Cambridgeshire country squire with a wealthy Dutch wife and an English knighthood conferred by Queen Elizabeth. Urbane and unscrupulous rogue that he was, in the end I found that I got to like him, although the book (An Elizabethan: Sir Horatio Palavicino) certainly served my original purpose of illuminating the seamier side of early international finance capitalism.
My next topic was inspired by the seminal articles of R. H. Tawney about the rise of the gentry in the century before the English Civil War—a theory which, if stripped of its Marxist ideology about the rise of the bourgeoisie and some of its dubious statistical props, has in fact turned out to be largely true. My first preliminary foray into this area was a disaster. I published an article claiming that most of the late Elizabethan aristocracy were hovering on the verge of financial ruin. Unfortunately the data were badly handled by me. It
was my tutor, Hugh Trevor-Roper, who had first drawn my attention to them, but without pointing out the problems inherent in their interpretation. This mistake of mine provided him with the opportunity for an article of vituperative denunciation which connoisseurs of intellectual terrorism still cherish to this day. What I learnt from this episode—learnt the hard way—is that before plunging into a public archive, it is first essential to discover just why and how the records were kept, and what they signified to the clerks who made the entries.
Before describing how I reacted to this setback, I must pause to explain a peculiar intellectual diversion: in 1946 I also began work on a large textbook on medieval English sculpture in a classic art history series edited by Sir Nikolaus Pevsner. This implausible venture into the field of professional art history came about in the following archetypical English way. First, that passion for collecting everything and anything, to which I have already referred, had somehow driven me in my middle teens to assemble photographs of English Romanesque sculpture. Equipped with a car—which cost all of three pounds—and a Kodak Box Brownie camera—which cost five shillings but happened by some miracle to have a near-perfect lens—I roamed the countryside during the holidays between 1936 and 1939, taking photographs of Romanesque sculpture in English churches. In 1938 I made contact with Sir Thomas Kendrick of the British Museum. He was then engaged on a national survey of Anglo-Saxon sculpture, and in his generous way took me—then still a schoolboy and first-year undergraduate—on his photographic team for two summer expeditions in 1938 and 1939.
After the war, in early 1946, Kendrick had been invited by Nikolaus
Pevsner to write the volume on English Medieval Sculpture in his Pelican History of Art Series. He declined, perhaps because he already had hopes of becoming Director of the British Museum, as happened soon after. Asked who could do the job instead, Kendrick, who was not a cautious man, named me. I was at that moment a history undergraduate at Oxford; I had never taken a course in art history or written a line about it in my life; and I had only
just come back from five years at sea. Trained in the professional German school of art history, Pevsner was understandably horrified by Kendrick’s irresponsible suggestion. But he felt he had to give me a contract, for reasons which he explained to me very frankly when we met: “Tom Kendrick won’t do it,” he said, “and there appears to be no one else in the country in the least
interested in the subject. Kendrick says that I should give you a contract. I don’t trust you at all, for you have absolutely no credentials for the job, but I don’t see what else I can do. I would like to see a draft chapter as soon as possible.” On this rather menacing note our interview ended, and a few days later I happily signed the contract. Secretly, I was as uneasy as Pevsner himself about the capacity to pull it off of this ignorant, ill-educated, amateur ex-sailor and now undergraduate. This bizarre episode could only have happened in a society like England, which had remained as profoundly imbued with the cult of the amateur as it had been in the eighteenth century heyday of the virtuoso.
The episode was also only possible in a society which still operated on the eighteenth century patronage network system, in which a tiny entrenched elite distributed jobs and favors to their clients, friends, and proteges. Before resuming the narrative, something should be said about the intellectual atmosphere at Oxford in those far-off days just after the Second World War. In the mode of teaching and in the prescribed curriculum for the examination in the School of Modern (as opposed to Ancient) History at Oxford, nothing much had changed since its foundation at the end of the nineteenth century. It was a curriculum stifling both in its national insularity and in its limited late Victorian conception of what subjects were embraced within the canon of historical scholarship. It was perfectly possible and indeed normal to graduate with first class honors without having studied the history of any continent save Europe and indeed with only minimal knowledge of any country but England—not even Scotland or hapless Ireland. It was also not unusual to have studied little or no social, economic, demographic, cultural, artistic, intellectual, educational or familial history, and to be wholly innocent of any contact with quantitative methodology or the history of the working class. The social sciences were unknown, or if known were cordially despised.
On the other hand, under the guidance of gifted and dedicated tutors, the
undergraduate education offered by Oxford was unsurpassed in its capacity to teach swift and clear writing, to encourage careful analysis of the evidence, and to produce a mind open to varying interpretations of a single event or set
of events. I consider myself extremely lucky to have had that remarkable experience.
In Britain, the post-war period was a time of boundless optimism and confidence—a fact which is hard to remember, much less to comprehend, in these depressed and disillusioned post-imperial times, when England has sunk to the level of a third-rate power in almost all fields of endeavor except those of the pure intellect. To us young men who returned from the war in 1945, the whole world seemed to be our oyster, and all problems of scholarship—to say nothing of those of suffering humanity—were thought to be soluble. Some of this confidence in the future may have been stimulated by close cohabitation with our American Allies during the war. At all events, this was an optimism shared by nuclear physicists, Oxford philosophers, social historians, and Keynesian economists, as well as politicians. I well remember a dinner conversation with Peter Strawson, today one of Oxford’s most distinguished philosophers, during which he expressed his anxiety about what he would find to do in his late middle age, since it was clear from the way things were going that
by then there would be no major philosophical problems left to solve. In history, some of us had much the same hubristic confidence in a wholly new approach. We were dedicated converts to the Annales school of history based in Paris, and we were certain that the most intractable problems of history would soon fall to the assaults of quantitative social and economic investigation. The political narrative mode of our elders—“L’histoire historisante” as it was derogatively called—was beneath our contempt. In time, we believed, such hitherto unsolved problems as the causes of the English or the French
Revolutions, or the origins of capitalism and the rise of the bourgeoisie, would be solved by our new tools and new approaches. Bliss it was to be alive and a radical social historian in 1945. We waited breathlessly for each new issue
of Annales or the Economic History Review, every one of which seemed to contain an article which opened up great new vistas of historical exploration and interpretation. I stress this atmosphere of self-confidence and heady excitement, since nothing could be more different from the self-doubt, uncertainty, caution, and scepticism about the very existence of truth or about ways to get at it, which afflicts all branches of the humanities today in 1985.
Inspired by the mood of optimism of the late 1940s about the possibilities of the new social history, and stung by Trevor-Roper’s onslaught on my scholarly credentials, I decided to undertake a large-scale investigation into
the economic resources and management, social status and military and political power, life-style, values, education, and family structure, of the English aristocracy in the century before the outbreak of the English Revolution. My initial assumption had been that the English aristocracy in that period was the epitome of an incompetent, frivolous, and decadent ruling class about to be set aside by a rising bourgeoisie. Fifteen years of careful investigation, how-
ever, convinced me that this simplistic model failed to fit the facts. The Marxist interpretation of the role of the aristocracy in the English Revolution,
with which I had set out, had been shattered by close contact with the empirical evidence.
The solution to my dilemma came from my belated discovery of Max Weber, whose writings, as they slowly appeared in more or less intelligible English translations, have probably influenced me more than those of any other single scholar. Weber’s subtle distinction between class and status, and his intense preoccupation with the relationship of ideas and ideology to social and political reality have guided my thinking and inspired my research from the mid-1950s to the present day. But the influence of neither Marx nor Weber explains why I have chosen to spend most of my life of learning studying the acts, behavior and thoughts of a ruling elite, rather than of the masses. One justification for such concentra-
tion upon so tiny a minority is that this is the only group whose lives and thoughts and passions are recorded in sufficient detail to make possible investigation in full social and psychological depth. Only this handful were fully literate, in the sense that they wrote continually to each other and about each other, and their writings have been preserved. If one wishes to discover the quirks and quiddities of personality, the intimacies of love and hate and lust, the revelations of financial speculation or rascality, the backstairs intrigues of
power and status, one is inexorably forced to concentrate one’s attention upon the elite, since the evidence about individuals in the past much below this high social level only rarely exists. Although I have relied heavily upon quantification—most of my books and articles contain graphs and tables—I have always been primarily concerned with people, following the maxim of Marc Bloch: “Ma proie, c’est l’homme.” In this pursuit I have been inexorably drawn to the elite. The other justification for concentration on the elite is that it was from this group that for centuries were drawn the political rulers of the country, and the patrons and principal consumers of its high culture. An Englishman, far more than the resident in any other Western country, does not have to read Pareto to learn about the dominance of elites. From his earliest childhood he is made
acutely aware of the horizontal layering of the society in which he lives. This elaborate stratification is displayed even today at every moment by such external features as accent, vocabulary, clothes, table manners, and even physical size and shape. I have therefore spent the best part of my life following the
trails left in the records by that English landed elite which for so many centuries largely monopolized so much of the three great Weberian entities of wealth, status, and power.
As it happened, I could not have chosen a better moment than the late 1940s in which to plunge into the private archives of the English aristocracy, which for the first time had become accessible, thanks to the financial plight of their owners. For fifteen years I enjoyed the dizzy excitement of turning over and reading in archive rooms, cellars, and attics great masses of papers which no one had ever examined before. The most dramatic moment always came
on first sight of a private archive, which could range from the supremely orderly to the supremely chaotic. At one great house, the late duke had spent a lifetime sorting, cataloguing, and filing his huge collection of family papers, and in his last illness was said to have asked to be taken down to the archive room and laid on the work table in order to die amid his beloved papers. His son was a playboy, too busy chasing girls to bother answering the importunate letters of scholars. But by sheer luck, a telephone call was answered by his
aged nanny, who graciously agreed to let me into what turned out to be an amazing and amazingly well-ordered archive, filling several rooms. I believe I
was the first person to sit at that table since the removal of the late duke’s corpse. At another great seat I scribbled away in the depth of winter huddled up in
an overcoat and blankets at one end of a long freezing room, while at the other end two aged servants sat beside a small flickering coal fire, leisurely polishing the seventeenth century armor for the benefit of next summer’s tourists, and gossiping endlessly—and maliciously—about their master and mistress. When my fingers became too cold to hold the pen, I would join them round the fire for a few moments. It was a scene which could well have occurred in the seventeenth century. Another house had been gutted by a fire some thirty years earlier, but the
contents of the archive room had been saved and thrown pell-mell into a room above the old stables, now the garage. Squeezing past his huge Rolls-Royce, the owner led the way up the creaking stairs, turned the key in the rusty lock and pushed the door. Nothing happened. Further forceful pushing nudged it partly open, revealing a great sea of paper and parchment covering
the whole floor to the height of one to three feet. The only way to enter was to step on this pile, and, as I trod gingerly, seals of all ages from the thirteenth to the nineteenth century cracked and crunched under my feet. Rarely have I felt so guilty, but the guilt was later assuaged by being instrumental in getting the great archive deposited in the local record office for safe-keeping and cataloguing. Occasionally the owner of the papers would invite me to lunch. The experience was nearly always the same: a spectacularly elegant dining-room with millions of dollars’ worth of pictures on the wall; exquisite wine; execrable
food, so unappetising that it was often very hard to swallow; and erratic service provided by a bedraggled and sometimes rather drunken butler. Such were the pleasures and pains of the life of learning, as I wrote my book on The Crisis of the Aristocracy.
During the late 1950s the expansion of my interests, which first began with the discovery of Weber, was further stimulated by two events. The first was that in 1958 I joined the Editorial Board of Past and Present, which in my—admittedly prejudiced—opinion is one of the two best historical journals in the world (Annales: Économies, Sociétés, Civilisations being the other). At that time the Board was equally divided between Marxists (many of them long-term members of the Communist Party who had only recently resigned after the Russian invasion of Hungary), and liberals like myself. But although it is a very active and contentious board there has never been an occasion, so far as I can recall, in which the division of opinion has been on ideological lines of Marxists versus liberals. This is a small fact about English intellectual history, which is, I believe, worth recording, although I have no explanation to offer for it.
The second event that turned out to have a major influence on my life as a scholar was the shift from Oxford to Princeton in 1963. This move—the most sensible thing I ever did in my life apart from getting married—was made partly as the result of push—I was tired of the insurmountable disciplinary ring-fence erected at Oxford around the core of English political and constitutional history, and also of the crushing burden of many hours of monotonous tutorial teaching; and partly as a result of pull—the open-mindedness to new ideas and new disciplines and new areas of the world which I had observed at Princeton on a visit two years before to the Institute for Advanced Study. At Princeton, I discovered two things. The first was a world of historical scholarship, embracing not only all of Europe, but also America (of whose history at the time of my arrival I knew nothing) as well as the Near East, and East Asia.
One of the earliest results of this totally new world view was a joint article, written with my colleague and friend Marius Jansen, comparing education and the modernizing process in England and Japan. Another area of scholarship which for a few years in the 1960s greatly influenced my interpretation of historical development was the work then being done by American political theorists on the problems of “moderniza-
tion” and revolution. In retrospect, I think that my enthusiasm for their model-building was probably exaggerated, but at least they provided me with two valuable tools with which to break open the tough nut of The Causes of the English Revolution of the mid-seventeenth century, a book I published in 1972. The first was the somewhat arbitrary but useful division of causes of such an explosion into long-term, medium-term, and short-term. The second was the
concept of “relative deprivation” which allowed me to break free from the fallacious necessity of relating observed behavior to objective conditions of life. But in doing so, I fell into a small puddle of jargon, freely using words like “pre-conditions,” “precipitants,” “triggers,” “multiple dysfunction,” “J curve,” and so on. All this and relative deprivation theory annoyed my English critics, who enjoyed themselves making a mockery of my enthusiasm for these new-fangled transatlantic words and concepts from the social sciences.
If I had to write the book today, I would use jargon more discriminately. Another great discovery made at Princeton was the scope and range of computerized quantitative historical studies then in progress in the United States. In my enthusiasm for this brave new world, I first conceived and then obtained funds for a massive statistical investigation of social mobility in the higher reaches of English society from the sixteenth to the nineteenth centuries. The tasks of directing the researchers, encoding the data, negotiating with the computer programmer and making summary tables from vast stacks of green print-out were fortunately undertaken by my wife, who spent fifteen years working on this project. My own work was interrupted, for reasons I will explain in a moment, and the results were only published last year in our book An Open Elite? England 1540-1880.
Political theory and computerized quantification are far from being the only novelties I found on arrival at Princeton. Another influence on my intellectual evolution at that time was the writings of the sociologist R. K.
Merton, from whom I learnt, amongst other things, the importance of medium-range generalization. This search for the Aristotelian mean in terms of problems to be solved, is, in my view, the best safeguard against shipwreck on the Scylla of unverifiable global speculation, or the Charybdis of empirical
research so narrow in scope and positivistic in attitude that it is of little
concern to anyone except one or two fellow-specialists, as practiced by so many young scholars today. Although we were colleagues at Oxford, it was only after arrival at Prince-
ton that I first discovered the work of the great anthropologist Evans-Pritchard, and more recently still that I came under the influence of the newer
school of symbolic anthropologists whose most eminent and most elegant practitioner is my friend Clifford Geertz. Above all, the contribution of the
anthropologists has been to alert historians to the power of “thick description”—that is, how a close and well-informed look at seemingly trivial acts, events, symbols, gestures, patterns of speech or behavior can be made to
reveal whole systems of thought; and to draw our attention to problems of kinship, lineage, or community structures, whose significance would have eluded us without their guidance. Finally, interest in the history of the family and sexual relations inevitably drew me to psychology. Here I found Freud less than helpful, partly because his time-bound late nineteenth-century mid-European values cannot be projected back onto the past, and partly because of the fundamentally ahistorical cast of his thought which assumes that the human personality is more or less fixed for life in the first few months or years. The developmental models evolved by more recent ego-psychologists, such as Erik Erikson or Jerome Kagan, are much more useful to the working historian interested in the continuous interplay of nature and nurture, of innate drives and overriding cultural conditioning. Freud certainly admits to such cultural configurations in his Civilization and Its Discontents but only in a negative and pessimistic way.
Before summing up, I must explain why it was that I interrupted my quantitative project on elite mobility for some five years to write a large book on Family, Sex and Marriage in England 1500-1800. It is a work based almost entirely on non-quantitative printed literary materials mainly from the elite class, and it lays as much stress on emotional as on structural developments. It came about this way. I had long been tinkering with a lecture on the family, when in 1973 I suffered a mild heart attack and was hospitalized for six weeks
without telephone, visitors or other contacts with the outside world. I felt perfectly fit and, allowing eight hours a day for sleep, there stretched before
me the prospect of being able to read without interruption by anyone for sixteen hours a day for forty-two days. If my mathematics are correct—which some think they rarely are—this adds up to a total of 672 hours of reading. I therefore instructed my wife to remove from the University library shelves all English collections of family letters, autobiographies, advice books, journals,
etc. from the sixteenth, seventeenth, and eighteenth centuries, and bring them to my bedside, along with a substantial supply of paper. Thus armed, I read and read and read, and emerged six weeks later with almost enough material to write a book. Hence the diversion from the computerized project on social mobility, to which I returned five years later in 1977.
Throughout my time at Oxford and Princeton, I have never wavered in my (always qualified) admiration for the Annales school of historians in Paris. But it is a reflection of the change of time and mood that today, while retaining my deep admiration for the Annales group as the most talented, innovative, and influential historians in the world, I nonetheless have developed certain reservations about their basic principles and methods, which were expressed in my notorious article on “The Revival of Narrative,” published in 1979. I am unconvinced that their favorite methodological division between static “structure” and dynamic “conjuncture” is always the best approach. Even less do I
accept their three-tiered model of causal factors in history, rising from the economic and demographic base through the middle layer of the social structure to the derivative superstructure of ideology, religion, political beliefs, and mentalité. This wedding-cake mode of analysis presupposes the predominance of material factors over cultural ones—which I reject—and also precludes the possibility, so well brought out by Max Weber, that the three levels are in a constant state of dynamic interaction, rather than in a hierarchy of domination and dependence.
Finally, there is a strong positivist materialism behind the thirst of the Annales school for quantifiable data about the physical world, which even in the immediate post-war period I found impossible to accept without reservations. For example, despite its enormous length, the most brilliant pioneer work of this School, Fernand Braudel’s The Mediterranean in the Time of Philip II, barely mentions religion, either Christianity or Islam.
My 1979 article on the revival of narrative was explicitly intended as a statement of observed fact about the way the profession of history was going, and not at all as a prescriptive signpost for the future. It was designed to bring
out into the open a subterranean drift back to something I loosely—and I now think misleadingly—defined as “narrative.” The paper was taken in many quarters, however, as a programmatic call to arms against social science quantification and analytical history. Agitated defenders, fearful for their turf and
their grants, criticized my alleged betrayal of the good old cause in almost every journal in the profession. More in sorrow than in anger, my old friend Robert Fogel, in his Presidential address to the Social Science History Association, solemnly excommunicated me from that church. In some quarters I
became an instant pariah. And yet in the subsequent few years my prophecy has, I believe, been fully vindicated. Except in economic history, where it still reigns supreme, old-style grandiose cliometric social science history now has
its back to the wall. More humanistic and more narrative approaches to history are indeed growing, micro-history of a single individual or an event is becoming a fashionable genre, and a new kind of political history, now firmly anchored in the social and ideological matrix, is reviving. Even intellectual
history—no longer that dreary “History of Ideas” paper-chase that always ended up with either Plato or Aristotle—has undergone an astonishing transformation and resurgence. All my work has been based on two fundamental hypotheses about how the historical process works. The first is that great events must have important causes, and not merely trivial ones. The second is that all great events must have multiple causes. This eclectic approach towards causation has given rise to a certain amount of negative criticism. Many scholars whose judgement I respect have described the assemblage of a multiplicity of causes for any given
phenomenon as “a shopping list,” the mere unweighted enumeration of a whole series of variables of widely different types and significance. This is true, but an argument for multiple causation can be made on the grounds used by Max Weber. They are convincing, provided that they form a set of “elective affinities,” held together not by mere random chance but by a system
of logical integration that points them all in the same direction and makes them mutually reinforcing. Despite the criticism, therefore, I still adhere to a feedback model of mutually reinforcing trends, rather than a linearly ordered hierarchy of causal factors. I do admit, however, that sometimes I have neglected to show just how this glue of “elective affinities” has in practice worked.
Looking back on it, it is clear that what is peculiar about my intellectual career is that I have never stayed long in one place. Most historians select a single fairly narrow field as their own intellectual territory, and spend a life-
time cultivating that same ground with more and more tender loving care. The advantage of such a procedure is that one becomes the world expert on that patch of turf, building a framework of knowledge, expertise, and experience which is cumulative over a lifetime. I have deliberately followed a differ-
ent course, preferring to roam unusually freely across the historical prairie, although I have confined myself to a single culture, namely that of England, and mostly to a single class, the landed elite. But first, I have ranged over time from the middle ages to the nineteenth century. Second, I have jumped from
topic to topic, from biography to economic history to art history to social
history to cultural history to educational history to family history. Third, conscious from an early age of the provisional nature of historical wisdom, I have moved in a restless quest for theories, concepts, approaches, and models more satisfying than the old, and in methodological inspiration from Marx to Weber to some of the modern American social scientists, first sociologists, then political theorists, and more recently anthropologists.
This drift from century to century, this flitting from topic to topic, and these changes in inspiration have inevitably brought their dangers and defects. First, they have meant working very fast, a process which can lead to mistakes, often minor but sometimes serious. Second, the level of my scholarly expertise in any single topic in any single century in any part of England is
inevitably less than that of one who has spent a lifetime tilling that particular field. Third, the desire to bring order and shape to a complex problem, such as
the causes of the English Revolution or the evolution of the family, has inevitably given rise to over-schematization and generalizations which need more qualification. After all, there are no generalizations, in history or any other discipline, which do not need more qualification. And fourth, the fact that my range of expertise is primarily concentrated upon the elite at the top of the social pyramid has sometimes led to rash and ill-informed assertions about the behavior of the lower classes. These are the reasons why so many more cautious academic reviewers, on receiving a new book of mine, instinctively reach for their pens and write: “There he goes again.” On the other hand, I have been saved—if saved is the right word—from Parsonian functionalism, French structuralism, and linguistic deconstruction, partly by my inability to understand what they are all about, but mainly by a gut feeling that they are too simplistic and must be wrong.
I have always been concerned with public affairs, the effect of which upon the life of learning has taken two forms. First, I have tried to save myself from being trapped in an academic ivory tower by reaching out for a larger audience. This has meant reviewing—often rather critically—a wide range of books for journals with a large national readership. This is of course a high-risk policy that usually brings its punishment, for many of my victims sooner or later find their revenge by savaging a book of mine. In addition to review-
ing in national journals, I have tried to make my books accessible to the general public, by following the production of a large-scale academic study in hardcovers with that of a cheap paperback abridged version. One result of this concern with the world outside academia has been more profound in its consequences. Although it was not clear to me at the time, it is obvious upon reflection that the subject matter of my historical interest in the
past has tended to shift in reaction to current events and current values. My first article, on the life of seamen in the Elizabethan navy, was written in 1942 on board a destroyer in the South Atlantic Ocean. My next enterprise, a book about a crooked international financier, was largely written in the socialist euphoria of the early days of the first British Labor government after the war. The third, on the aristocracy of the late sixteenth and early seventeenth centuries, was researched at a time when that class was in full financial crisis, and when great country houses were being abandoned and allowed to tumble down by the score. My work on students and faculty at universities began in the 1960s during one of the eras of greatest expansion of, and of greatest optimism about, higher education that has ever existed. At that time, I was particularly intrigued by the causes of a similar educational boom between 1560 and 1680. My interest continued, in a more pessimistic vein, after the student troubles of 1968-70, and after the period of heady expansion and affluence had come to an abrupt end. My attention thereafter has been fo-
cused on the causes of the dramatic decline of enrollments in grammar schools, universities, and Inns of Court between 1680 and 1770. My book on the family, sexuality, and marriage was conceived and written in the 1970s, at a time of heightened anxiety about just these issues, provoked by rocketing divorce rates, sharply declining marital fertility, much greater sexual promiscuity, changes in sex-roles caused by the women’s liberation movement, and the abrupt rise in the proportion of married women in the labor force. An Open Elite was written at a time when the demise of the great landed families, and their role in both the rise and the fall of British greatness was reaching a crescendo of public interest, for example in the phenomenal success of the television version of Evelyn Waugh’s Brideshead Revisited or of Mark Girouard’s book, The English Country House. It was begun when the elite who lived in these houses was thought to be in its death-throes, and when critics were blaming English contemporary decline on the absorption of the sons of Victorian entrepreneurs into the idle life-style and amateurish value system of the entrenched landed elite. Although I was not aware of it at the time, I seem to have been constantly
stimulated by current events into diving back into the past to discover whether similar trends and problems have occurred before, and if so how they were handled. Whether this makes for better or worse history, I do not know. A serious danger in such a present-minded inspiration for historical inquiry, however unconscious, is that the past will be seen through the perspective of the future and not in its own terms. There is a clear risk of Whiggish teleological distortion if the main question uppermost in the mind of the historian is
how we got from there to here. On the other hand, it is just this explanation of the present that is the prime justification for an interest in history. The main safeguard from teleological distortion is to keep firmly in mind that people in the past were different from ourselves, and that this difference must always be investigated and explicated. The further safeguard is always to bear in mind that there is a contingency factor in history, a recognition that at all times there were alternative possibilities open, which might have occurred but in fact did not: the Cleopatra’s nose principle, if you will.
This then, for the time being, is the end of my chequered odyssey through the life of learning. I have constantly been under attack from ogres, dragons, and sea-serpents; I have several times been seduced by attractive-seeming sirens; I have made mistakes of navigation, which at least once brought me close to shipwreck. Although I have survived and sailed on, I have not yet set eyes upon the shores of Ithaca. But the story is not, I hope, yet over.
1986
MILTON V. ANASTOS
Professor Emeritus of Byzantine Greek and History
University of California, Los Angeles
I began my academic career most inauspiciously by being detained after school in the first grade. My offense was that, when called upon to read, I could never find the point at which my predecessor in the reading lesson had left off. Then I aggravated the situation by a similar failure after school. I do not now recall how many times I had to submit to this humiliation. Nor can I understand why I did not have the wit to point out in my defense that I had learned to read with ease long before entering the first grade. My trouble was that, as soon as the class started to read about the enthralling adventures of Jack and Jill, Fannie and her apple, the house that Jack built, and the rest, I raced on excitedly to the end of the tale, so as to discover for myself how these noble characters had made out in their perilous confrontation with the universe. Consequently, when my turn came to read, I was completely lost and had no idea of what part of the story had been left to me. As I ponder this lugubrious chapter in my history, I cannot remember how I explained my ignominy to my mother. She was a generous and kindly lady, who, however, was altogether intolerant of academic derelictions on the part
of her offspring. Had she known of the undeserved suffering I had been undergoing, she would have stormed upon my schoolhouse and torn the place apart—brick by brick. So much, then, for my agonies in the first grade.
But at least I was never left back in kindergarten. This ignominious fate, candor compels me to point out, overtook my dear wife, Rosemary Park, that peerless college president and vice-chancellor of UCLA. Despite her virtuosity in such kindergarten exercises as Weaving—Over one, under one; over two, under two, etc.—that poor girl got stuck in kindergarten for two years. Of course, she had an excellent
excuse, as who does not? Her parents had decided that she should repeat kindergarten, so that she might watch over her brother and sister, who were too young to be trusted to make their way across the streets without guidance.
So far, I have nothing to report that does me credit. Unfortunately, embarrassment continued to haunt me during my freshman year in high school. In the first place, since my friends had warned me that Latin was very difficult and should be avoided, I chose French as my only foreign language. In doing so, I violated strict orders from home that I was to take four years of Latin. When parental wrath descended upon me for this disobedience, I pleaded that it was too late to make any changes in my curriculum and that Latin would have to be sacrificed. My mother then threatened to go to school herself to straighten the matter out. Naturally, I fought bitterly against this pernicious suggestion. But, in the end, I lost, and had to submit to the excruciating pain
of witnessing a maternal visit to my homeroom teacher, who, of course, readily consented to inflicting the study of Latin upon me. Somehow, I managed to survive this painful ordeal. Nevertheless, despite the most valiant efforts by my parents, who even sank so low as to offer me all
manner of bribes, I could not be induced to pay serious attention to my studies. As a result, I finished my freshman year in high school with a dismal average in the middle eighties. I was somewhat discomfited by this melancholy performance, but not greatly concerned until I learned that a classmate for whose intellectual gifts I had scant respect had the highest ranking in the
school. That was a shattering blow for me, and I spent a good part of the ensuing summer calculating how many subjects I needed and what grades I should have to make to catch up and take the lead myself. This in time I managed to do, and I must confess that the spirit of contrariety engendered by my resentment over my fellow student’s success in getting higher grades than I, was, somewhat perversely, a major factor in shaping my career and convincing me that I actually enjoyed studying and was
eager to strain myself in doing so. By the time I entered Harvard College in 1926, I had overcome my aversion to Latin and was easily persuaded by my freshman adviser to concentrate in classics. Here my experience differs from that of Professor Lawrence Stone, my illustrious predecessor as the Charles Homer Haskins Lecturer, who, you may recall, resented the time and energy he spent in mastering what are called the “dead” languages. For my part, [ have now been studying and using Greek and Latin steadily in all my research for just about 60 years, and have never ceased to find that they demand my most strenuous efforts and are the source of my most abiding satisfaction.
Though deeply devoted to the classics, I had long intended to study law,
and entered the Harvard Law School with the class of 1935. I found the common law extremely congenial, but soon wearied of the callousness of many of my fellow students, who used to argue that, as prospective lawyers, we should concern ourselves, not with justice, but with nothing except the law. I am not certain whether they really believed this to be a sound rule of conduct or whether they argued this way out of sheer contrariety. Anyhow, I was annoyed by their attitude and I transferred to the Harvard Divinity School, not with the intention of entering the clergy, but seeking instruction in the theological sciences, especially in ecclesiastical history. It
was at this juncture that I had the good fortune of becoming intimately acquainted with a number of brilliant personalities who exerted a profound influence upon my future. At the Divinity School, as in the Department of Classics, classes were in general very small, rarely exceeding fifteen or twenty students, so that we had the advantage of direct personal guidance from our instructors.
I am greatly indebted to a number of these altogether extraordinary scholars. In my earlier years as a graduate student, my chief friend and advocate was Harry Austryn Wolfson, Nathan Littauer Professor of Hebrew Literature and Philosophy, and a prodigious worker of enormous learning, the
author of many outstanding books. Uncle Harry, as I called him, entered Widener Library almost every morning with the cleaning staff at seven and stayed all day, except for brief interludes for his meals and an occasional foray to Boston to see a movie, preferably a double feature. He was never too busy
to be consulted on any problem—imaginary, personal, or academic. Then there was Robert H. Pfeiffer, Hancock Professor of Hebrew Literature, the most amiable of men, whose Christmas parties for students and colleagues are legendary and still warmly remembered. He was the victim of a cruel injustice which for many years denied him the professorial rank he richly deserved. But
eventually he prevailed, to the great joy of those of us who had fought for him, sometimes at great risk to ourselves and our careers.
But in our personal devotion and feeling of gratitude to our instructors lurks a great danger which I mention now with some trepidation. Young students in their enthusiasm are often exploited by their seniors. I myself suffered two frustrating experiences, to which I must briefly allude. Both of the professors involved in these episodes are now dead, and I suppress their names in this chronicle. De mortuis nil nisi malum. For one of them I wrote a whole book, consisting of texts (which I either constituted myself or revised and re-edited) and translations. These were brought together in a large and
impressive volume, in which I am mentioned briefly in the preface without any acknowledgment of the extent of my contribution. The alleged author took over my work in its entirety, “jazzed up” my translations (as he told me orally) without reference to the original texts, added a few brief notes, and took the whole credit for the volume for himself. For the second of this pair of plagiarists I worked about four whole years collating and checking Greek manuscripts to establish a critical text, but got no word of recognition except for one brief sentence in the preface. Perhaps I may be forgiven if I remark that these few words, though gracious enough in themselves, were the very least reward that could have been offered for my long and selfless services, especially in view of the fact that many of our colleagues were aware of the extent of my efforts and had become restive about this kind of academic exploitation. These were my first publications, and I now list them in my bibliography as
“works written in collaboration with other scholars.” Indubitably I learned a great deal in carrying out these assignments, but at enormous cost. Many young scholars have had to contend with this problem. In some European centers of research whole edifices have been constructed for senior professors by their assistants, who received only the most meager compensation for what were truly monumental achievements. In the United States a notorious professor at a great university—not Harvard or UCLA— published under his own name an entire series of books which had been written by his doctoral candidates. One of these, now a colleague of mine at UCLA, got wind of what was being done with his dissertation and made a loud outcry until his name was added to the title page. I hope and believe that
the present generation of students has been able to protect itself against plagiarism and outright theft of this kind. It was at the Divinity School that I decided I should devote my life to patristic studies and Byzantine intellectual history. The commitment was twofold. In the first place, it seemed to me that the Byzantine field had been less intensively cultivated than many others that I found appealing and therefore offered opportunity for original research. That was nearly fifty years ago; and the situation has changed radically since then, although, of course, a vast amount of work still remains to be done, especially in the editing and expounding of texts, a large number of which have never been published. In the patristic field, I felt that many topics had been dealt with improperly. For example, I was convinced that the theologian Nestorius had been unjustly condemned by the oecumenical councils. He was bishop of Constantinople (428-31) and had been attacked by Cyril, bishop of Alexandria (412—
44), for dividing Jesus Christ into two persons, the man Jesus and the divine Logos, who was the Son of God. But Nestorius constantly insisted that he had never been guilty of so heinous an error, which would have amounted to
introducing a fourth member into the Trinity.
After reading his book, the so-called Bazaar of Heracleides, in which he repeatedly defends himself against this charge, I concluded that he was the victim of both contrariety and personal animosity. Cyril was determined to denounce him as a heretic in order that Alexandria might prevail over Constantinople and took advantage of every opportunity to do so. Actually, both Cyril and Nestorius were guilty of ambiguity in the use of technical terms; and Nestorius, it must be conceded, was guilty of unconscionable prolixity and obscurity. But a review of the evidence in my opinion indicated that both theologians meant to be what we call orthodox and would have conceded that Jesus Christ, as the orthodox maintain, had two natures, one divine and one human, joined in indissoluble union in one person. In a lengthy article, I have, I believe, proved that Nestorius was as loyal to this principle as Cyril was; and, moreover, that he described the union of the two natures in one person in an exemplary orthodox fashion. For he says that the human Jesus "received his prosopon [i.e., person] as something created in such wise as not originally to be man but at the same time Man-God by the incarnation [enanthropesis] of God" (Bazaar, 1, 1, 64, p. 60; cf. 92.1 f., 237). This
is an extremely subtle description of the oneness of Jesus Christ, and shows that Nestorius conceived the Man-God to have been the divine Logos, plus what would have become the separate individual man Jesus, if the Logos had not been united with him from the moment of conception. For the child born of the Virgin was at no time, Nestorius states, a separate man but "at the same time Man-God."
Many critics, at least, have found my analysis of Nestorius's Christology to
be persuasive. But in discussing this question with me one day, the archbishop of Athens said with the utmost courtesy, “Professor, your argument is
very learned and undoubtedly sound. But why, then, did the Holy Spirit anathematize Nestorius as a heretic?” To which I replied, “For a very simple reason. The Holy Spirit never read his book!” After getting the degree of S.T.B. from the Divinity School, I went on to a
Ph.D. in history, engaging for my thesis in a truly delightful exercise in historical legerdemain, whereby I attempted to demonstrate not unsuccessfully, I believe, that the discovery of America by Columbus in 1492 was
achieved indirectly in part by Columbus’s reliance on certain texts of the geographer Strabo as excerpted and presented at the Council of Florence in
1438-39 by George Gemistus Pletho, a learned Byzantine gentleman, who was
widely recognized in western Europe to be the greatest scholar of his day. My doctorate was awarded in 1940, a year in which the academic market for
Byzantine history proved to be dismal and hopeless. Then, most unexpectedly, the kind of cosmic savior I had been reading about in the later Greek philosophical texts loomed in the form of Dumbarton Oaks, a new research center devoted to Byzantine civilization which had been endowed by former Ambassador and Mrs. Robert Woods Bliss. By great good fortune, it had been set up as a department of Harvard University, and my appointment as fellow and subsequently professor of Byzantine theology in this academic paradise was indubitably one of the major turning points in my career. Many have benefited from this extraordinarily munificent gift, but I doubt that anyone has been a greater beneficiary than I. The whole idea of the institution was made to order for me, and I am deeply obligated to it for the opportunity
it gave me for nearly twenty years to do my research and writing without interference or interruption. At first, we had a library of less than 10,000 volumes, mostly concentrated in the arts. For many years, therefore, I spent over half of my time locating desiderata throughout the world and persuading the administrative authorities to acquire them. This bibliographical acquis-
itiveness of mine had a number of far-reaching consequences. In the first place, obsessed by my passion for books, I soon began to realize that it would be prudent for me to buy as many of the pertinent materials as I could afford myself. My idea was, and is, that Byzantine civilization can be traced back to Homer, and that the subject embraces every branch of learning—art, science,
economics, history, literature, theology, political theory, philosophy, law, magic, and so on. Acting on this theory, I have now amassed a personal library of some 50,000 volumes, which is esteemed by many to be one of the great such collections in the world. Secondly, I have been attempting to build up a complete bibliography of
the entire Byzantine field. At present, I have approximately two or three hundred thousand bibliographical slips, suitably classified by subject. In earlier days, by the grace of God, the University of California, the work-study program, and a well-disposed vice-chancellor—not, I should add, my wife, who has always been a completely disinterested person—I have had the col-
laboration of as many as five, six, and more of my best students. Now, however, under the present economic and political conditions, I have been reduced to only the barest minimum of assistance. We desperately need funds
to complete the bibliography, and computerize it so that it can be made available to scholars and institutions throughout the world.
Furthermore, my conception of Byzantium has led me to prepare a large-scale intellectual history of the Byzantine Empire in all of its aspects, which I call the Mind of Byzantium (MOB), and which in typescript some years ago
was estimated by representatives of the University of California Press to amount to between four and five volumes. I have already published an abridged version of it in three substantial parts of the National History of Greece. This is in Greek, and I am resolved by the end of this calendar year to
finish the first volume (in English) on the legal position of the Byzantine Emperor and his relation to the Church. In descanting so lengthily on Byzantium, I must not pass over the war. On
the day of Pearl Harbor, I made an attempt to enlist in the Navy, and was unceremoniously rejected by a yeoman because my eyes failed to meet naval standards. Then, after some vicissitudes, I joined the Office of Strategic Services (OSS), hoping that my physical defects would be waived by this department of the armed services. After many delays, I finally was examined for a post as an intelligence officer in a parachute division. Then, as a crowning infamy, the examining physician said, "Boy, if we dropped you in a parachute, you would split! You can serve your country here in Washington." Then, I was taken on by SI (Special Intelligence) and, after a few months, transferred to R and A (Research and Analysis), headed by William Langer, a great master of many fields of learning, who exerted enormous influence over the entire operation. Here, I was put in charge of what was designated as the "Greek desk," and found that the organization was dominated by colleagues from the universities, a large number of whom were historians, economists, archaeologists, and philologists, whose specialized training in a great variety of fields fitted them uniquely to provide the kind of analysis and information that were desperately needed for the conduct and planning of the war. At one point, for example, when the Third Army of the United States was laying plans to cross the Rhine, I was asked to compile the available information on the Byzantine Emperor Julian II's crossing of the Rhine in 359, when he was Caesar. This was easily done, and I have often wondered whether these data were made available to General George Patton in the vicinity of Oppenheim on the eve of March 23, 1945, when he managed to slip across the
Rhine, “without,” as his aide announced at Supreme Headquarters Allied Expeditionary Force (SHAEF), “benefit of airborne drop, without benefit of the United States Navy or the British Navy and not having laid down the greatest smokescreen in the history of modern war.” This was one of the turning points of the final assault upon the Nazis, combined with the capture of the Ludendorff railway bridge at Remagen a
few days earlier, on March 7, by the Ninth Armored Division of the First Army commanded by Courtney Hodges. Thus both the First Army of the United States and the Third succeeded in making their way across the Rhine before the British (who did not succeed in doing so until March 24), despite Churchill's erroneous statement in a prerecorded speech that the British had been the first to surmount this obstacle.¹ Many profited greatly intellectually and spiritually from their service in the war. I doubt, however, that I gained very much personally except that I was forced to devote attention to the Mediterranean area as a whole and to probe further into Greek history, politics, and philology. Moreover, I had an opportunity to observe at first hand how professors in a huge bureaucracy were able to serve the Muses and civilization at the same time. I must also have acquired some fluency in writing in the course of preparing innumerable reports and struggling to combine mountains of disparate facts into some kind of logical unity. On a somewhat different plane I learned in the interminable exchange of memoranda and critical reviews that it is actually possible to recognize personal differences of style. Many have been skeptical of this kind of criticism, which has long been a favorite tool in literary and historical analysis. But, on more than one occasion I saw empirical proof of its validity.
¹ Omar N. Bradley and Clay Blair, A General's Life: An Autobiography by General Omar N. Bradley (New York: Simon and Schuster, 1983), 404, 411, 413; Ralph Ingersoll, Top Secret (New York: Harcourt, Brace and Co., 1946), 32 f.; Charles B. MacDonald, The Last Offensive (The U.S. Army in World War II: The European Theater of Operations, 9; Washington, D.C., 1973), 213 ff., 267 ff., 303 ff.
After many interesting experiences in the OSS, when it became evident that the United States was not going to invade the Balkans, I was able to return to Dumbarton Oaks and Byzantium, though I continued to serve a few months longer as a consultant. At Dumbarton Oaks I had the great advantage of associating with a number of remarkable gentlemen. The most notable of these was Albert Mathias Friend, Jr., who had the unique position of being Director of Studies at Dumbarton Oaks, a department of Harvard University, at the same time that he was Marquand Professor of the History of Art at Princeton University. Friend was an unusually gifted Byzantinist and had a genius for stimulating research both at Dumbarton Oaks and at Princeton. He himself never published more than about five articles, but he inspired a large number of projects, primarily in the field of Byzantine art and archaeology. Although he never completed any part of the great work he had planned, he gave the impetus to others who carried out the schemes he had adumbrated. He had an
encyclopaedic and unexcelled knowledge of just about all of the relevant monuments. But his skills lay in imparting ideas and providing trenchant criticism. He spent a great deal of time, at the expense of his own projects, in encouraging others. In any academic institution, especially in a relatively small and highly specialized one like Dumbarton Oaks, there are always sensitive prima donnas and bruised egos; and Friend regularly made the rounds among scholars and staff to buoy up their morale. As he said to me on one occasion, after a particularly heroic effort in patching up a wounded spirit, “You can’t get milk except from contented cows.” His greatest contribution was in per-
suading the Harvard Corporation to organize Dumbarton Oaks as a full-fledged department of the University with the regular professorial ranks, awarded on the usual basis. I owe a great debt to Friend, as well as to Carl Kraeling, Ernst Kitzinger, and John S. Thacher, all of whom played an important role in the direction of Dumbarton Oaks, the first three as Directors of Studies, the fourth as Administrative Director. At Dumbarton Oaks, having been released from the OSS, I now redoubled my efforts to make real headway with MOB. In addition to the incomparable privilege of my association with Dumbarton Oaks, I am greatly indebted for grants to the ACLS, the Guggenheim Foundation on two occasions, and the Fulbright fellowship program. By 1964 I had been a member of the Harvard community for thirty-eight rich and satisfying years, when, to my own surprise and that of my friends, I was persuaded to accept an invitation to UCLA. The University has provided admirable facilities for research and has enabled my colleague, Professor Speros Vryonis, and me to establish a flourishing school of Byzantinology, which has produced eight doctoral dissertations and played an important role in several more. Though delighted with California in every way, we remain
deeply devoted to Harvard, to which we fondly refer as the UCLA of the East. In the remaining moments I should like to sketch briefly some of the major features of my concept of Byzantine civilization. One aspect of my work has
been an attack on the paradoxographers. There is a type of scholar that delights in paradoxes born of little more than contrariness. Their method is very simple. In ancient and medieval history, for example, they accumulate and analyze all of the extant sources on a given problem or historical event, eliminate obvious errors and contradictions, and then summarize what seems to be a series of inevitable inferences based upon these data. One would think
that historians could safely rely on these results. But the paradoxographers, either out of some hope of coming up with a more original solution, or out of
sheer contrariety, from which who among us has not suffered, deny the validity of the seemingly logical conclusions and attempt to upset them with new and usually radical hypotheses. One of the most striking examples of how the paradoxographers work is the account of the Emperor Constantine's conversion to Christianity as related by two contemporary historians, Eusebius (writing in Greek) and Lactantius (in Latin), both of whom were personally acquainted with Constantine. The former wrote a panegyrical biography (usually referred to as the Vita) of the Emperor, who himself supplied his biographer with many important details, most notably on his famous vision and dream. The latter was tutor of the Emperor's eldest son, Crispus.
In the Vita (1, 28) Constantine is said by Eusebius to have sworn that, about noon, he and his soldiers saw in the sky the trophy of a cross of light bearing the inscription touto nika, "by this [sign] conquer." That night, in a dream (1, 29), Christ appeared to him with the same sign which he had seen in the sky during the day, and commanded him to make a replica of this cross for use in combat against his enemies. This Constantine did, making a banner in
the shape of a cross (i.e. the labarum), and surmounting it with a symbol consisting of the initials of Christ, probably in the form ☧. This is the so-called Christogram, which, Eusebius says, Constantine wore on his helmet and displayed on all of his battle flags (1, 30 f.). Lactantius adds that this device was inscribed upon the shields of Constantine's troops (On the deaths of the
persecutors, 44, 5). It occurs frequently on Christian coins and many other monuments. On the strength of these passages, especially the testimony of Eusebius, it had generally been assumed that Constantine had been converted to Christianity on that fateful night of October 28, 312. But the paradoxographers got
to work on this incident, denied that Constantine had been converted in the manner described by Eusebius, and invented the elaborate theory that the whole account of Constantine's conversion was interpolated into the text of Eusebius by pious historians in the early fifth century. They have no textual evidence to support this hypothesis, nor do they explain how or why both Eusebius and Lactantius should have been in-
terpolated in more or less the same way. What is more, and this is the chief argument I would advance against the paradoxographers, there is substantial archaeological evidence which amounts to a complete vindication
of the Eusebian and Lactantian texts as I have summarized them. This is to be found in a series of coins which were struck between the years 350 and 353.
The first group of these was issued by the usurper Magnentius (350-53) and
his son (Decentius), who put out a number of coins showing the labarum bearing a Christogram. This iconography indicates their eagerness to conciliate the legitimate emperor, Constantius II, the son of the Emperor Constantine I, both of whom used the Christogram on their coins. Magnentius even went so far as to inscribe the name of Constantius on his coins to advertise widely that it was his ambition to be the latter's colleague on the throne rather
than his rival.
Later on, however, in 353, Magnentius adopted the large Christogram on the reverse of his coins, which was copied from a coin struck by Constantius, in order to indicate that it was he, Magnentius, not Constantius, who was the defender of orthodox Christianity as defined in 325 at the Council of Nicaea against his rival Constantius, who had espoused the cause of the Arian theologians opposed to the Creed of 325. But, most importantly, the use of the Christogram itself gives proof that Eusebius's description of Constantine's
conversion was known in the Latin West in 350, only ten years after the historian's death, and was not, therefore, the invention of a pious Christian writer of the fifth century. This conclusion is reinforced by the coinage of the usurper Vetranio, who was Emperor from March 1 to December 25, 350. The interesting point here is that Constantia, Constantine I's daughter, fearful that the usurpation by Magnentius would imperil her brother Constantius's hold on the imperial power, persuaded Vetranio to assume the purple. At the time, her brother, the Emperor Constantius II, was occupied in fighting the Persians in the East and, under the circumstances, consented to Constantia's request that Vetranio be invested with the diadem. Here again the coins are of decisive significance. For some struck in 350 by
Vetranio and Constantius II, as well as in 351 by Gallus (Constantine's nephew, who married Constantia), used as a reverse legend the words HOC SIGNO VICTOR ERIS ("by this sign you shall be the victor"), that is, the Latin translation of the original Greek, touto nika ("by this [sign] conquer"). What is most remarkable is that Vetranio, a simple soldier, who could neither read nor
write, thus succinctly memorialized in his coinage the leading idea of Eusebius’ chapters on Constantine’s conversion. Obviously, Vetranio had not read Eusebius in the original Greek, nor in any Latin translation, if there had
been one in his day. The obvious explanation is that he learned of this whole
episode from Constantine’s daughter, who undoubtedly had heard of her father’s vision and dream from his own lips. In sum, far from being a fabrication of a theologian of the fifth century, the story of Constantine’s conversion was a familiar one in the latter’s family. Thus the coins add a new dimension to one of the most memorable events in ancient history. Of course, this is not to say that Constantine had ever actually had this vision and dream. But he not only claimed that he had had it and was not reticent in discussing it with his relatives and friends but also heeded the advice he had been given in this supernatural way. On the other hand, given the fears and anxiety he must have felt in preparing to meet a mortal enemy in combat, there is every reason to believe that he really underwent the experiences Eusebius reports. A great deal of scholarly research indubitably comes about in this rather complicated way. One group of scholars, either out of sheer contrariety or even because they truly believe they have good reason to reject what they take to be the conventional and uncritical publications of their predecessors, sets out in search of more original results, which they present so provocatively as
to stimulate the opposite reaction from the contrary-minded, who in turn attempt, with equal fervor, to reverse their immediate precursors and return to the status quo ante. The dispute about Constantine's conversion is a good example of this kind of contrariety. A similar one has developed over the famous Edict of Toleration promulgated in 313 by Constantine and Licinius. For generations historians had been content to accept this enactment at its face value. But then, in 1891, Otto Seeck published a paper entitled "Das sogenannte Edikt von Mailand" (Zeitschrift für Kirchengeschichte, 12 [1891], 381-386) in which he argued that
this document was not an edict, was not promulgated in Milan, and was not by Constantine. He was then joined by a multitude of other paradoxographers, who attacked the traditional view on all fronts, and have been seduced by the temptation of attempting to prove that, despite his friendly disposition towards the Christian church, Constantine did not issue the Edict of Milan in 313 but that Licinius, whom Eusebius condemns as a persecutor of the Christians (Ecclesiastical History, 10, 8, 819), did. This is a titillating conceit, height-
ened by the additional paradox that it is claimed that Constantine did not
even participate in the Edict.
I have devoted an article of thirty pages to this problem and can only touch upon a few major points this evening. My chief argument is derived from the
two sources for this so-called Edict, which in its influence ranks with the
Declaration of Independence and the Constitution of the United States. For in it the emperors, Constantine I (306-37) and Licinius (308-24), who are specifically named as its authors, granted to the Christians and all others "the right to follow freely whatever religion they wished," so that, as they put it, "whatever divinity there is in heaven might be favorable and propitious to us and to all our subjects." This pronouncement occurs in virtually identical form in both Eusebius (Ecclesiastical History, 10, 5, 4) and Lactantius (On the deaths of the persecutors,
28, 2-12). Despite close agreement on all essential matters, there remain enough minor discrepancies between Eusebius’ Greek and Lactantius’ Latin to demonstrate that Eusebius’ source could not have been the Edict as found in Lactantius or vice versa. Hence, we have two independent witnesses that corroborate each other most impressively on all the principal points at issue, although neither was copied or transcribed from the other. The excerpts from Eusebius and Lactantius that I have summarized prove beyond doubt (a) that the Edict was issued by both Constantine and Licinius and (b) that their versions of it, as posted individually and separately by the two emperors in their respective jurisdictions, must have been identical or nearly so. Otherwise, Eusebius would not have included it in what he specifically designates as the laws of both Constantine and Licinius. Nor would both emperors have stated in so many words as they do (“I Constantine Augustus and I Licinius Augustus”) that they had actively collaborated in the project. Thus, these passages from Eusebius and Lactantius make it altogether impossible to deny that Constantine was one of the authors of this ordinance, or that he had published it as a law for the portion of the empire over which he ruled. These conclusions follow inevitably from the opening sentences of the
Edict (as I have quoted them). It is difficult to imagine how Constantine could have discussed religious freedom at Milan, as he says he had, and then drafted, or assented to, a law couched in the terms described, as both Eusebius and Lactantius agree that he did, without enacting it in his own name for his part of the empire. It is much more likely that Constantine arranged the conference at Milan, as well as the matrimonial alliance between his half-sister, (another) Constantia, and Licinius, at least in part so as to win over his imperial colleague to his own policy of religious toleration. Whether this was really his aim or not, it is inconceivable that Constantine, the first and greatest imperial benefactor of the Christian church and its most influential patron in the early centuries, apart from its Founder, could have failed in his own realm to promulgate this
great charter of Christian liberty which explicitly and systematically enacted into law the principles of which he was the most notable imperial exponent. Finally, I should say, I am content to rest my case with what I take to be the incontrovertible fact that Constantine promulgated the Edict of Toleration in his own realm, and avoid going into other more technical details such as (a) whether this form of Constantine's legislation could be defined legally as an Edict, as I think it was, and (b) whether it was published by Constantine in Milan or elsewhere. But enough has been said, perhaps, to indicate what wide scope historical research offers for contrariety, which, however, in its final outcome, when carried through to the end, is constructive, and leads the way, despite some obfuscations and not a little irritation on both sides over what often seems to be perversity and wrongheadedness, to positive results. The major part of my work so far, aside from the considerable portions of MOB published in the National History of Greece, has been in longish articles and monographs devoted to solving what I have taken to be key problems of Byzantine intellectual history. In many ways the most significant of these was a paper devoted to the theologian Basil's treatise Against Eunomius. Basil, as you know (c. 330-79), was one of the leading orthodox champions against the heretical views of the Arians, who were condemned for their belief that Jesus Christ, the Son of God, was not co-eternal with the Father but was subsequent to him. Basil has always been cited as one of the principal defenders of the orthodox doctrine of the co-eternity of the Father and the Son. Indeed, in his critique of Eunomius, he makes a special point of arguing that the "Son always existed, and never had a beginning of being" (C. Eunom., 2, 12). In my analysis I carefully drew attention to Basil's oft-repeated denunciation of the Arian propositions and his vehement affirmation of the orthodox principles. But then, in the course of reading the proof, I discovered to my amazement and great chagrin that Basil had also in this same treatise maintained that the Son got the beginning of his being from the Father and was second to the Father. These texts are in flat contradiction to his previous statements and
do not differ from the Arian position on this subject. In other words, this extraordinary inconsistency on Basil’s part, discovered by chance in preparing a lecture for a symposium, compels a re-examination of his reputation as a leading exponent of Byzantine orthodoxy.
Preoccupied quite properly as we are with the origin and production of learned tomes, we should not ignore what is in many ways the most essential of our scholarly tasks. That is, we are bound to keep in mind that our chief function is to disseminate the fruit of our research not only by original works
of scholarship but also by instruction as presented to our students. Our
lectures should in a real sense enlighten them by giving them more than mere facts and information. We must remember Thomas Jefferson's ideal of public education and seek in all of our teaching by form and manner, if not in actual words or precepts, to remind our students of the scope and ideals of the free society of which we are among the principal custodians. Permit me to illustrate this aspect of the pedagogical process by my lectures on the Roman law. In expounding the legal principle, Princeps legibus solutus est (as set forth in Dig. 1. 3. 31 and elsewhere), I pointed out that the Roman and Byzantine emperors were exempt from the laws. But the President of the United States is not. I gave this course over a period of years and made the
same comment on the meaning of the text each time, but with particular poignancy during the period of the Watergate scandal. Similarly, I have regularly reminded my classes that, despite the despotic and arbitrary power of the Byzantine emperors, who abused the rights and property of their subjects whenever they chose, it was Justinian’s Digest in 533 that enunciated the cardinal principle of the sanctity of a man’s home, which many have thought to be of English origin. The Digest (2. 4. 18) quotes Gaius as saying that “the majority have thought that it is not lawful to summon a person [to appear in court] from his own home because ‘a man’s home is his most secure shelter and refuge’ (domus tutissimum cuique refugium atque
receptaculum), so that anyone who should cite him out of it is held to be using violence.” An equally precious bulwark of freedom, the writ of habeas corpus, which has been described as “the most important single safeguard of personal liberty
known to the Anglo-American law," is patterned upon the Roman "exhibitory interdict" (Dig. 43. 29. 1 pr. 1) "de homine libero exhibendo," according to which, "quem liberum dolo malo retines exhibeas," i.e., "bring forth the freeman whom you are unjustly detaining." In Roman law this remedy applied to sons or slaves wrongfully held under restraint by unauthorized persons. In the common law, as further defined by numerous statutes and safe-
guarded by the constitutions of the federal government (1. 9. 2) and of the several states, the writ of habeas corpus enables a person held in prison by the police to demand that he be produced in court without delay, and that cause be shown for his detention. The same writ is used in cases involving custody of children and guardianship. Interesting as they are, however, these texts from the Roman law, I have pointed out, do not reaily mean that the Byzantine Empire in practice recognized what the Fourth Amendment to the Constitution of the United States
describes as “the right of the people to be secure in their persons, houses,
papers, and effects against unreasonable searches and seizures." For in Roman
law, the rulings I have quoted had reference to litigation between private citizens and not to criminal law. Actually, the Corpus Iuris did not concern itself with what we call civil liberty or civil rights, which are guaranteed by the common law, by Amendments 10, 13-15, 19 and 24 of the U.S. Constitution
and by a succession of decisions by the various U.S. courts that rank high among the glories of our civilization. In Byzantium, the emperor could, and often did, ignore the rights of his subjects, and was not bound to respect what we call "due process of law," which, by the common law, as confirmed by the Fifth, Sixth, and Fourteenth Amendments of the U.S. Constitution, protects the life, liberty, and property of us all, and applies to every form of property and every personal, civil, and political right. To summarize, in conclusion, my ideal of scholarship as ultimately transcending all contrariety, let me say that true learning, however it is attained, illumines more than just the intellect. It has a moral, a political, and even, I
should like to add, a democratic goal. Or, in the words of the poet of the Harvard Tercentenary in 1936:
Light that is light lights not the mind alone,
Light that is light . . . lights the whole man.
1987
CARL E. SCHORSKE
Professor Emeritus of History
Princeton University
My first encounter with the world of learning took place, if family account is to be believed, when I entered kindergarten in Scarsdale, New York. To break the ice among the little strangers, my teacher, Miss Howl, asked her pupils to volunteer a song. I gladly offered a German one, called "Morgenrot." It was a rather gloomy number that I had learned at home, about a soldier fatalistically contemplating his death in battle at dawn. The year was 1919, and America's
hatred of the Hun still ran strong. Miss Howl was outraged at my performance. She took what she called her “little enemy” by the hand and marched
him off to the principal’s office. That wise administrator resolved in my interest the problems of politics and the academy. She promoted me at once to the first grade under Mrs. Beyer, a fine teacher who expected me to work but not to sing. Was this episode a portent of my life in the halls of learning? Hardly. But it was my unwitting introduction to the interaction of culture and politics, my later field of scholarly interest.
I
When I taught European intellectual history at Berkeley in the early 1960s, I devoted a portion of my course to the way in which the same cultural materials were put to different uses in different national societies. One day, I gave a lecture on William Morris and Richard Wagner. The intellectual journeys of these two quite dissimilar artist-thinkers involved stops at many of the same
cultural stations. Morris began by using Arthurian legend to champion a
religion of beauty, then became an enthusiast for Norse mythology and folk
art, and ended a socialist. Wagner traversed much the same itinerary as Morris, but in the reverse direction, starting as a social radical, then reworking
Nordic sagas, and ending, with the Arthurian hero Parsifal, in a pseudoreligion of art. In the midst of delivering my lecture, I suddenly saw before me a picture from my childhood that I thought to be by Morris. (The picture proved to be the work of George Frederick Watts, then close to the Pre-Raphaelites.) It was Sir Galahad, a painting that hung in color reproduction on the middle landing of the staircase in our family’s house. Here was a beauteous knight in the best Pre-Raphaelite manner: a figure in burnished armour with a sensitive, androgynous face, mysteriously shrouded in misty bluish air. After the lecture, I recalled how my mother loved that picture, how indeed she loved Morris’ Defense of Guenevere, and the literature of the Victorian medieval revival from Scott onward. Not so my father. He poured contempt
on that feminine Sir Galahad. Now Wagner's Lohengrin or the Nibelungenlied—that was a medievalism he could embrace. Father not only loved Wagner's music, he believed in Siegfried, the sturdy mythic socialist, as inter-
preted by G. B. Shaw in “The Perfect Wagnerite,” and in the anti-feminist
interpretation of Wagner of that curmudgeon radical, H. L. Mencken. Mother accorded a hard-won tolerance—no more—for the Teutonic longueurs of Wagner's operas, but none for the abrasive virility of Mencken or my father’s Shaw. Recalling hot parental arguments on such matters, I suddenly realized that,
in contraposing Morris and Wagner in my teaching, I had hardly left the family hearth. Freud would say that, here in the midst of my professional work as a historian, I was addressing in sublimated form a problem of the family scene. In any case, the episode brought home to me the power of my family in shaping the cultural interests and symbolic equipment with which I came to define my life. As far as I know, my parents had no deliberate idea of pushing me toward an academic career. Autodidacts both, they respected learning, but what they cultivated was not scholarship but a kind of natural intellectuality. The concerts, theaters, and museums that were their recreation became the children's education. They fostered our musical interests not just with private lessons but by taking us with them into their choral societies. On my father's two-week vacations we went by rail and ship on intensive sightseeing trips: to New England historic sites such as Concord or the old ports of Maine; Civil War battlefields, where my grandfather had fought in a New York German
regiment; the great cities of the East and Midwest from Philadelphia to St. Paul.
Along with all the elite cultural equipment, my parents introduced us children, through their lives as well as by precept, to the realm of politics. My father, son of a German-born cigar-maker, inherited the radical propensities that went with that socially ambiguous trade. As a young New Yorker, father had campaigned for Henry George and Seth Low in their mayoral races, and followed the radical free-thinker Robert Ingersoll. World War I made father, despite his profession as banker, a lifelong socialist. His deep-seated hostility
to America’s entry into the war—both as an anti-imperialist and an ethnic German-——gave his political orientation, though still progressive in substance,
a bitter, alienated quality by the time I came along in his forty-fifth year. I inherited a marginal's sensibility from him as a German. When my mother, who, unlike my father, was Jewish, encountered unpleasant social prejudice during my high-school years, I acquired a second marginal identity. Perhaps this sense of marginality enhanced history's fascination for me and shaped my attitude toward it, at once wary and engagé. For me, as for my parents, politics acquired particular importance, both as a major determining force in life and as an ethical responsibility.
II
In 1932 I entered Columbia College. From Seth Low Library the statue of Alma Mater looked upon a space that contained the principal tensions of the university's life: In the foreground was 116th Street, New York City's bisecting presence at the center of the campus. On the south side of the street stood the Sun Dial, a great sphere of granite, Columbia's Hyde Park Corner. Here were held the rallies for Norman Thomas, who swept the student presidential poll in 1932. Here I took the Oxford Oath, pledging never to support my govern-
ment in any war it might undertake. Here too I watched in ambivalent confusion as anti-war sentiment slowly turned into its own opposite, militant anti-fascism, after Hitler occupied the Rhineland and Mussolini invaded Ethiopia. Political radicalism then bore no relation to university rebellion; it only invigorated the university’s intellectual life. In Columbia’s strongly defined academic culture, Clio still presided over much of the curriculum. It is hard for us to remember in our day of disciplinary differentiation and autonomy how much all subjects were then permeated with a historical perspective. Having deposed philosophy and become queen
of the world of learning in the nineteenth century, Clio, though not as glamorous as she had been, still enjoyed pervasive influence. She dominated the only compulsory course for undergraduates, a two-year introduction, Contemporary Civilization in the West. It was designed in the spirit of the New History of the early twentieth century, that amalgam of pragmatism, democracy and social radicalism that James Harvey Robinson, Charles Beard, and
John Dewey had injected into Columbia’s university culture. The course presented us in the first year with three textbooks in modern European history: One economic, one social and political, and one intellectual. Our task was to generate out of these materials a synoptic vision of the European past, leading, in the sophomore year, to analysis of the American present. The structure of undergraduate major programs also reflected the primacy
of history as a mode of understanding in contrast to the intradisciplinary analytic and theoretical concerns that tend to govern the program in most fields of the human sciences today. The programs in literature, philosophy, even economics, were saturated with the historical perspective on human affairs.
I avoided a history major, which I felt would tie me down. Instead, I enrolled in Columbia’s two-year humanities Colloquium, which allowed one to construct one’s own program. Colloquium was centered in great books seminars conceived in a more classical spirit than usual in the university's prevailing pragmatist culture. The seminars were team-taught by truly outstanding young faculty members, such as Moses Hadas and Theodoric Westbrook, Lionel Trilling and Jacques Barzun. Watching their play of minds on the texts awoke in me for the first time a sense of the sheer intellectual delight of ideas.
The thought of an academic vocation, however, was slow in coming. Actually, I aspired to a career in singing, which I had studied since high school days. By my junior year, the sad truth grew upon me that my voice simply had not the quality to support a career in Lieder and the kind of Mozart
roles I dreamt of. In the same year, I enrolled in young Jacques Barzun’s course in nineteenth-century intellectual history. Barzun simply overwhelmed his few students with the range of the subject and the brilliance of his exploration of it. At work on his biography of Hector Berlioz, Barzun injected much musical material into his course. While I shared with my classmates the exciting experience that this course turned out to be, I drew one rather personal conclusion from it: intellectual history was a field in which my two principal extra-academic interests—music and politics—could be studied not in their
usual isolation, but in their relationship under the ordinance of time. I was
ready to pursue it.
Yet something held me back. I felt myself to be an intellectual, interested in
ideas; but could I be a scholar? Oddly enough, my Columbia experience offered no basis for an answer. As an undergraduate, I had only once been asked to prepare a research paper. Written exercises took the form of essays, oriented toward appreciation and interpretation of an issue or a text, with no particular attention to the state of scholarship or to the marshalling of empirical material to sustain a point of view. I found scholarly works often uninteresting; and when they truly impressed or captivated me, I found them daunting, far beyond my powers to emulate. The hue of resolution thus sicklied o’er by the pale cast of doubt, I sought advice. It was arranged for me to see Charles Beard, who was attending the American Historical Association’s 1935 convention in New York. Perched on the bed in his overheated room in the Hotel Pennsylvania, Beard poured forth his scorn for the pusillanimity and triviality of a historical scholarship that had lost all sense of its critical function in the civic realm. He gave me a formula for a fine scholarly career: “Choose a commodity, like tin, in some African colony. Write your first seminar paper on it. Write your thesis on it. Broaden it to another country or two and write a book on it. As you sink your mental life into it, your livelihood and an esteemed place in the halls of learning will be assured.” The second counselor to whom I turned, Lionel Trilling, then in the fourth
of his six years as an instructor in a still basically anti-Semitic Columbia University, almost exploded at me. What folly to embark, as a half-Jew, upon
an academic career in the midst of depression! Thus both of my gloomy advisors spoke out of personal experiences that confirmed the gap between the high calling of learning and some seamier realities of the academy. Nei-
ther, however, could touch my central doubt, which was about my own fitness for scholarly research. There seemed no solution to that but to put it to the test. When I entered Harvard Graduate School in the fall of 1936, it was in a receptive spirit, but hardly with a strong vocation.
III
To pass from Columbia to Harvard was to enter another world—socially, politically and intellectually. My undergraduate stereotypes of the two institu-
tions doubtless led me to exaggerate their differences. But stereotypes can have roots in realities. The very physical structure of Harvard seemed to express a conception of the relation between university and society different from that of Columbia. Harvard was in the city but not of it. Where Seth Low Library looked upon the city street, Widener Library faced the Yard, a greenspace walled off from the surrounding town. The Harvard houses, with their luxurious suites, dining halls with maid-servants, separate libraries and resident tutors, expressed a unity of wealth and learning in which each lent luster to the other. Whatever its social elitism, Harvard was, as Columbia was not, a citadel of learning seemingly impervious to political tensions. Harvard had no Sun Dial, no central space for student rallies. The students must have felt no need for one. If politics had a presence here, it did not meet the newcomer's eye. I was glad, given my self-doubts about a scholarly career, to take advantage of the opportunity that the University's calm environment offered for submersion in the work of learning. The form of instruction at Harvard differed even more strikingly from Columbia's than its architectural form. At Columbia, we thought of our instructors as teachers, guides in the exploration of texts to make us generate intellectual responses. At Harvard, the instructors were more like professors, learned authorities dispensing their organized knowledge in lectures. The prevailing nineteenth-century idea of history, with its strong architecture of development and narrative structure, reinforced the authoritative lecture mode. Thanks to the man who became my advisor and mentor, William L. Langer, I had no chance to follow the narrow road of Charles Beard's sardonic counsel about the strategy of the specialist. Langer urged me to take not just one seminar, but many, to gain experience in a variety of historical research techniques: economic, diplomatic, intellectual, and social. Seminar experience—especially with Langer—slowly dispelled my misgivings about a life of research, and gave me the much-needed intellectual discipline to pursue it. The greatest impact on my scholarly outlook and value system came not from the seminars in modern history, but from an intensive exploration of Greek history with William Scott Ferguson. Despite the fact that I was a modernist
without usable Greek, Ferguson took me on for an in-depth tutorial. Each week I went to his house for a two-hour discussion of the books he had assigned, ranging from the anthropology of pre-political tribes to Aristotle’s Athenian Constitution or the structure of Roman rule in Greece. For my general examination I prepared a special subject on Aristophanes under
CARL E. SCHORSKE 59 Ferguson’s guidance—an exercise that enabled me for the first time to ground a whole literary oeuvre in a field of social power. Ferguson’s critical tutelage really opened my eyes, as the field of classics has done for so many, to the
possibilities of integrated cultural analysis. It also remained with me as a model of pedagogic generosity.
The comparative quiet of Harvard’s political scene that I found on my arrival in 1936 soon changed. After 1938, when America began to face the menacing international situation in earnest, political concern became more general and intense within the university—and in me. Divisions on the issue
of intervention ran deep, and many of us, young and old, felt impelled to debate it publicly. When political passions run strong, the relation between one’s obligations to the republic of letters and to the civic republic can become dangerously conflated. Two personal experiences at Harvard brought
this problem home to me.
The first occurred in 1940 in History I, the freshman course in which I served as a graduate teaching assistant. Its professor, Roger B. Merriman, a colorful, salty personality of the old school, passionately devoted to aristocratic Britain, believed, along with a few other staff members, that instructors had a public responsibility to get in there and tell the little gentlemen what the war was all about, to make them realize the importance of America’s intervention. A few of us, across the often bitter barriers of political division, joined hands to resist the use of the classroom as an instrument of political indoc-
trination. My two partners in this effort were Barnaby C. Keeney, later the first director of the National Endowment for the Humanities, and Robert Lee Wolff, who became professor of Byzantine history at Harvard. Quite aside from the principle involved, the experience of History I taught me how shared academic values could sustain friendships that political differences might destroy. The second experience, of an intellectual nature, left a permanent mark on my consciousness as an historian. The graduate history club had organized a series of what were called, in jocular tribute to Communist terminology of the day, “cells,” in which the student members prepared papers on problems that were not being dealt with in regular seminars. My cell took up the problem of contemporary historiography. We inquired into historical work in different countries as it evolved under the impact of recent history. I examined German
historians under the Weimar Republic and the Third Reich, not merely in terms of the political pressures upon them, but also in terms of the way in which specific cultural traditions in historiography, in confrontation with a new present, led to new visions of the past. I was astounded to discover that
some of the most nationalist historians justified their doctrinaire nationalism by an explicit philosophic relativism. The value of this exercise in the sociology of knowledge was not only in understanding the work of historians of other nations. It also sensitized me and my fellow apprentices in history to the
fact that we too live in the stream of history, a condition that can both enhance and impede the understanding of the past. Above all, it made us aware, as our elders, in their positivistic faith in objectivity, were not, of distortions that can result from our positions in society.
IV
The Research and Analysis Branch of the Office of Strategic Services, which I joined a few months before Pearl Harbor, has been rightly known as a second graduate school. My own intellectual debt to my colleagues there—especially to the German émigrés and to a stellar group of economists, some Keynesian,
some Marxist—is not easy to calculate. The whole experience, however, taught me that, much as I enjoyed contemporary political research, I was not by temperament a policy-oriented scholar. When I was released from service in 1946—over thirty, the father of two children, without a Ph.D.—I found what proved to be an ideal teaching post at Wesleyan University. I was to stay for fourteen years. Of all my mature educational experiences, that of Wesleyan probably had the strongest impact on the substance of my intellectual life and my self-definition as an historian. Basic to both were the larger shifts in America's politics and academic culture in the late 1940s and 1950s. I would have encountered them in any university. But only a small college could have provided the openness of discourse that made it possible to confront the cultural transformation across the borders of
increasingly autonomous disciplines. At Wesleyan in particular, thanks to President Victor Butterfield’s selection of imaginative faculty members at the
war's end, an atmosphere of vital critical exploration prevailed. From my colleagues I received the multidisciplinary education for the kind of cultural history I soon felt drawn to pursue. In the first two years at Wesleyan, I had no sense of either the intellectual dilemmas about to appear or the new horizons that opened with them. Like most returning veterans, whether students or professors, I felt only a joyful sense of resuming academic life where I had left it five years before. The freshman Western Civilization course that I was asked to teach had just been introduced at Wesleyan by assistant professors fresh from Columbia. For me
it was a throwback to my own freshman year fourteen years earlier. Teaching four sections, I had more than enough opportunity to explore the riches of the
course. Once again I encountered there, in all its optimistic fullness, the premise that the progress of mind and the progress of state and society go hand in hand, however painful the tensions and interactions may sometimes be.
In framing an advanced course in European nineteenth-century history, I also returned to a pre-war pattern to explore the relationship between domestic national histories and international development. Even my European intellectual history course, though fairly original in its comparative national approach to the social history of ideas, bore the stamp of the American neo-Enlightenment in which I had been formed at home and at Columbia. Its central theme was the history of rationalism and its relation to political and social change. Viable enough for constructing an architecture of intellectual development before the mid-nineteenth century, the theme proved less and less useful as the twentieth century approached, when both rationalism and the historicist vision allied with it lost their binding power on the European cultural imagination. In the face of the fragmentation of modern thought and art, I fastened on Nietzsche as the principal intellectual herald of the modern condition. He stood at the threshold between the cultural cosmos in which I was reared and a post-Enlightenment mental world just then emergent in America—a world at once bewildering, almost threatening, in its conceptual multiplicity, yet enticing in its openness. After Nietzsche, whirl was king, and I felt rudderless. The conceptual crisis in my course set the broad question for my later research: the emergence of cultural modernism and its break from the historical consciousness.
While in my teaching I tested the dark waters of modern culture, my research was still cast in terms set by my political experience and values from the years of the New Deal and the War. I could not bear, after five years of engagement with National Socialism in the OSS, to resume my dissertation
on its intellectual origins, despite a substantial pre-war investment in the subject. Instead I turned to German Social Democracy as a thesis topic, and, concurrently, to a more general study of the problem of modern Germany. Behind both lay a pressing concern with the direction of world politics. The
two superpowers were in the process of creating through their occupation policies two Germanies in their own images: one socialist and antidemocratic, the other democratic and anti-socialist. Accordingly, the sawtoothed course of the divide between East and West in German politics ran
between the two working-class parties, Communist and Social Democratic. Before World War I, these two groupings had been part of a single party committed to both socialism and democracy. Why had that unity failed to hold together? What was the historical dynamic that made democracy and socialism incompatible in Germany? Contemporary questions surely stimulated my historical research, though they did not, I hope, determine its results. I realize now that I was writing not only analytic history, but a kind of elegy for a once creative movement that history had destroyed. Parallel to the historical work on German Social Democracy, I explored directly the contemporary problem of Germany and American policy toward it for the Council on Foreign Relations. There I had an experience of the life of learning quite different from that of either government or academia. The members of the Council's German Study Group, headed by Allen Dulles, were intelligent, influential members of America's business and political elite. Most of them viewed German policy not as an area in which, as in Austria or
Finland, some kind of accommodation was to be sought with the Soviet Union, but as a counter in the fundamental conflict between the two powers. I continued to believe in the goal of a unified but permanently neutralized
Germany. That policy, which had been espoused by the OSS group with which I had worked, still seemed to me the only way of redeeming in some measure the damage of the Yalta accord and of preventing the permanent division of Europe. Although the Council generously published my analysis of the German problem, it rejected my policy recommendations. It was my last fling at influencing U.S. policy from within the establishment. The swift transformation of the East-West wartime alliance into the systemically structured antagonism of the Cold War had profound consequences for American culture, not the least for academic culture. It was not simply that
the universities became a prey to outer forces that saw them as centers of Communist subversion. The breakup of the broad, rather fluid liberal-radical
continuum of the New Deal into hostile camps of center and left deeply affected the whole intellectual community. The political climax of that division was Henry Wallace’s presidential campaign in 1948, in which I myself was
active. The bitter feelings it left in its wake only served to conceal a more general change in climate by which most intellectuals were affected, namely the revolution of falling expectations in the decade after 1947. The coming of
the Cold War—and with it, McCarthyism—forced a shift in the optimistic social and philosophic outlook in which liberal and radical political positions alike had been embedded. Wesleyan was a wonderful prism through which these changes were re-
fracted. Several liberal activists of the social science faculty, including non-
religious ones, turned to the neo-Orthodox Protestantism of Reinhold Niebuhr to refound their politics in a tragic vision. Young scholars in American studies transferred their allegiance from Parrington and his democratic
culture of the open frontier to the tough moral realism of Perry Miller’s Puritans. For undergraduates, a new set of cultural authorities arose. Jacob Burckhardt, with his resigned patrician wisdom in approaching problems of power, and the paradoxical pessimism of Kierkegaard elicited more interest than John Stuart Mill’s ethical rationalism or Marx’s agonistic vision. Existentialism, a stoical form of liberalism, came into its own, with Camus attracting
some, Sartre others, according to their political persuasion. Nothing made a greater impression on me in the midst of this transvaluation of cultural values than the sudden blaze of interest in Sigmund Freud. Scholars of the most diverse persuasions to whom my own ties were close brought the tendency home. Two of my teachers turned to Freud: the conservative William Langer used him to deepen his politics of interest, while the liberal Lionel Trilling, now battling the Marxists, espoused Freud to temper his humanistic rationalism with the acknowledgment of the power of instinct. Nor can I forget the day in 1952 when two of my radical friends, the Wesleyan classicist Norman O. Brown and the philosopher Herbert Marcuse, suddenly
encountered each other on the road from Marx to Freud, from political to cultural radicalism. Truly the premises for understanding man and society seemed to be shifting from the social-historical to the psychological scene. All these tendencies pointed American intellectuals in a direction that Europeans, with the exception of the Marxists, had gone half a century before: a loss of faith in history as progress. At a less credal level, but one actually more important for the world of learning, history lost its attractiveness as a source of meaning. Formalism and abstraction, refined internal analysis, and a new primacy of the theoretical spread rapidly from one discipline to another as all turned away from the historical mode of understanding of their subjects. For intellectual history, this tendency had two consequences, one relating to its educational function, the other to its scholarly method.
Students now came to intellectual history expecting consideration of thinkers no longer studied in the disciplines to which they belonged. Thus in philosophy, the rising Anglo-American analytic school defined questions in such a way that many previously significant philosophers lost their relevance and stature. The historian became a residuary legatee at the deathbed of the history of philosophy, inheriting responsibility for preserving the thought of such figures as Schopenhauer or Fichte from oblivion. In economic thought,
a similar function passed to intellectual history as the economists abandoned their historical heritage of general social theory and even questions of social policy to pursue an exciting new affair with mathematics. An opportunity for intellectual historians, you say? Yes and no. We were simply not equipped to assume such responsibilities. At best we had paid little attention to the internal structure of the thought with which we dealt. We had a way of skimming the ideological cream off the intellectual milk, reducing complex works of art and intellect to mere illustrations of historical tendencies or movements. The new ways of analyzing cultural products developed by the several disciplines revealed such impressionistic procedures as woefully inadequate. The historian thus faced two challenges at once: to show the continued
importance of history for understanding the branches of culture whose scholars were rejecting it; and to do this at a moment when the historian’s own methods of analysis were being revealed as obsolete and shallow by the very ahistorical analytic methods against which he wished to defend his vision.
For me, the issue first came to focus in dealing with literature. When I charged my Wesleyan friends in the New Criticism with depriving literary works of the historical context that conditioned their very existence, they accused me of destroying the nature of the text by my excess of relativization. One irritated colleague hurled at me the injunction of e. e. cummings: “let the poem be.” But he taught me how to read literature anew, how the analysis of form could reveal meanings to the historian inaccessible if he stayed only on
the level of ideas, of discursive content. Other colleagues in architecture, painting, theology, and so on, similarly taught me the rudiments of formal analysis so that I could utilize their specialized techniques to pursue historical analysis with greater conceptual rigor. By the 1950s, the problems I have thus far described—the blockage in my course after Nietzsche, the changes in politics with the external and internal Cold War, the dehistoricization of academic culture, and the need for higher precision in intellectual history—all converged to define my scholarly agenda. I resolved to explore the historical genesis of the modern cultural consciousness, with its deliberate rejection of history. Only in a circumscribed historical context, so it seemed to me, could a common social experience be assessed for its impact on cultural creativity. Hence, a city seemed the most promising unit of study. Like Goldilocks in the house of the three bears, I tried out several— Paris, Berlin, London, Vienna—in seminars with Wesleyan students. I chose Vienna as the one that was “just right.” It was indisputably a generative center
in many important branches of twentieth-century culture, with a close and
well-defined intellectual elite that was yet open to the larger currents of Euro-
pean thought. Thanks to my Wesleyan colleagues, I had acquired enough intellectual foundation to embark upon a multidisciplinary study.
V
In 1959, when I was on leave at the Center for Advanced Study in the Behav-
ioral Sciences at Stanford, a Berkeley colleague asked me to take over his course in intellectual history for two weeks. The class, although over 300 strong, had a spirit of collective engagement and responsiveness that I simply had not encountered before. I was seized by the feeling that Berkeley, with its bracing intellectual atmosphere, was the place I had to be. Ironically enough, I had turned down an offer there only four years before without even visiting the Berkeley campus. Throwing shame and protocol to the winds, I called a friend in the history department to ask if the job were still open. Fortunately it was.
To pass from Wesleyan to Berkeley in 1960 was surely to move from academic Gemeinschaft to academic Gesellschaft. Wesleyan, with its intimate and open interdisciplinary discourse, had helped me to redefine my purposes as a scholar. Berkeley influenced the direction of my historical work much less. But it forced me to think through issues that I had not considered since
Harvard: the relation of the university to contemporary society, and my vocation as a teacher. The crisis of the 1960s presented them in depth and urgency. As a public university, Berkeley was, of course, especially vulnerable to the pressures of both state and society. When I arrived there in 1960, the shadow of the oath crisis of the 1950s and the McCarthy years still lay heavily upon the
faculty. Moreover, 100-year-old regulations barring political and religious speakers and campus political organization were still in force. Devised to protect the university’s immunity from outside pressures of state and church, these rules had become under current conditions nettlesome restrictions of academic freedom. Until 1964, however, it was not students but faculty members who took the lead in pressing the issue of free speech. My department, for example, unanimously agreed to make a test case of the restriction rules by
inviting Herbert Aptheker, a self-proclaimed Communist historian with a Ph.D. and solid publications, to address its graduate colloquium. When the administration, as it had to do, refused permission for the speaker and denied the department the funds to pay him, we took the colloquium off campus and
held it in a church hall to dramatize our point: that a responsible educational
function had, in the University of California, to be conducted as an unauthorized off-campus activity. In another action, when a well-funded right-wing group conducted a statewide campaign of “education in communism” in the towns of California, the history department offered a public lecture series on comparative communism to counteract propagandists masking as scholars. Our historians, of widely
different political persuasions and with varied regional expertise, demonstrated to a large public by their example how the university could serve society by intellectualizing in analysis and rational discussion its most burning public problems.
With the civil rights movement and the Vietnam War, American politics took a new turn, with profound consequences for the university. The pressure on it came not only from the right and the establishment, as in the 1950s, but from the left and those with social grievances as well. This led at Berkeley to a
shift in university attention from academic freedom and autonomy—a primary concern of the faculty—to political rights and the freedom of university members to pursue on campus their causes as citizens—a primary concern of students. In a liberal society, academic freedom and civic freedom are interdependent, but they are not the same. The first relates to the universal republic
of letters, the second to the limited body politic. The recognition each must pay the other produces a delicate balance, easily upset when contestants locked in political struggle begin to see the university as a weapon or an obstacle. This is what happened at Berkeley. Political rights having been too long denied in the name of academic immunity, academic autonomy began to be put at risk in the name of political rights. I became deeply involved as a minor actor in the ensuing crisis, serving first
on the Emergency Executive Committee of the Academic Senate, then as Chancellor’s Officer for Educational Development. Let me say only that I went through the same rhythm of anguish, illusion, hope and disabusement that is so often the lot of participants in intense social crises. I realize now, on reflecting back, that once again my outlook and actions were marked by a kind of basic archetypical mental disposition to synthesize or unify forces whose dynamics resist integration. An ironic thrust seems to have characterized my intellectual work: In my book on Social Democracy, I had tried to comprehend socialism and democracy in a single perspective. In my intellectual his-
tory of Vienna, I had sought to integrate politics and culture in substance, historical and formal analysis in method. Now, in the crisis of university and
society, I tried to reconcile academic autonomy and anti-war activism; in educational policy, faculty authority and educational renewal. Those who experienced the university crisis will know how searing the sense of dissolution can be, even if tempered now and again by a sense of future promise. I certainly had hopes that a stronger university community would issue from the crisis, and drew strength from the fine group of collaborating colleagues who shared my convictions about both free speech and educational reform. But in the conflict-laden environment, two other, less homogeneous entities made the situation bearable: my department and my classes.
The history department was deeply divided over the issues of university policy; more, it contributed articulate spokesmen to almost every shade of opinion in the Academic Senate. Yet when the department met on academic business, its divisions on personnel or curricular problems did not follow those in Senate meetings on university issues. I could expect to find in a
colleague who had opposed me on the Senate floor a staunch ally on a department matter. Professional ethos and collegiality remained intact. How different it was in other departments, such as politics and sociology, where methodological divisions tended to coincide with and reinforce political faction! My classes, buoyant and intellectually engaged through all the troubles, also were a continuous source of stability. However, the pressures of the crisis caused me to rethink my teaching. Once, after a final lecture in intellectual history, I had an experience that gave me food for thought. My students gave me the customary round of year-end applause. After all the difficulties of that year, I floated out of the lecture
room on cloud nine. Then, as I walked down the corridor, I heard a girl behind me say to her companion, in a voice heavy with disgust: “And they call
that a dialogue!” The remark jerked me back to earth. Beneath it lay two problems: first, student hunger for closer relations with the instructor, always present to some degree, but intensified by the unrest into a widespread rejection of the lecture system as “impersonal.” Second, the passage of the student revolt from politics to culture. The gap that had opened between generations in both moral and intellectual culture was real—and in fact, wider than that in politics. How to bridge that gap, and make it possible for the professor of one generation to deal with new questions arising in another: that was the prob-
lem my jaundiced critic raised for me. It crystallized my interest in new educational forms suited to the mass university.
To bring my ideas of the intellectual tradition into a new relation to
students’ questions, I restructured my course on polycentric lines. While I continued to present my interpretation of intellectual history in the lectures, I displaced the locus of instruction into a series of satellite seminars. These were organized on topics defined not by me, but by graduate teaching assistants. I
asked them to deal with the same thinkers as I presented in my lectures, but left each free to choose texts of those thinkers more suited to the particular theme each had selected. They came up with themes I could not have thought of at the time, such as “The Costs of Freedom,” or “The Idea of the Feminine in European Thinking.” The graduate T.A. thus became a mediator between my professional discipline and standards in which he had a vocational stake, and the concerns of the new generation of which he was a part. All gained by the enlargement of the T.A.’s authority. The satellite seminar not only helped satisfy the felt need for dialogue, which in fact any section system might provide; it also set up a healthy dialectic between the interpretive scheme of my lectures and the ideas and existential concerns of the students reflected in each seminar’s special theme. As I followed the intellectual yield of the seminars, I was made aware of the deep truth of Nietzsche’s observation that a new need in the present opens a new organ of understanding for the past. Many ideas that have become more widespread, such as Foucault’s, first arose for me there. The satellite seminar system was adopted by a few others both in Berkeley and Princeton, and was effective for its time. In the mid-1970s, however, when deference to the canonical in matters intellectual and social quiescence returned, it lost its appeal for graduate assistants. Well suited to its time, its time soon passed. In educa-
tion as in scholarship, one must live in the provisional, always ready to acknowledge obsolescence and to adapt the forms of instruction to changes in both culture and society.
VI
I went to Princeton in order to save, if possible, my scholarly work. It was not the fault of the University of California, which I dearly loved, that I invested so much psychic energy in institutional life and in my teaching. But, given a tendency to neglect research for the other claims on the academic man, I could not resist the temptation of an appointment at Princeton University coupled with a half-time fellowship for three years at the Institute for Advanced Study. At Wesleyan in the 1950s, in response to the impact of the rightward shift of
post-war politics and the de-historicization of academic culture, I had re-
defined my mission and method as an interdisciplinary intellectual historian. At Berkeley in the 1960s, a university under the double pressure of America’s conservative establishment and a recrudescent youthful left, I grappled in
thought and action with finding the right relation between university and society. Part of a strong group of intellectual historians at Berkeley within a department of great diversity, I felt I was doing the work of my guild when I tried to adapt my subject to the intellectual and existential needs of a new generation of students. At Princeton in the 1970s, the center of my vocation shifted somewhat, from inside the history department to the humanities as a whole. Here again, a change in academic culture led me to redefine my function. Fundamental to it was the polarization of the social sciences and the humanities from each other. That process, which had begun in earnest in the 1950s, now reached a new intensity. The concern with aggregate, depersonalized social behavior on the one side, and the concern with linguistic and structuralist textual analysis independent of any social context on the other did not simply diminish the
relevance of history to both groups. Their mutually exclusive conceptual systems also penetrated the discipline of history itself. Social historians, seeking the “otherness” of past cultures or of classes neglected in previous historiography, became more interested in the static cross-section of culture in the
manner of anthropologists than in the dynamics of continuous transformation. At the other end of the spectrum, among intellectual historians, Hayden White lifted intellectual history clear of its social matrix by analyzing historiography as a literary construct. Synchronic recovery of a static slice of the past at one end of the spectrum, humanistic theory of forms at the other: these recapitulated within history itself in the 1970s the loss of interest in process and transformation that had marked the new academic culture outside history in the 1950s. In my Princeton history department, the dominant orientation was toward the social sciences. I am no theorist and no methodologist. My way of addressing the problem
of polarization in the sciences humaines and in history itself was through teaching—but this time not alone, and not purely within history. A small group of Princeton faculty from different departments joined me in devising
an undergraduate interdisciplinary program called European Cultural Studies. Its regnant idea was to bring to bear on the same objects of study the separate lights of social scientists, historians included, and humanists—the groups that elsewhere were pulling so far apart. All courses in the program
were taught in two-person teams—hopefully one social scientist and one humanist. Few social scientists other than social historians could be induced
to join the program. But the seminars did establish a field of discourse relating the social and ideational worlds to each other, despite the autonomism of our academic culture. In a more personal sense, teaching over some years with scholars in philosophy, architecture, Russian, German, and French literature made of my last teaching decade a quite new learning experience. From one of
the seminars, on Basel in the nineteenth century, issued a research project with my teaching partner, a study echoing the concern of my Berkeley years: the relation between university culture and social power. During much of my scholarly life, I worked to bring the arts into history as
essential constituents of its processes. In the last years, I have reversed the effort, trying to project historical understanding into the world of the arts, through work with museums, architecture schools, and critical writing for the larger public. The venue may change; the forms of one’s engagement alter as one grows older and the world changes. Preparing this account, however, has made me realize all too clearly that I have not moved very far from the issues that arose in my formative years, when the value claims of intellectual culture and the structure of social power first appeared in a complex interaction that has never ceased to engage me.
1988
JOHN HOPE FRANKLIN
James B. Duke Professor Emeritus of History
Duke University
As I began the task of putting the pieces together that would describe how I moved from one stage of intellectual development to another, I was reminded of a remark that Eubie Blake made as he approached his ninety-ninth birthday. He said, “If I had known that I would live this long I would have taken
better care of myself.” To paraphrase him, if I had known that I would become an historian and the Haskins Lecturer for 1988 I would have kept better records of my own pilgrimage through life. I may be forgiven, therefore, if I report that the beginnings are a bit hazy, not only to me but to my parents as well. For example, they had no clear idea of when I learned to read
and write. It was when I was about three or four, I am told. My mother, an elementary school teacher, introduced me to the world of learning when I was three years old. Since there were no day-care centers in the village where we lived, she had no alternative to taking me to school and seating me in the rear where she could keep an eye on me. I remained quiet but presumably I also remained attentive, for when I was about five my mother noticed that on the sheet of paper she gave me each morning, I was no longer making lines and sketching out some notable examples of abstract art. I
was writing words, to be sure almost as abstract as my art, and making sentences. My mother later said that she was not surprised, much less aston-
ished at what some, not she, would have called my precocity. Her only reproach—to herself, not me—was that my penmanship was hopelessly flawed since she had not monitored my progress as she had done for her
enrolled students. From that point on, I would endeavor to write and through the written word to communicate my thoughts to others.
My interest in having some thoughts of my own to express was stimulated
by my father who, among other tasks, practiced law by day and read and wrote by night. In the absence of any possible distractions in the tiny village, he would read or write something each evening. This was my earliest memory of him and, indeed, it was my last memory of him. Even after we moved to Tulsa, a real city, and after we entered the world of motion pictures, radio, and television, his study and writing habits remained unaffected. I grew up believing that in the evenings one either read or wrote. It was always easy to
read something worthwhile, and if one worked at it hard enough he might even write something worthwhile. I continue to believe that.
I
Two factors always plagued my world of learning for all of my developing years. One was race, the other was financial distress, and each had a profound influence on every stage of my development. I was born in the all-Negro town of Rentiesville to which my parents went after my father had been expelled from court by a white judge who told him no black person could ever represent anyone in his court. My father resolved that he would resign from the world dominated by white people and try to make it among his own people. But Rentiesville’s population of fewer than 200 people could not provide a
poverty-free living even for one who was a lawyer, justice of the peace, postmaster, farmer, and president of the Rentiesville Trading Company, which, incidentally, was not even a member of the New York Stock Exchange. The quality of life in Rentiesville was as low as one can imagine. There was
no electricity, running water, or inside plumbing. There was no entertainment or diversion of any kind—no parks, playgrounds, libraries, or newspapers. We subscribed to the Muskogee Daily Phoenix, which was delivered by the Missouri, Kansas, and Texas Railroad as it made its way southward through the state each morning. The days and nights were lonely and monot-
onous, and for a young lad with boundless energy there was nothing to do but read. My older sister and brother were away in private school in Tennessee, and one did not even have the pleasure of the company of older siblings. Now and then one went to Checotah, six miles away, to shop. That was not always pleasant, such as the time my mother, sister, and I were ejected from the train because my mother refused to move from the coach designated for whites. It was the only coach we could reach before the train moved again, so my mother argued that she would not move because she was not to blame if
the train’s white coach was the only one available when the train came to a halt. Her argument was unsuccessful, and we had to trudge back to Rentiesville through the woods. There were the rare occasions when we journeyed to Eufaula, the county seat, where I won the spelling bee for three consecutive years. There was Muskogee to the North, where I went at the age of five for my first pair of eyeglasses, the malady brought on, I was told, by reading by the dim light of a kerosene lamp. It was a combination of these personal and family experiences that forced my parents to the conclusion that Rentiesville was not a viable community. They resolved to move to Tulsa. First, my father would go, find a place, set himself up in the practice of law, and we would follow six months later, in June 1921, when my mother’s school closed for the summer recess. That June, however, we received word that in Tulsa there was a race riot,
whatever that was, and that the Negro section of that highly segregated community was in flames. At the age of six I sensed from my mother’s reaction that my father was in danger. We were all relieved several days later, therefore, when a message arrived that he had suffered no bodily harm, but
that the property he had contracted to purchase was destroyed by fire. He practiced law in a tent for several months, and our move to Tulsa was delayed by four years. In the month before I reached my eleventh birthday, we arrived in Tulsa. It was quite a new world, and although a city of less than moderate size at the time, it was to my inexperienced eyes perhaps the largest city in the country. I did not see much of it, however, for racial segregation was virtually complete.
I thought that Booker T. Washington, the school where I enrolled in grade seven, was the biggest and best school until one day I saw Central High for whites. It was a massive, imposing structure covering a city block. I was later
to learn that it had every conceivable facility such as a pipe organ and a theater-size stage, which we did not have. I also learned that it offered modern
foreign languages and calculus, while our school offered automobile mechanics, home economics, typing, and shorthand. Our principal and our teachers constantly assured us that we need not apologize for our training, and they worked diligently to give us much of what was not even in the curriculum. Now that the family was together again I had the example and the encouragement of both of my parents. My mother no longer taught but she saw to it that my sister and I completed all of our home assignments, promptly. Quite
often, moreover, she introduced us to some of the great writers, especially Negro authors, such as Paul Laurence Dunbar and James Weldon Johnson,
who were not a part of our studies at school. She also told us about some of the world’s great music such as Handel’s oratorio, Esther, in which she had sung in college. While the music at school was interesting and lively, especially after I achieved the position of first trumpet in the band and orchestra, there was no Handel or Mozart or Beethoven. We had a full fare of Victor Herbert and John Philip Sousa, and operettas, in more than one of which I sang the leading role. Often after school I would go to my father’s office. By the time I was in high school, the depression had yielded few clients but ample time which he spent with me. It was he who introduced me to ancient Greece and Rome, and he delighted in quoting Plato, Socrates, and Pericles. We would then walk home together, and after dinner he went to his books and I went to mine. Under the circumstances, there could hardly have been a better way of life, since I had every intention after completing law school of some day becoming his partner. It was in secondary school that I had a new and wonderful experience which my parents did not share. It was the series of concerts and recitals at Convention Hall, perhaps even larger than the theater at Central High School which I never saw. As in the other few instances where whites and blacks were
under the same roof, segregation was strict, but I very much wanted to go with some of my teachers who always held season tickets. My parents would never voluntarily accept segregation; consequently, the concerts were something they could forgo. Even at court my father refused to accept segregation. Whenever I accompanied him, which was as often as I could, he would send me to the jury box when it was empty, or when there was a jury trial, have me
sit at the bench with him. They took the position, however, that if I could bear the humiliation of segregation, I could go to the concerts. Thus, with the money I earned as a paper boy, I could purchase my own tickets. To be more accurate, I was not the paper boy but the assistant to a white man who had the paper route in the black neighborhood. It was at one of these concerts that I heard Paul Whiteman present Gershwin’s Rhapsody in Blue while on a nationwide tour in 1927. I also attended the annual perfor-
mances of the Chicago Civic Opera Company, which brought such stellar singers as Rosa Raisa, Tito Schipa, and Richard Bonelli to Tulsa. I am not altogether proud of going to Convention Hall, and there are times, even now, while enjoying a symphony or an opera, when I reproach myself for having yielded to the indignity of racial segregation. I can only say that in the long run it was my parents who knew best, though later I made a conscious effort to regain my self-respect.
II
There were many sobering experiences at Fisk University, which I entered on
a tuition scholarship in 1931. The first was my encounter with at least two dozen valedictorians and salutatorians from some of the best high schools in the United States. The fact that I had finished first in my high school class did
not seem nearly as important in Nashville as it had in Tulsa. Imagine my chagrin when a whiz kid from Dayton made all A’s in the first quarter while I made two B’s and a C+. My rather poor grades were somewhat mitigated by my having to hold three jobs in order to pay my living expenses. I was also absolutely certain that the C+ resulted from whimsical grading by the teaching assistants in a course called Contemporary Civilization. As I think of it
now I still become infuriated, and if there was anyone to listen to my case today, I would insist that my examinations be reevaluated and my grade raised
accordingly! I was consoled by my salutatorian girlfriend, now my wife of forty-seven years, who over the years has lent a sympathetic ear to my rantings
about the injustices in that course. She could afford to be charitable. She received a grade of B+. Another sobering experience was my first racial encounter in Nashville. In a downtown streetcar ticket window, I gave the man the only money I possessed, which was a $20 bill. I apologized and explained that it was all I had and he could give me any kind of bills he wished. In an outburst of abusive language in which he used vile racial epithets, he told me that no nigger could
tell him how to make change. After a few more similar statements he proceeded to give me $19.75 in dimes and quarters. From that day until I graduated, I very seldom went to Nashville and when I did I never went alone. It was about as much as a sixteen-year-old could stand. I thought of that encounter some three years later and felt almost as helpless when a gang of white
hoodlums took a young black man from a Fisk-owned house on the edge of the campus and lynched him. As president of student government I made
loud noises and protests to the mayor, the governor, and even President Franklin D. Roosevelt, but nothing could relieve our pain and anguish or bring Cordie Cheek back. Incidentally, the heinous crime he committed was that while riding his bicycle he struck a white child who was only slightly injured. Still another sobering, even shattering, experience was my discovery at the end of my freshman year that my parents had lost our home and had moved into a four-family apartment building which they had built. I knew that the country was experiencing an economic depression of gigantic proportions,
that unemployment had reached staggering figures, and that my father’s law practice had declined significantly. I was not prepared for the personal embarrassment that the depression created for me and my family, and frankly I never fully recovered from it. The liquidation of all debts became an obsession with me, and because of that experience my determination to live on a pay-as-you-go basis is as great today as it was when it was not at all possible to live that way.
Despite these experiences, my years in college were pleasant if hectic, rewarding if tedious, happy if austere. Most classes were rigorous, and every-
one was proud of the fact that the institution enjoyed an A rating by the Southern Association of Colleges and Secondary Schools. The faculty was, on
the whole, first-rate, and they took pride in their scholarly output as well as their teaching. While the student body was all black, with the exception of the occasional white exchange student or special student, the faculty was fairly evenly divided between white and black. It was an indication of the lack of interest in the subject that we never thought in terms of what proportion of the faculty was white and what proportion was black. Because I was merely passing through college en route to law school, I had little interest in an undergraduate concentration. I thought of English, but the
chairman, from whom I took freshman English, discouraged me on the ground that I would never be able to command the English language. (Incidentally, he was a distinguished authority in American literature and specialized in the traditions of the Gullah-speaking people of the Sea Islands. I was vindicated some years later when he chaired the committee that awarded me the Bancroft Prize for the best article in the Journal of Negro History.) My decision to major in history was almost accidental. The chairman of that department, Theodore S. Currier, who was white, had come into that ill-fated course in Contemporary Civilization and had delivered the most exciting lectures I had ever heard. I decided to see and hear more of him. During my sophomore year I took two courses with Professor Currier, and my deep interest in historical problems and the historical process and what he had to say was apparently noted by him. Soon we developed a close personal relationship that developed into a deep friendship. Soon, moreover, I made the fateful decision to give up my plan to study and practice law and to replace it with a plan to study, write, and teach history. My desire to learn more about the field resulted in his offering new courses, including seminars, largely for
my benefit. He already entertained the hope that I would go to Harvard, where he had done his own graduate work. I had similar hopes, but in the mid-1930s with the depression wreaking its havoc, it was unrealistic to enter-
tain such hopes. With a respectable grade point average (that C+ prevented my graduating summa cum laude), and strong supporting letters from my professors, I applied for admission to the Harvard Graduate School of Arts and Sciences.
Harvard required that I take an aptitude test that must have been the forerunner to the Graduate Records Examination. It was administered at Vanderbilt University, just across town but on whose grounds I had never been. When I arrived at the appointed place and took my seat, the person in charge, presumably a professor, threw the examination at me, a gesture hardly calculated to give me a feeling of welcome or confidence. I took the examination but cannot imagine that my score was high. As I left the room a Negro
custodian walked up to me and told me that in his many years of working there I was the only black person he had ever seen sitting in a room with white people. The record that Fisk made that year was more important. The Association of American Universities placed Fisk University on its approved list. On the basis of this new recognition of my alma mater, Harvard admitted me unconditionally. Apparently this was the first time it had given a student from
an historically black institution an opportunity to pursue graduate studies without doing some undergraduate work at Harvard. The university declined, however, to risk a scholarship on me. Admission to Harvard was one thing; getting there was quite another. My parents were unable to give me more than a very small amount of money and wish me well. I was able to make it back to Nashville, where Ted Currier told
me that money alone would not keep me out of Harvard. He went to a Nashville bank, borrowed $500, and sent me on my way. Shortly after my arrival in Cambridge in September 1935, I felt secure academically, financially, and socially. At Fisk I had even taken two modern foreign languages in order to meet Harvard’s requirement, and in Currier’s seminars I had learned how to write a research paper. Since I was secretary to the librarian at Fisk for four years, I had learned how to make the best use of reference materials, bibliographical aids, and manuscripts. Even when I met my advisor, Professor A. M. Schlesinger, Sr., I did not feel intimidated, and I was very much at ease with him while discussing my schedule and my plans. After I got a job washing dishes for my evening meal and another typing dissertations and lectures, a feeling of long-range solvency settled over me. Although I had a room with a Negro family that had taken in black students since the time of Charles Houston and Robert Weaver, I had extensive contact with white students who never showed the slightest condescension toward me. I set my own priorities, however, realizing that I had the burden of
academic deficiencies dating back to secondary school. I had to prove to myself and to my professors that the Association of American Universities was justified in placing Fisk University on its approved list. I received the M.A. degree in nine months and won fellowships with which I completed the Ph.D. requirements. There were few blacks at Harvard in those days. One was completing his work in French history as I entered. As in Noah’s Ark, there were two in the law school, two in zoology, and two in the College. There was one in English,
one in comparative literature, none in the Medical School, and none in the Business School.
The most traumatic social experience I had there was not racist but anti-Semitic. I was quite active in the Henry Adams Club, made up of graduate students in United States history. I was appointed to serve on the committee to nominate officers for the coming year which, if one wanted to be hypersensitive, was a way of making certain that I would not be an officer. When I suggested the most active, brightest graduate student for president, the objection to him was that although he did not have some of the more reprehensible Jewish traits, he was still a Jew. I had never heard any person speak of another
in such terms, and I lost respect not only for the person who made the statement but for the entire group that even tolerated such views. Most of the members of the club never received their degrees. The Jewish member became one of the most distinguished persons to get a degree in United States history from Harvard in the last half-century.
The course of study was satisfactory but far from extraordinary. Mark Hopkins was seldom on the other end of the log, and one had to fend for himself as best he could. I had no difficulty with such a regimen, although I felt that some of my fellow students needed more guidance than the university provided. In my presence, at the beginning of my second year, one of the department’s outstanding professors verbally abused a student, visiting from another institution, and dismissed him from his office because the student’s question was awkwardly phrased the first time around. Another professor confessed to me that a doctoral committee had failed a candidate because he did not look like a Harvard Ph.D. When the committee told him that he would have to study four more years before applying for reconsideration, the student was in the library the following morning to begin his four-year sentence. At
that point, the chairman of the committee was compelled to inform the student that under no circumstances would he be permitted to continue his graduate studies there.
III
When I left Harvard in the spring of 1939 I knew that I did not wish to be in Cambridge another day. I had no desire to offend my advisor or the other members of my doctoral committee. I therefore respectfully declined suggestions that I seek further financial aid. It was time, I thought, to seek a teaching position and complete my dissertation in absentia. I had taught one year at
Fisk following my first year at Harvard. With five preparations in widely disparate fields and with more than two hundred students, I learned more history than I had learned at Fisk and Harvard. I early discovered that teaching had its own very satisfying rewards. For some fifty-two years, there have
been many reasons to confirm the conclusions I reached at Fisk, St. Augustine’s, North Carolina College at Durham, Howard, Brooklyn, Chicago, Duke, and short stints in many institutions here and abroad. After I committed myself to the study, teaching, and writing of history, I was so preoccupied with my craft that I gave no attention to possible career alternatives. Less than ten years into my career, however, when I was working on my second book, the president of a small but quite respectable historically black liberal arts college invited me to become dean of his institution. It was at that point that I made a response that was doubtless already in my mind but which I had not ever articulated. I thanked him and respectfully declined the invitation on the grounds that my work in the field of history precluded my moving into college administration. When the president received my letter, he sent me a telegram informing me that he was arriving the following day to
explain his offer. During the three hours of conversation with him I had ample opportunity to state and restate my determination to remain a teacher and writer of history. Each time I did so I became more unequivocal in my resistance to any change in my career objectives. I believe that he finally became convinced that he was indeed wrong in offering me the deanship in the first place. From that day onward, I had no difficulty in saying to anyone who raised the matter that I was not interested in deanships, university presidencies, or ambassadorships. And I never regretted the decision to remain a student and teacher of history. There is nothing more stimulating or satisfying than teaching bright, inquisitive undergraduates. It was puzzling if dismaying when a student complained, as one did at Howard, that my lengthy assignments did not take into account the fact that his people were only eighty-five years removed from slavery. It was sobering but challenging when an undergraduate asked, as one
did at Brooklyn, if I would suggest additional readings since he had already read everything in the syllabus that I distributed on the first day of class. It was reassuring to find that some students, such as those at Chicago, came to class on a legal holiday because I neglected to take note of the holiday in my class
assignments. It was refreshing, even amusing, when students requested, as
some did at Duke, that the date for the working dinner at my home be changed because it conflicted with a Duke—Virginia basketball game. As Harry Golden would say, only in America could one find undergraduates with so much chutzpah. There came a time in my own teaching career when I realized that with all my frantic efforts at research and writing I would never be able to write on all the subjects in which I was deeply interested. If I only had graduate students who would take up some of the problems regarding slavery, free blacks, the
Reconstruction era and its overthrow, it would extend my own sense of accomplishment immeasurably. That was a major consideration in my move in 1964 from Brooklyn College to the University of Chicago, where for the next eighteen years I supervised some thirty dissertations of students who subsequently have published more than a dozen books. In view of Chicago’s free-wheeling attitude toward the time for fulfilling degree requirements,
there is a possibility that eight years after retirement, I might have more doctoral students who complete their work and write more books. Meanwhile, I continue to revel in the excitement of teaching in still another type of institution, the Law School at Duke University.
IV
I could not have avoided being a social activist even if I had wanted to. I had
been barred from entering the University of Oklahoma to pursue graduate studies, and when the National Association for the Advancement of Colored People asked me to be the expert witness for Lyman Johnson, who sought admission to the graduate program in history at the University of Kentucky, I was honored to do so. After all, it was easy to establish the fact that Johnson
could not get the same training at the inferior Kentucky State College for Negroes that he could get at the University of Kentucky. Johnson was admitted forthwith. To me it was one more blow against segregation in Oklahoma as well as Kentucky. The defense argument collapsed when the University of
Kentucky placed one of its history professors on the stand and asked him
about teaching Negroes. He replied soberly that he did not teach Negroes, he taught history, which he was pleased to do! Then, Thurgood Marshall asked me to serve on his non-legal research staff when the NAACP Legal Defense Fund sought to eliminate segregation in the public schools. Each week in the late summer and fall of 1953 I journeyed
from Washington to New York, where I worked from Thursday afternoon to Sunday afternoon. I wrote historical essays, coordinated the work of some other researchers, and participated in the seminars that the lawyers held regularly, and provided the historical setting for the questions with which they were wrestling. I had little time for relaxing at my home away from home, the
Algonquin Hotel, but each time I entered this establishment, I made eye contact with an imaginary Tallulah Bankhead, Agnes DeMille, or Noel Coward, who were among the more famous habitués of its lobby. The historian, of all people, must not make more of his own role in events, however significant, even if it is tempting to do so. It would be easy to claim that I was one of the 250,000 at the March on Washington in 1963. I was not there, and perhaps the truth is even more appealing. Because I was serving as Pitt Professor at the University of Cambridge that year, I was something of a resource person for the BBC-TV. On Richard Dimbleby’s popular television program, Panorama, I tried to explain to the British viewers what transpired when James Meredith sought to enter the University of Mississippi. I suspect there was a bit of advocacy even in the tone of my voice. In the summer of 1963 I took British viewers through what the BBC called “A Guide to the March on Washington.” Here again, with film clips on Malcolm X, James Baldwin, A. Philip Randolph, and others, I explained why the March was a very positive development in the history of American race relations. Finally, in 1965, I was actually on the Selma March. No, I did not march with Martin, as some imaginative writers have claimed. I doubt that Martin ever knew that I was
there, far back in the ranks as I was. I was not at Pettus Bridge in Dallas County, but joined the March at the City of St. Jude on the outskirts of Montgomery. I took pride in marching with more than thirty historians who came from all parts of the country to register their objection to racial bigotry
in the United States. And I want to make it clear that I was afraid, yes, frightened out of my wits by the hate-filled eyes that stared at us from the sidewalks, windows, businesses, and the like. It was much more than I had bargained for. One must be prepared for any eventuality when he makes any effort to promote legislation or to shape the direction of public policy or to affect the
choice of those in the public service. This came to me quite forcefully in 1987 when I joined with others from many areas of activity in opposing the Senate confirmation of Robert H. Bork as Associate Justice of the Supreme Court of
the United States. In what I thought was a sober and reasoned statement, I told the Judiciary Committee of the United States Senate that there was “no indication—in his writings, his teaching, or his rulings—that this nominee has any deeply held commitment to the eradication of the problem of race or even of its mitigation.” It came as a shock, therefore, to hear the president of the United States declare that the opponents of the confirmation of Judge Bork constituted a “lynch mob.” This was a wholly unanticipated tirade against those activists who had merely expressed views on a subject in which all citizens had an interest.
V
It was necessary, as a black historian, to have a personal agenda, as well as one dealing with more general matters, that involved a type of activism. I discovered this in the spring of 1939 when I arrived in Raleigh, North Carolina, to do
research in the state archives, only to be informed by the director that in
planning the building the architects did not anticipate that any Afro-Americans would be doing research there. Perhaps it was the astonishment that the director, a Yale Ph.D. in history, saw in my face that prompted him to make a proposition. If I would wait a week he would make some arrangements. When I remained silent, registering a profound disbelief, he cut the time in half. I waited from Monday to Thursday, and upon my return to the archives I was escorted to a small room outfitted with a table and chair which was to be my private office for the next four years. (I hasten to explain that it did not take four years to complete my dissertation. I completed it the following year, but continued to do research there as long as I was teaching at St. Augustine’s College.) The director also presented me with keys to the manuscript collection to avoid requiring the white assistants to deliver manuscripts to me. That arrangement lasted only two weeks, when the white researchers, protesting discrimination, demanded keys to the manuscript collection for themselves. Rather than comply with their demands, the director relieved me of my keys and ordered the assistants to serve me. Nothing illustrated the vagaries of policies and practices of racial segregation better than libraries and archives. In Raleigh alone, there were three different policies: the state library had two tables in the stacks set aside for the
regular use of Negro readers; the state supreme court library had no segrega-
tion; while, as we have seen, the archives faced the matter as it arose. In Alabama and Tennessee, the state archives did not segregate readers, while Louisiana had a strict policy of excluding would-be Negro readers altogether. In the summer of 1945 I was permitted by the Louisiana director of archives to use the manuscript collection since the library was closed in observance of the victory of the United States over governmental tyranny and racial bigotry in Germany and Japan. As I have said elsewhere, pursuing Southern history was for me a strange career.
While World War II interrupted the careers of many young scholars, I experienced no such delay. At the same time, it raised in my mind the most profound questions about the sincerity of my country in fighting bigotry and
tyranny abroad. And the answers to my questions shook my faith in the integrity of our country and its leaders. Being loath to fight with guns and grenades, in any case, I sought opportunities to serve in places where my training and skills could be utilized. When the United States entered the war in 1941 I had already received my doctorate. Because I knew that several whites
who had not been able to obtain their advanced degrees had signed on as historians in the War Department, I made application there. I was literally rebuffed without the department giving me any serious consideration. In Raleigh, where I was living at the time, the Navy sent out a desperate appeal for men to do office work, and the successful ones would be given the rank of petty officer. When I answered the appeal, the recruiter told me that I had all of the qualifications except color. I concluded that there was no emergency and told the recruiter how I felt. When my draft board ordered me to go to its staff physician for a blood test, I was not permitted to enter his office and was told to wait on a bench in the hall. When I refused and insisted to the draft board clerk that I receive decent treatment, she in turn insisted that the doctor
see me forthwith, which he did. By this time, I concluded that the United States did not need me and did not deserve me. I spent the remainder of the war successfully outwitting my draft board, including taking a position at North Carolina College for Negroes, whose president was on the draft appeal
board. Each time I think of these incidents, even now I feel nothing but shame for my country, not merely for what it did to me, but for what it did to
the million black men and women who served in the armed forces under conditions of segregation and discrimination. One had always to be mindful, moreover, that being a black scholar did not
exempt one from the humiliations and indignities that a society with more than its share of bigots can heap upon a black person regardless of education
or even station in life. This became painfully clear when I went to Brooklyn College in 1956 as chairman of a department of fifty-two white historians. There was much fanfare accompanying my appointment, including a front-page story with picture in The New York Times. When I sought to purchase a
home, however, not one of the thirty-odd realtors offering homes in the vicinity of Brooklyn College would show their properties. Consequently, I had to seek showings by owners who themselves offered their homes for sale. I got a few showings including one that we very much liked, but I did not have sufficient funds to make the purchase. My insurance company had proudly advertised that it had $50 million to lend to its policyholders who aspired to home ownership. My broker told me that the company would not make a loan to me because the house I wanted was several blocks beyond where blacks should live. I cancelled my insurance and, with the help of my lawyer, who was white, tried to obtain a bank loan. I was turned down by every New York bank except the one in Brooklyn where my attorney’s father had connections. As we finally moved in after the hassles of more than a year, I estimated that I could have written a long article, perhaps even a small book,
in the time expended on the search for housing. The high cost of racial discrimination is not merely a claim of the so-called radical left. It is as real as
the rebuffs, the indignities, or the discriminations that many black people suffer.
VI
Many years ago, when I was a fledgling historian, I decided that one way to make certain that the learning process would continue was to write different kinds of history, even as one remained in the same field. It was my opinion that one should write a monograph, a general work, a biography, a period piece, and edit some primary source and some work or works, perhaps by other authors, to promote an understanding of the field. I made no systematic effort to touch all the bases, as it were, but with the recent publication of my biography of George Washington Williams, I believe that I have touched them all. More recently, I have started the process all over again by doing research for a monograph on runaway slaves. Another decision I made quite early was to explore new areas or fields, whenever possible, in order to maintain a lively, fresh approach to the teaching and writing of history. That is how I happened to get into Afro-American history, in which I never had a formal course, and which attracted a growing
number of students of my generation and many more in later generations. It is remarkable how moving or even drifting into a field can affect one's entire life. More recently, I have become interested in women's history, and during the past winter I prepared and delivered three lectures under the general title of "Women, Blacks, and Equality, 1820-1988." I need not dwell on the fact that for me it was a very significant learning experience. Nor should it be necessary for me to assure you that despite the fact that I have learned much, I do not
seek immortality by writing landmark essays and books in the field of women’s history.
I have learned much from my colleagues both at home and abroad. The historical associations and other learned societies have instructed me at great length at their annual meetings, and five of them have given me an opportunity to teach and to lead by electing me as their president. Their journals have provided me with the most recent findings of scholars and they have graciously published some pieces of my own. Very early I learned that scholarship knows no national boundaries, and I have sought the friendship and collaboration of historians and scholars in many parts of the world. From the time that I taught at the Salzburg Seminar in American Studies in 1951, I have been a student and an advocate of the view that the exchange of ideas is more healthy and constructive than the exchange of bullets. This was especially true during my tenure on the Fulbright Board, as a member for seven years and as the chairman for three years. In such experiences one learns much about the common ground that the peoples of the world share. When we also learn that this country and the western world have no monopoly of goodness and truth
or of skills and scholarship, we begin to appreciate the ingredients that are indispensable to making a better world. In a life of learning that is, perhaps, the greatest lesson of all.
1989
JUDITH N. SHKLAR
John Cowles Professor of Government
Harvard University
I am a bookworm. Since the age of eleven I have read and read, and enjoyed almost every moment of it. Yet I was very slow to learn how to read at all, and I hated school, avoiding it as long and as often as I could, without being an actual dropout. It was certainly not in the various schools that I attended so unwillingly that I learned to read or to write. In fact, my exasperated parents
had to hire a tutor to get me started. Nor were my first encounters with literature always happy, though they certainly made a deep impression upon me. The first book I ever read through by myself was a German translation of David Copperfield. I read it over and over again and I still love it. The second book was a children's novel about two boys in the Thirty Years War, which led me to look it up in a wonderful illustrated world history in many volumes in my parents' library. I was hooked for life on fiction and history. It was not, however, all pleasure. One day I picked up the first volume of Shakespeare in the Schlegel-Tieck translation. The first play was Titus Andronicus, and I read it all. To this day I can still feel the fear and horror it inspired. I was so afraid
and confused that I could not even bring myself to tell anyone what was bothering me. Finally I managed to spill it out to my oldest sister. As soon as I told her I, of course, felt infinitely relieved, especially as she assured me that these things did not really happen. The trouble was that both she and I knew that far worse was going on all around us. By 1939 I already understood that books, even scary ones, would be my best refuge from a world that was far
more terrible than anything they might reveal. And that is how I became a bookworm. It was also the end of my childhood. Biography, novels, and plays are the delight of young readers, and they
certainly were mine. But I also very early on began to read about current events and political history. The reason for this precocious taste was obvious enough, just as there was nothing random about my later professional interests. Politics completely dominated our lives. My parents had had a hard time getting out of Russia, where the First World War had stranded them, after the Revolution, but they did manage to return home, to Riga, which was now a Latvian city. At first they prospered, but soon it too became a very hostile place. We were essentially German Jews, which meant that almost everyone
around us wanted us to be somewhere else at best, or to kill us at worst. My parents were well-educated, well-to-do, and liberal people, and in a wholly unobtrusive way they were completely unconventional. They had an absolute confidence in the moral and intellectual abilities of their children and treated us accordingly, which made the extreme contrast between a family with high personal standards and an utterly depraved external world inescapable. And this induced a certain wariness, if not outright cynicism, in all of us.
My father had wanted to leave Europe for many years, but we had many family ties binding us to Riga, and my mother, who was a pediatrician, ran a slum clinic which she could not easily abandon. In the event, just before the Russians arrived, my uncle put us on a plane to Sweden, where we remained for too long, until well after the German invasion of Norway. By then there was only one route out of Europe, the Trans-Siberian railroad, which slowly
took us to Japan. It was not an easy trip, but miraculously we escaped. In Japan we were able to buy, in effect, a visa to Canada, which had, as is now common knowledge, a less than generous immigration policy. Not long be-
fore Pearl Harbor we took a boat to Seattle where we were locked up for several surrealistic weeks in a detention jail for legal Oriental immigrants. If I were asked what effect all these adventures had on my character, I would say that they left me with an abiding taste for black humor. When my father was at last able to settle his financial affairs, we finally went
to Montreal. It was not a city one could easily like. It was politically held together by an equilibrium of ethnic and religious resentments and distrust. And in retrospect, it is not surprising that this political edifice eventually collapsed with extraordinary speed. The girls school that I attended there for some three years was dreadful. In all that time I was taught as much Latin as one can pick up in less than a term at college. I also learned some geometry, and one English teacher taught us how to compose précis, which is a very useful skill. The rest of the teachers just stood in front of us and read the textbook out loud. What I really learned was the meaning of boredom, and I learned that so well that I have never been bored since then. I report without
comment that this was thought to be an excellent school. I dare say that there were better ones around, but I remain unconvinced by those who respond
with vast nostalgia to the manifest inadequacies of high school education today. I do not look back fondly to my college days at McGill University either. That may have something to do with the then prevailing entrance rules: 750 points for Jews and 600 for everyone else. Nor was it an intellectually exciting
institution, but at least when I arrived there, just before my seventeenth birthday, I was lucky to be in the same class as many ex-servicemen, whose presence made for an unusually mature and serious student body. And compared to school it was heaven. Moreover, it all worked out surprisingly well
for me. I met my future husband and was married at the end of my junior year, by far the smartest thing I ever did. And I found my vocation. Originally I had planned to major in a mixture of philosophy and economics, the rigor of which attracted me instantly. But when I was required to take a course in money and banking it became absolutely obvious to me that I was not going to be a professional economist. Philosophy was, moreover,
mainly taught by a dim gentleman who took to it because he had lost his religious faith. I have known many confused people since I encountered this
poor man, but nobody quite as utterly unfit to teach Plato or Descartes. Fortunately for me, I was also obliged to take a course in the history of political theory taught by an American, Frederick Watkins. After two weeks of listening to this truly gifted teacher I knew what I wanted to do for the rest of my life. If there was any way of making sense of my experiences and that of my particular world, this was it.
Watkins was a remarkable man, as the many students whom he was to teach at Yale can testify. He was an exceptionally versatile and cultivated man
and a more than talented teacher. He not only made the history of ideas fascinating in his lectures, but he also somehow conveyed the sense that nothing could be more important. I also found him very reassuring. For in many ways, direct and indirect, he let me know that the things I had been brought up to care for—classical music, pictures, literature—were indeed worthwhile, and not my personal eccentricities. His example, more than anything overtly said, gave me a great deal of self-confidence, and I would have remembered him gratefully, even if he had not encouraged me to go on
to graduate school, to apply to Harvard, and then to continue to take a friendly interest in my education and career. It is a great stroke of luck to discover one’s calling in one’s late teens, and not everyone has the good fortune to meet the right teacher at the right time in her life, but I did, and I
have continued to be thankful for the education that he offered me so many years ago. From the day that I arrived at Harvard I loved the place, and I still do. By
that I do not mean that it was perfect. Far from it. In fact, I think it is a far better university now than it was when I got there. But whatever its flaws, I found the education there I had always longed for. The government department was then as now very eclectic, which suited me well, and I learned a lot of political science, mostly from the junior faculty. My mentor was a famous academic figure, Carl Joachim Friedrich. And he taught me how to behave, how to be professional, how to give and prepare lectures, how to deal with colleagues and how to act in public, as well as a general idea of what I would
have to know. And though he was not given to praise, he did not seem to doubt that I would manage to get ahead somehow. In fact I can recall only one nice comment he ever made to me. After my final thesis exam he said, "Well, this isn't the usual thesis, but then I did not expect it to be." Eventually I realized that he hoped that I would be his successor, as I indeed did, after many ups and downs. In retrospect it seems to me that the best thing he did for me was to let me go my own way as a student and then as a young teacher. Like many ambitious young people, I was inordinately concerned about what other people thought of me, but having seen a good many graduate students since then, I realize that I was relatively self-assured, and I have Carl Friedrich
to thank for it. There are always many very bright graduate students at Harvard and I really liked many of my contemporaries there, several of whom have remained
my close friends. Seminars were lively and there was a fair amount of good talk over coffee. There were also some brilliant lecturers, whom I found it thrilling to hear. And most of all I loved and still love Widener Library. In many respects the Harvard that I entered in 1951 was a far less open scholarly society than it now is. The effects of McCarthyism were less crude and immediate than subtle and latent. The general red-bashing was, of course, a colossal waste of energy and time, but I cannot say that it deeply affected day-to-day life at the university. What it did was to enhance a whole range of
attitudes that were there all along. Young scholars boasted of not being intellectuals. Among many, no conversation was tolerated except sports and snobbish gossip. A kind of unappetizing dirty socks and locker room humor and false and ostentatious masculinity were vaunted. With it came an odd gentility: no one used four letter words and being appropriately dressed, in an inconspicuous Oxford gray Brooks Brothers suit, was supremely important.
More damaging was that so many people who should have known better
scorned the poor, the bookish, the unconventional, the brainy, the people who did not resemble the crass and outlandish model of a real American upper-crust he-man whom they had conjured up in their imagination. For any woman of any degree of refinement or intellectuality, this was unappealing company. To this affected boorishness was added a slavish admiration for the least intelligent, but good-looking, rich, and well-connected undergraduates. The culture was in many respects one of protected juvenile delinquency. Harvard undergraduates were easily forgiven the misery they inflicted on the rest of Cambridge. High jinks included breaking street lights and derailing trolley cars. Conspicuous drunkenness on the street was normal on weekends. One of the nastiest riots I ever saw, long before the radical sit-ins, was an under-
graduate rampage set off by the decision to have English rather than Latin diplomas. Several tutors were physically assaulted and injured. All this was seen as high spirits, and secretly admired. Nor were these private school products particularly well-prepared. Few could put a grammatical English sentence together, and if they knew a foreign language, they hid it well. The real ideal of many teachers at Harvard in the 1950s was the gentleman C-er. He would, we were told, govern us and feed us, and we ought to cherish
him, rather than the studious youth who would never amount to anything socially significant. There was, of course, a great deal of self-hatred in all this, which I was far too immature to understand at the time. For these demands for overt conformity were quite repressive. Harvard in the 1950s was full of people who were ashamed of their parents’ social standing, as well as of their
own condition. The place had too many closet Jews and closet gays and provincials who were obsessed with their inferiority to the “real thing,” which
was some mythical Harvard aristocracy, invented to no good purpose whatever. What was so appalling was that all of this was so unnecessary, so out of keeping with America’s public philosophy. It was also a bizarre refusal to think through the real meaning of the Second World War. In some ways I found Harvard conversations unreal. I knew what had happened in Europe between 1940 and 1945, and I assumed that most people at Harvard also were aware of the physical, political, and moral calamity that had occurred, but it was never to be discussed. Any American could have known all there was to know about the war years in Europe by then. Everything had been reported in The New York Times and in newsreels, but if these matters came up in class, it was only as part of the study of totalitarianism, and
then it was pretty sanitized and integrated into the Cold War context. It was very isolating and had a lot to do with my later writings. Yet in an intellec-
tually subdued way there was a shift in the local consciousness. A look at the famous "Redbook," which was the plan for the general education program at
Harvard, is very revealing. Its authors were determined to immunize the young against fascism and its temptations so that “it” would never happen again. There was to be a reinforcement of The Western Tradition, and it was to be presented in such a way as to show up fascism as an aberration, never to be repeated. I would guess that in the pre-war Depression years some of the young men who devised this pedagogic ideology may have been tempted by attitudes that eventually coalesced into fascism, and now recoiled at what they knew it had wrought. They wanted a different past, a “good” West, a “real”
West, not the actual one that had marched into the First World War and onward. They wanted a past fit for a better denouement. I found most of this unconvincing. Harvard in the 1950s was in appearance in a conservative moment, but it was, in fact, steadily changing, becoming perceptibly more liberal and interesting. The 1960s as a period and a phenomenon did nothing, however, to
hasten this progress, quite on the contrary. I do not remember the 1960s kindly. What went on was brutish and silly and the spectacle of middle-aged men simpering about how much they learned from the young, and flattering the most uncouth of their students as models of intellectual and moral purity,
would have been revolting had it not been so ridiculous. The only lasting legacy of that time is a general flight from the classroom. Many teachers simply quit and withdrew to their studies when confronted with all that abuse. Moreover, a whole new generation has grown up unprepared and unwilling to teach. If you do not trust anyone over thirty in your teens, you will not like young people once you reach forty. Instead we now have a constant round of conferences and institutes which do not inspire scholarly work good enough to justify the time and effort spent on them. Still, all in all, I don't lament. As I look at my younger colleagues, I am heartened by their intelligence, competence, openness, and lack of false prejudices. And Har-
vard's student body is certainly more alert, versatile, self-disciplined, and above all, more diverse and fun to teach, than it ever was before. What was it like to be a woman at Harvard at the time I came there? It would be naive of me to pretend that I was not asked to give this lecture because I am a woman. There is a considerable interest just now in the careers of women such as I, and it would be almost a breach of contract for me to say nothing about the subject. But before I begin that part of my story, I must say
that at the time when I began my professional life, I did not think of my prospects or my circumstances primarily in terms of gender. There were many
other things about me that seemed to me far more significant, and being a woman simply did not cause me much academic grief. From the first there were teachers and later publishers who went out of their way to help me, not condescendingly, but as a matter of fairness. These were often the sons of the
old suffragettes and the remnants of the Progressive Era. I liked them and admired them, though they were a pretty battered and beaten lot, on the whole, by then. Still they gave me a glimpse of American liberalism at its best.
Moreover, I was not all alone. There were a few other young women in my classes, and those who persevered have all had remarkable careers. Nevertheless, all was not well. I had hardly arrived when the wife of one of my teachers asked me bluntly why I wanted to go to graduate school when I should be promoting my husband’s career and having babies. And with one or two exceptions, that was the line most of the departmental wives followed. They took the view that I should attend their sewing circle, itself a ghastly scene in which the wives of the tenured bullied the younger women, who trembled lest they jeopardize their husbands’ futures. I disliked these women,
all of them, and simply ignored them. In retrospect I am horrified at my inability to understand their real situation. I saw only their hostility, not their self-sacrifices.
The culture created by these dependent women has largely disappeared, but some of its less agreeable habits still survive. Any hierarchical and competitive society such as Harvard is likely to generate a lot of gossip about who is up and who is down. It puts the lower layers in touch with those above them, and it is an avenue for malice and envy to travel up and down the scale. When I became sufficiently successful to be noticed, I inevitably became the subject of gossip, and oddly I find it extremely objectionable. I detest being verbally served for dinner by academic hostesses, so to speak, and I particularly resent it when my husband and children are made into objects of invasive curiosity
and entertainment by them.
These nuisances are surely trivial, and I mention them in order not to sound too loyal to Harvard. Though perhaps I am, because my experiences have not made me very critical. Certainly in class and in examinations I was not treated differently than my male fellow-students. When it came to teach-
ing Harvard undergraduates in sections, there was a minor crisis. It was thought wonderful to have me do Radcliffe sections, but men! It had never been done! I said nothing, being far too proud to complain. After a year of dithering, my elders decided that this was absurd and I began to teach at Harvard without anyone noticing it at all. When I graduated I was, much to my own surprise, offered an instructor-
ship in the government department. When I asked, how come? I was told that I deserved it and that was that. I did not, however, know whether I wanted it. I had just had our first child and I wanted to stay with him for his first year. That proved acceptable. I rocked the cradle and wrote my first book. To the extent that I had made any plans for my professional future at all, I
saw it in high class literary journalism. I would have liked to be a literary editor of the Atlantic or some such publication. This was a perfectly realistic ambition and had obvious attractions for a young woman who wanted to raise a family. I was, moreover, sure that I would go on studying and writing about political theory, which was my real calling. My husband, however, thought that I ought to give the Harvard job a chance. I could quit if I didn’t like it, and I might regret not trying it out at all. So I more or less drifted into a university career, and as I went along there were always male friends telling
me what to do and promoting my interests. I did not mind then and I wouldn’t mind now, especially as thinking ahead is not something I do well or often.
For a number of years everything went smoothly enough. I was almost always exhausted, but like both my parents I have a lot of energy. The crunch came predictably when the matter of tenure finally came up. My department could not bring itself to say either yes or no. It had done this to several male
aspirants, who hung around for years while this cat-and-mouse game was being played. That was more humiliation than I could bear, so I went to the dean and asked him if I could have a half-time appointment with effective tenure and lecturer's title. It was not exactly what I wanted, but it was what I decided to arrange for myself, rather than wait for others to tell me what I was worth. My colleagues accepted this deal with utter relief, and it certainly made life a lot easier for them, as well as for me. I had three children by then and a
lot of writing to do. So it was by no means a disaster and it saved my self-respect, no doubt a matter of excessive importance to me. It also relieved me for years from a lot of committee and other nuisance work, though half-time never turned out to be exactly that. Do I think my colleagues behaved well? It is, of course, unreasonable to be a judge in one's own case. So I will answer the question indirectly. There are very many scholars whom I regard as my superiors in every way and whom I admire without reserve, but I have never thought of myself, then or now, as less competent than the other members of my department.
What did this experience do for or to me? Not much. In time things straightened themselves out. Do I think that matters have improved since then? In some ways I am sure that they have. We treat our junior colleagues
with far more respect and fairness now. They have more responsibility and also a more dignified and independent position. Their anxiety about tenure remains, of course, but at least we do not go out of our way to demean them any longer. The atmosphere for women is, however, far from ideal. There is certainly far less open discrimination in admissions, hiring, and promotions, and that is a very genuine improvement. However, there is a lot of cynical feminism about that is very damaging, especially to young women scholars. The chairman who calls for hiring more women, any women, for, after all, any skirt will do to make his numbers look good, and to reinforce his own liberal credentials. The self-styled male feminist who wildly over-praises every newly appointed young woman as "just brilliant and superb," when she is in fact no
better or worse than her male contemporaries, is not doing her a favor, just
expressing his own inability to accept the fact that a reasonably capable woman is not a miracle. The male colleague who cannot argue with a female colleague without losing his temper like an adolescent boy screaming at his mother, and the many men who cannot really carry on a serious professional conversation with a woman, are just as tiresome as those who bad-mouth us overtly. And they are more likely to be around for a long, long time proclaiming their good intentions without changing what really has to change most of all: they themselves.
For me, personally, the new era for women has not been an unmixed blessing. It is not particularly flattering to be constantly exhibited as the “first”
woman to have done this or that, just like a prize pig at a country fair. The pressure, which is inevitably internalized, to do better than anyone else becomes debilitating and it erodes any self-confidence one might have built up with the years. Nothing now ever seems good enough, however hard I try. Nevertheless, in spite of these side-effects, I have much to be pleased about. Harvard has become a much less mean-spirited place than it used to be. In any
case, the idea of making an ideological issue of my own career difficulties
never occurred to me at all, which is one of the reasons I am not a real feminist. But it is not the only one. The idea of joining a movement and submitting to a collective belief system strikes me as a betrayal of intellectual values. And this conviction is an integral part of what I have tried to do as a
political theorist, which is to disentangle philosophy from ideology. I am obliged to acknowledge that this is a characteristically liberal enterprise, which is a paradox, but classical liberalism can at least claim that it has tried to
rise above its partisan roots, rather than to rationalize or conceal them. As I said at the outset, I took up political theory as a way of making sense of the experiences of the twentieth century. What had brought us to such a pass?
In one way or another that question has lurked behind everything I have written, especially my first book, After Utopia, which I began when I was twenty-two. At the time the very idea of such an undertaking was dubious. There was some doubt whether political theory itself could or should survive at all. For over 150 years political thinking had been dominated by those great "isms," and the outcome was plain to see. No one wanted to relive the 1930s. We had suffered enough intellectual disgrace. Ideologies were the engines of fanaticism and delusion, and we should never talk like that again. Instead we should limit ourselves to clarifying the meaning of political language, sort out
intellectual muddles, and analyze the dominant concepts. In this way we could help political planners to recognize the alternatives available to them and to make reasonable choices. We would clean up the ideological mess and
acquire an austere and rational style of exposition. It was not an ignoble intellectual ideal. Indeed, that passionate effort to free ourselves of affect can
be recognized not only in philosophy, but in the aesthetics of that time as well. I was deeply under the spell of these intellectual aspirations, which were
so obviously tied to hopes for a humane and efficient welfare state. The trouble with this way of thinking was that it did not help me much with the questions that I wanted to answer. So I turned to history. What puzzled me when I wrote After Utopia was that none of the explanations for Europe's recent history made sense. And as I investigated them, it seemed clear to me that most were really updated versions of nineteenth-century ideologies, whether romantic, religious, or conservative-liberal, and not one of them was adequate to cope with the realities for which they tried to account. Unhappily, I was so absorbed in the history of these ideas that I never quite got to my main topic, but I did get at least one point across: that the grand ideologically based political theories were dead and that political thinking might not recover from its obvious decadence. In this I was wrong, as were a great many other people. When Leo Strauss in a celebrated essay wrote that political theory was "a pitiable rump," left over by the specialized social sciences, he was being comparatively optimistic—at least he thought something remained. What was gone was the "great tradition," that had begun with Plato and expired with Marx, Mill, or possibly Nietzsche, a canon of commanding quality, encompassing scope, and philosophical rigor. No one was writing anything comparable to Leviathan, and no one ever would. Only Isaiah Berlin, ever hopeful, claimed that as long as people would argue about fundamental political values political theory was alive and well. Nonsense, I said, only political chatter and the vestiges of ideology were around. No Social
Contract, no Rousseau, no political theory! Most of us believed that in the age of the two world wars both the utopian and the social-theoretical imagination had dried up in disenchantment and confusion. Only criticism remained as a
vapid gesture of no substance, and as testimony to a general inability to understand the disasters that had overcome us, or to rise above them. What I thought was needed was a realistic adaptation to an intellectually pluralistic and skeptical eclecticism, but that could hardly get the old juices flowing. There were other explanations for the apparent paralysis, of course. It was suggested that theory was stifled in a bureaucratic political order, where only functional thinking was encouraged, as in Byzantium, for example, where there also was no speculative thought, but just guarded little bailiwicks of ideas appropriated by an unoriginal master and his troop. I was not persuaded by this line of thought, because I knew some Byzantine history, and could see no resemblance to us at all. More persuasive was the medieval analogy. There had been plenty of philosophical talent and imagination, but it had all been concentrated on theology and not politics. With us it was the natural sciences. A rather different claim was that it was just as well that speculative theorizing had stopped. To be sure, there once had been a wonderfully rich and diverse variety of ideas and forms of public argument, but it was no longer possible to go on in that manner. We could and should work at improving the quality of intellectual history. That appealed to both certain democratic as well as aristocratic impulses, rather hard to recapture now. For the aristocratic the great canon was a cultural treasure to be preserved by and for the very few who could appreciate it. But for others, myself among them, it was the hope that by making these ideas and texts accessible to as many people as possible there
would be a general deepening of the self-understanding that comes from confronting the remote and alien. The idea was to make the past relevant to all now.
What is now called “the linguistic turn” had very similar aspirations. Its hope was to be of use to all citizens by clarifying the entire vocabulary of politics and also to illuminate the alternatives available to those who had to make political choices. In addition, it might also serve the social sciences by giving them a stable, unemotive, and reliable language. I was certainly inclined to believe that the prospects of the social sciences as predictive and practical knowledge were good, and that theory could do much to sustain them. Theorists would analyze the prevailing terms of political discourse and see how it functioned in different contexts. This would help the public to free itself from ideological distortions and inconsistent impulses, and would provide the social sciences with an aseptic vocabulary. I think it fair to say that I
was not atypical in caring more about being honest than about finding the truth, which only agitated traditional and radical critics at the margins of the intellectual map.
Those of you who have grown up in the midst of the vigorous debates around John Rawls's A Theory of Justice and the literature that it has inspired
can no longer even imagine this state of mind. In retrospect, it seems to me that there were stirrings of creativity under the surface all along, and that the inhibitions and hesitations of the post-ideological age were neither futile nor mindless. They were a pause, and not a worthless one either. It got us over the disgrace of the immediate past. To return to my younger self: The attention that After Utopia received had one funny result. My editors rather than I had hit upon the title, and many people thought that I had written a book about utopias. It was a fashionable topic, and I was soon asked to participate in scholarly conferences. I was in no
position to refuse at that stage of my young career, and so I boned up on utopias. No subject could have been less suited to my temperament or interests, but I plowed on and even got to be quite fond of the utopian literature and eventually became a minor expert. Utopian fantasies did not, however, liberate me from history or its burdens. I found, in spite of my dispiriting view of the discipline in general, that historical interpretation was not yet out of style, nor as irrelevant as I had originally feared. One could do more with it than just discuss who said what when. And so I soon returned to the events of the Second World War. I had been teaching a course on the history of modern legal theory for a number of years and had been reading up on the subject. Although it had nothing to do with the course itself, I thought that it might be interesting to take a good look at political trials generally and at the International Tribunals at Nuremberg and Tokyo specifically. In order to do that systematically I realized that I would have to think through for myself the very traditional problem of the relations between law, politics, and morals. As I did so, I was struck very forcefully by the difference between legal and political thinking and by the professional constrictions of jurisprudential thought, especially when it was extended beyond the limits of normal court business. Nothing could have been more remote from my mind, however, than to attack legal scholarship, lawyers, or the integrity of our legal system, but the majority of law journals
were really upset at the very notion that politics structured the law very significantly. Nor were they exactly thrilled to read that one could justify the Nuremberg trials only on political grounds and the Tokyo ones not at all. I was told in no uncertain terms that only lawyers could really understand the
perfection of legal reasoning. I look back with some amusement at this episode, because my skeptical inquiry into the traditional orthodoxies of legal thought was so mild and so qualified, compared to the assaults that critical legal studies have mounted against the basic assumptions of the legal establishment since then. And it is with some dismay that I now find myself treated as the purveyor of standard ideas, known to and accepted by all, even by the most conservative academic lawyers. To recognize that professions have their self-sustaining ideologies is hardly news today, but it was in 1964. And so Legalism, which is my favorite of the books that I have written, went quickly from being a radical outrage to being a conventional commonplace. Going through all the published and unpublished documents relating to the War Trials in the Treasure Room of the Harvard Law Library had a very liberating effect upon me. It was as if I had done all I could do to answer the question, "How are we to think about the Nazi era?" I knew that there was much that I would never understand, but perhaps I knew enough about the essentials. At any rate, I was ready to do other things. Since my undergraduate days I had been absolutely mesmerized by Rousseau. Watkins had given some absolutely first-rate lectures and had urged me to write short and long papers about him. I was not the first reader to discover that Rousseau was addictive. It is not just that debates about him always seemed to touch upon the most vital and enduring questions of politics, but
that when I read him, I knew that I was in the presence of an unequaled intelligence, so penetrating that nothing seemed to escape it. To read Rousseau is to acquire a political imagination and a second education. For someone as naturally and painlessly skeptical as I have always been, it is, moreover,
a continuing revelation to follow the struggles of a mind that found skepticism both inevitable and unbearable. Above all, Rousseau has fascinated me because his writings are so perfect and lucid, and yet so totally alien to a liberal mentality. He is the complete and inevitable “other,” and yet entirely integral
to the modern world that he excoriated, more so than those who have accepted it on its own terms. It is difficult to like the author of the Confessions, but it is a riveting work, and even if one disagrees with the Social Contract, who can deny the brilliance of its arguments, or not be compelled to rethink political consent? I read Rousseau as a psychologist—as he said of himself, he
was "the historian of the human heart"—and a rather pessimistic thinker, which makes him unique among the defenders of democracy and equality. It is, I believe, his greatest strength. As a critical thinker he has no rival, apart from Plato. I am not, however, so besotted with Rousseau that I do not admire the
great writers of the Enlightenment upon whom he cast his scorn. Quite on the contrary, in reaction to him, I was especially drawn to them, and am convinced that just those intellectual bonds that identify that diverse group—skepticism, autonomy and legal security for the individual, freedom and the discipline of scientific inquiry—are our best hope for a less brutal and irra-
tional world. My favorite is Montesquieu, the most authentic voice of the French Enlightenment, its bridge to America, and an acute political scientist. Anyone who does intellectual history recognizes more or less clearly that she owes a debt to Hegel, who laid down its philosophical principles: that history, endured as the conflict between incomplete epistemologies, is resolved when we recognize this process as the totality of our collective spiritual
development. The study of that experience becomes the master science. No more powerful defense of the enterprise can be imagined, and in some more modest version, intellectual historians cling to it. The grounds of Hegel's argument were to be found in his Phenomenology. And so I spent some five years unravelling its endless allusions and tying its political theory together. It was not altogether successful, but I would still defend my reading of Hegel as the last of the great Enlightenment thinkers. I should also, for the sake of honesty, confess that I do not understand Hegel's Logic and that the commentaries that I have read have not helped me. And while I am at it, I must also admit that there are a vast number of paragraphs by Heidegger that mean absolutely nothing to me. I quite simply do not understand what he is saying. I am not proud of these lapses, and I have no one to blame but myself, but it is
better to own up than to hide them, especially from one's students. Although I sometimes have students in mind when I write, I tend to keep writing and teaching apart. I have many friends who write their books as they lecture, but I somehow cannot do that, though I wish that I could. I think of the two as complementary, but different. In class I have to think of what the students must be taught; when I write I have only myself to please. I do not even find that the two compete for my time; rather, mysteriously and semi-consciously, they interact. I have had the good luck to have taught some absolutely wonderful young people. Some of the Harvard seniors whose undergraduate theses I have directed are the most intelligent, stimulating, and delightful people I have known, and preparing for their tutorials has certainly done a lot for my own education as well. Graduate students are not as easy to get on with at first, because they are in such a difficult position, having just fallen from the top of their undergraduate class to the very bottom of a very greasy pole. I certainly prefer frank and independent students to ingratiating and flattering ones, and trust those who
take charge of their own education most of all. Ultimately they can be the most gratifying people for a teacher. The graduate students who become professional quickly and develop a real passion for their studies may soon be one's friends, their success is in some way one's own, and they are often the best partners for discussion, whether we agree or not. The reason I teach political theory is not that I just like the company of young people, but that I love the subject unconditionally and am wholly convinced of its importance and want others to recognize it as such. It has therefore been quite easy for me to avoid becoming a guru or substitute parent. I really only want to be a mother to our three children, and do not like disciples. And I fear that the students who so readily attach themselves to idols lose their education along with their independence. Much as I have enjoyed teaching, I am inclined to think that I would have
written more or less the same kinds of books if I had not accepted that unexpected Harvard job. The one subject that I might not have taken up is American political theory. I originally started reading American intellectual history entirely in order to prepare an undergraduate lecture course, but it soon became an avocation and I have thought and written about it with much pleasure and interest. I do not treat it as a peculiarly local phenomenon, "a poor thing but our own," but as intrinsically significant. Apart from the early establishment of representative democracy and the persistence of slavery, which do give it a special character, American political thought is just an integral part of modern history as a whole. The study of American history has certainly done nothing to lessen my awareness of the oppression and violence that have marked all our past and present. And it also has sharpened my skepticism as I consider the illusions, myths, and ideologies that are generated to hide and justify them. With these thoughts in mind I quite naturally turned to Montaigne's Essays. He increasingly has become my model as the true essayist, the master of the experimental style that weaves in and out of the subject, rather than hitting the reader over the head. As I read Montaigne I came to see that he did not preach the virtues, but reflected on our vices, mostly cruelty and betrayal. What, I asked myself, would a carefully thought through political theory that "puts cruelty first" be
like? I took it as my starting point that the willful infliction of pain is an unconditional evil and tried to develop a liberal theory of politics from that ground up. That exploration led me to consider a number of other vices, especially betrayal, in their tendency to enhance cruelty. The book I built around these notions, Ordinary Vices, is very tentative, an exploration rather
than a statement, an effort to worry rather than to soothe.
From betrayal to injustice is a short step. I am now revising a short book on injustice, and I mean to be unsettling. I want to examine the subjective claims of the aggrieved and I try to look at injustice from the vantage point of those who have experienced it, not on the model of a court of law, but in a far less
rule-bound way. It is a perspective that does not make it any easier to tell misfortune from injustice, and it decidedly is not the way those who govern tend to draw the line between the two. I hope to shift the accepted paradigms a bit.
What makes a scholar choose the subjects of inquiry, and change her interests over time? Because I am too busy to be very self-reflective, I find that question hard to answer, and perhaps I had better begin by looking at others who are like me. My guess is that there is a mixture of external and internal
pressures that direct scholars working within a discipline such as political theory. I think that the years of post-war passivity did not exhaust the possibilities of textual commentary, though the methods of interpretation are now all up for intense rethinking, in response to too many repetitive readings. The practical limits of “the linguistic turn” duly emerged as well, and though we will certainly have to continue to refine and clarify our terms of discourse, few
if any of us still believe that this will improve the world or even the social sciences. To be sure, muddled, emotive, and intuitive thinking would only make matters worse. So the two main post-war endeavors did not lead us to a dead end, after all. In fact they opened the door to new prospects. Practical
ethics is now deeply engaged with the political choices imposed by new technologies and administrative institutions. Analytical thought, originally so finely honed for its own sake, now has a new function here. These theoretical ventures are, I think, inspired both by events in the social world and by the fatigue induced by the remoteness of pure analysis. The stimulus of political radicalism has, in contrast, been brief and less distinguished, leaving behind it
a desiccated and abstract Marxism. The career of social criticism has also floundered. As its rituals lost their charm, hermeneutics replaced prophecy, and a return to the cave in order to interpret rather than to judge politics suggested itself. Scholars now try to read their cultures as once they read their texts. I do not find this research particularly impressive, and often it amounts to little but an unspoken conservatism. What are the search for “shared meanings” and the articulation of deep intimations but celebrations of tradition? I
much prefer an open and direct defense of the habitual and customary. Far more exciting, to my mind, is the enlarged scope of political theory today, as literature and the fine arts are integrated into reflections about the nature of government and its ends. It preserves the canon by expanding it.
Evidently I have some notion of how scholarly changes occur in general, but each one of us is, of course, different and has personal motives for making specific intellectual decisions. As I look at myself, I see that I have often been
moved to oppose theories that did not only seem wrong to me, but also excessively fashionable. I do not simply reject, out of hand, the prevailing notions and doctrines, but complacency, metaphysical comforts, and the protection of either sheltered despair or of cozy optimism drive me into intellectual action. I do not want to settle down into one of the available conventions. Perhaps this reflects the peculiarity of the kind of refugee I was. We had never
known poverty or ignorance. My sister and I both spoke elegant English when we arrived. It made it very easy for us to adapt quickly, but we did not have to alter fundamentally to do so. And I have participated happily enough in what goes on around me, but with no wish to be deeply involved. It is a very satisfactory situation for a scholar and a bookworm.
1990
PAUL OSKAR KRISTELLER
Frederick J. E. Woodbridge Professor Emeritus of Philosophy
Columbia University
I welcome this opportunity to pay my personal tribute to Charles Homer Haskins, a great American scholar and historian whose work I have always admired, whose contributions are still valid and worth reading many decades after his death, and whom I consider a worthy guide and model for all present and future practitioners of historical scholarship. This honor comes to me quite unexpected, and even as a surprise, for my background is foreign and untypical (though I have lived in this country for over half a century); the subject that interests me most, the history of Western
philosophy and thought, has become increasingly unfashionable, and has been called irrelevant and useless, elitist and Eurocentric, and even undemo-
cratic (perhaps I could make it more palatable by calling it the study of interconceptual space); and my method, which tries to combine the philosophical interpretation of original texts with the pertinent skills of history and philology, and of their auxiliary disciplines such as diplomatics and chronology, palaeography and bibliography, is now considered hopelessly traditional
and even antiquated, as I have been told more than once by foundation officials, administrators and colleagues, reviewers and critics. This occasion prompts me to say a few words that reflect my experience and my opinions,
and I apologize in case they displease some of my listeners. I have been exposed over the years to many words and thoughts that displease me, and I might claim for once my right to the freedom of speech. I do not think that I am unwilling to stand corrected when some of my statements are refuted by solid facts or arguments, and I am quite ready to admit that there are many areas and problems that lie outside my special field of interest and that deserve
to be explored by other scholars. I also admit that my advanced age may make me obtuse and unresponsive to certain new subjects or methods that may turn out to be perfectly legitimate. Yet I cannot bring myself to accept or condone
certain views that are in agreement with some current political or other trends, but that are flatly contradicted by firmly established facts or arguments which their proponents, knowingly or unknowingly, choose to ignore, using what I like to call the argumentum ex ignorantia, a powerful argument when
the readers and listeners are as ignorant of the contrary evidence as are the speakers and writers. The life of learning, as the title of our series has it, or the life devoted to learning, is a somewhat ambiguous term, if I may be allowed to borrow the method of my analytical colleagues in philosophy. It means, of course, that as scholars we always continue to learn (and unfortunately also to forget), from
the beginning to the end of our life. Yet the phrase also means that we are dedicated to learning or to scholarship, words that in modern English have been used to designate all knowledge outside the natural and social sciences. All other Western languages known to me speak of the philosophical, historical, and philological disciplines as sciences, recognizing that they contain and accumulate valid knowledge based on rigorous methods (they are even predictive when they discover new texts or documents that confirm a previously
proposed opinion). In English, the terms learning and scholarship do not indicate, as they should, that we deal with knowledge that is as valid and as methodical as that of the sciences, though it deals with different subjects and uses different methods. The more recent term humanities has the additional disadvantage that it indicates a kind of knowledge that is at best useless and dispensable, and at worst provides some kind of genteel or snobbish entertainment. Moreover, the term humanities, and even more the term humanism, invites a confusion with philosophical or secular humanism and with humanitarianism, thus involving humanistic scholarship quite needlessly in philosophical and religious controversies, and confusing it with social and political ideals that are valid and desirable but completely irrelevant to our cause. All
this confusion serves as a temptation or excuse for diverting our meager resources towards other efforts and activities that may have great merits, even greater than we can claim for ourselves, but that are completely different from ours. We have to keep all this in mind. We cannot help following ordinary English usage, but we are confronted with a case where this usage, as attested by Webster or the short Oxford dictionary, turns out to be insufficient for the discussion of a serious philosophical or scholarly problem. Another such case is the word reason. In ordinary English, it denotes the capacity to draw valid
inferences from ascertained facts, whereas a different and more comprehensive notion of reason, generally used and understood by philosophers from antiquity to fairly recent times, called Nous in Greek and Vernunft in German and identified by Kant as the faculty of principles, is admittedly untranslatable into modern English and has disappeared not only from current usage, but also from contemporary philosophical thought, to the great detriment of all philosophical, scholarly, and scientific discourse. When I now try to talk about my background and upbringing to serve as an explanation of my later development and work as a scholar, I should like to emphasize that I consider inheritance and education, in agreement with most serious scholars in both the Eastern and Western worlds, as necessary, but not as sufficient causes of intellectual development—otherwise all per-
sons with the same family background and education would turn out to be the same, whereas in fact each individual person is different and in a sense unique. I was born in Berlin in 1905 into a well-to-do Jewish middle-class family,
and since my father died at the time of my birth, I was brought up by my mother, the daughter of a banker, and by my stepfather (the only father I knew), the director of a small factory. My parents had no higher education, though several of their relatives did, but they respected all cultural pursuits and made many sacrifices to further my education. My mother was interested in literature and art, visited museums and exhibitions, attended lectures and
theatre performances, read a good deal, and assembled a small but good library which was at my disposal. She also knew French, English, and some Italian, and I probably inherited from her a great facility for languages that turned out to be very helpful in my later work and career.
From ages six to nine, I attended a good public elementary school and learned in a short time reading and writing in two scripts, Gothic and Roman, as well as arithmetic and other simple skills. From 1914 to 1923, I attended a public classical school, the Mommsen-Gymnasium in Berlin, which offered nine years of Latin (eight hours a week), eight years of French (four hours a week), six years of Greek (six hours a week), a great deal of German composition and literature, some history and geography, a good deal of mathematics (including elementary calculus), some physics, chemistry, and biology, and a year or two of English (which turned out to be my fifth language). I had to do
a lot of homework, and I did not mind it, for I found the assignments interesting and challenging. I had some occasional difficulties and setbacks,
but I basically liked and enjoyed my school. In all the languages which I studied I learned grammar and parsing, instead of guessing the vague content
of long sentences. In mathematics I learned precise reasoning since we were supposed to understand and repeat all proofs, solutions, and constructions, not just to memorize them. I learned to compose clear papers consisting of an introduction and a conclusion and of several well distinguished parts of which the first did not presuppose the second but vice versa, and I still write that way. In addition to some German and French classics (including Shakespeare
in German), I read in school Virgil and Tacitus, Homer and Sophocles, authors who have remained my favorites throughout my later life. My teachers were for the most part well-trained and knowledgeable, and they included such scholars as Walther Kranz and Ernst Hoffmann, well-known among students of the classics and of ancient philosophy. I also had a lot of extracurricular activities. I was never interested in sports, but I did a lot of swimming and hiking, and later of mountain-climbing. I had private lessons in French and English conversation, and I learned a bit of Hebrew, Dutch, and Russian, languages which I failed to keep up. I later learned Italian, which is now one of my best languages, and a bit of Spanish. I was a voracious reader and began to buy and collect many books, especially of history and biography, and of German and French literature. I eagerly visited the Berlin museums and especially liked the older Italian, Flemish, and Dutch
masters (who were well-represented in these collections), and I saw many exhibitions of modern German and French art (since the seventeenth century Berlin was an outpost of French culture, in spite of the political and military conflicts between the two countries, a fact that seems to be widely unknown). On a number of vacation trips made with my parents or with my school and
later alone, I saw many parts of Germany, Austria, and Switzerland and enjoyed the landscape as well as the architectural monuments and art collections of many cities. I also wrote a lot of poetry and continued to do so long afterwards. The study and practice of music played a major role in my life. I began early to take piano lessons, frequently heard opera and concert performances, and continued to take advanced lessons until my university days. I played the works of all major classical composers from Bach to Chopin and Brahms and
also some modern German and French music, but I disliked some of the Romantics, especially Liszt and Wagner. My interest in philosophy developed early. In my literary readings, I came upon Ibsen’s play, The Emperor and the Galilaean, and although this is not one
of his best works, it was my first exposure to Neoplatonism, as I was to discover much later. I read Plato in school and learned from him to follow reason as far as it leads us, to look for principles rather than for specific
instances, and to distinguish clearly between knowledge and opinion. At home, I read more Plato, some Aristotle, and above all, a lot of Kant. By the time I graduated from school with honors in 1923, my mind was made up to
study philosophy and its history, and my parents accepted my decision, although they were disappointed in their hope of seeing me enter the family business, and also worried about my choosing a career unknown to them and offering no prospects for a well-paid position. After graduation, I went to study at Heidelberg. The university, apart from the beauty of the city and its surroundings, had a good reputation because of
the high quality of its faculty, and I was especially attracted by Ernst Hoffmann, my school teacher of Greek, who in the meantime had become a
professor of ancient philosophy at Heidelberg. Needless to say, for an eighteen-year-old youth, it was attractive to get away from home for the first time. All I had to do was to take a train, rent a furnished room, and sign up at the registrar’s office, showing my high school diploma. The tuition was modest because the university, as all others, was run by the government. I spent nine semesters at various universities: five at Heidelberg and the others at Berlin, Freiburg, and Marburg. At Heidelberg, my teachers in philosophy included Heinrich Rickert, whose interpretation of Kant never convinced me, but who developed a theory of the historical method that has great merit and
is not as well-known as it deserves; Karl Jaspers, who introduced me to Kierkegaard and to existentialism; and Ernst Hoffmann, who lectured on Plato and Aristotle and also on Plotinus. In Freiburg, I heard Richard Kroner, who introduced me to Hegel, and Edmund Husserl, the founder of phenomenology, who was not an interesting lecturer, but whose books I read with great profit. Encouraged by a fellow student, in 1926 I went to Marburg to hear Martin Heidegger. He was then working on Sein und Zeit, his masterpiece, and gave impressive lectures, as well as a seminar on historicism. I also came to know him personally, but I resisted the temptation to stay with him because he kept his doctoral students waiting for many years, and I returned to Heidelberg to write a thesis on Plotinus under Hoffmann. Aside from attending lectures and seminars, I spent a lot of time preparing seminar papers and reading a great variety of books that interested me or that I considered important for my studies. I read all major philosophers from the Presocratics to Husserl, including Nietzsche, whom I never appreciated.
My formal minors were mathematics and medieval history. I attended some lectures and seminars on higher mathematics, though with difficulties, and I read reference books on calculus; the theories of functions and numbers; and differential equations. I once wrote a seminar paper on a complicated
demonstration which occupied only a few printed pages and grew in my summary to a talk of half an hour because the author had presupposed and omitted many theorems and proofs which I had to supply from other sources. This showed me that I was not able to become a professional mathematician. But I learned once and for all that we should always concentrate on problems that we may hope to solve, and that in our attempts to prove something, we should always prefer a simple and “more elegant” proof to a complicated one. I also began to understand the link between the mathematical and philosophical doctrines of such thinkers as Descartes and Leibniz.
In medieval history, I heard Karl Hampe and Friedrich Baethgen in Heidelberg, and Stengel in Marburg, and thus I acquired valuable skills in diplomatics, chronology, and the source analysis of documents. Without the knowledge of medieval rhetoric which I acquired from Hampe, I should not have been able to recognize the links between medieval and Renaissance rhetoric which I described and developed many years later. I also studied without credit a number of other subjects that I found useful or interesting: German literature and philology, comparative linguistics, physics and psychology, church history, musicology and art history (including Far Eastern art, of which I have remained an admirer ever since). I defended my thesis on Plotinus in 1928 and published it in 1930. It contained only a fragment of what I had planned to write about Plotinus, and the published thesis is quite different from the one I presented for the defense. My hope of writing a more complete monograph on Plotinus was never fulfilled, because I was later occupied with other topics. I still consider Plotinus one of the greatest philosophers of all times, and I am convinced that his influence on later thought has been much greater than is usually recognized. After I had passed my doctoral defense rather successfully, my hopes for an academic career were suddenly disappointed, and I chose to study classical
philology in Berlin to improve my classical training for further work on ancient philosophy and also to prepare for a state board examination in Greek and Latin that would enable me to teach these subjects in a gymnasium. I took
courses with Werner Jaeger and Eduard Norden, and with Ulrich von Wilamowitz-Moellendorff, Friedrich Solmsen, Richard Walzer, and others. I learned textual criticism and palaeography from Paul Maas and classical linguistics from Wilhelm Schulze. I also read or reread most major classical authors (without translations, but with commentaries, where necessary), and I also learned to write (though not to speak) Latin. I learned from Jaeger to
pay attention to the literary genre of a text and to the patterns and topics
connected with that genre, and from both Jaeger and Norden the importance of ancient rhetoric and of its complicated relationship with philosophy. I also learned to study the history of philosophical terminology. All this proved to be very useful for my later work. During the same period, I revised my thesis on Plotinus, wrote several book reviews, a seminar thesis on Cicero’s inter-
pretation of Plato’s ideas, a paper on the origin and meaning of the term Orexis (desire) in philosophical literature, and a state board examination thesis on one of the speeches of Pericles in Thucydides. Having passed my state board examination with honors in 1931, I went to
see Heidegger in Freiburg, asked him whether he would sponsor me for an academic career (Habilitation), and proposed as my subject Marsilio Ficino, who interested me as the leading representative of another important period in the history of Platonism. Heidegger agreed, and I went to Freiburg to work on Ficino, while attending Heidegger’s lectures and seminars. I frequently saw him about my thesis, played the piano at his home (as I had done also in Marburg), and obtained the friendly support of several historians and classical scholars on the Freiburg faculty, including Eduard Fraenkel. In 1932, I received from the German Research Foundation a fellowship of a type reserved
for future university teachers. By 1933, I had finished the groundwork for a book on the philosophy of Ficino, and in the spring, I went to Italy to look at manuscripts and early editions of Ficino, especially in Rome and Florence. The results were greater than expected, for I found a number of unpublished works of Ficino, especially letters and early treatises, some of them unknown to previous scholars, and I planned to publish them in an appendix to my book. While I was in Italy in March and April 1933, I suddenly learned that the newly installed Nazi government had issued a law that excluded all persons of Jewish descent from academic positions and from a number of professions. I immediately realized that this was the end of my career in Germany, and that
it would be necessary for me to emigrate if I wanted to pursue my work. Instead of staying in Italy, I returned to Germany to settle my affairs, moved from Freiburg to the home of my parents in Berlin, and stayed there until early in 1934, writing nearly one half of my book on Ficino. I also corresponded with foreign scholars and institutions to prepare my emigration, and taught in a private school directed by Vera Lachmann. When some Italian scholars, including Giovanni Gentile, expressed interest in my work, I went to Rome in February 1934, met Gentile and others, did some work as a translator,
and also pursued extensive manuscript research in the Vatican and other Roman libraries. It was at this time that I became aware of the large number of potentially interesting works, not only of Ficino, but also of his friends and
correspondents, predecessors, contemporaries and followers, that were not only unpublished but even unknown to exist, and I began to collect descriptions and microfilms of these writings. In the fall of 1934, I moved from Rome to Florence, where I obtained a position as a teacher of Greek and Latin at a
private school for German refugee children, and also a modest post as an assistant lecturer of German at a branch of the University of Florence. I also pursued extensive manuscript research in Florentine libraries, and the results
were as rewarding as they had been in Rome. In the summer of 1935, I obtained, through the help of Gentile, the post of a lecturer of German at the Scuola Normale Superiore and at the University of Pisa, where I spent three fruitful years. The Scuola is a community of graduate and postgraduate fellows, selected on the basis of a national competition; its students are among
the best in Italy, and many of them later become college and university professors. Living among them and working with them, I made many friends and became a part of the Italian academic scene. Gentile had my Ficino texts published in two volumes as Supplementum Ficinianum (1937) and started to publish with me a series of unknown humanistic texts edited by students and
graduates of the Scuola Normale and by other scholars. The first volume appeared in 1939, and volume 19 is now in press. I also finished my book on Ficino, translated it into Italian, and made arrangements for its publication. In addition, I published several articles and book reviews in various Italian periodicals. At the same time, I traveled throughout Italy to explore all major
libraries for Renaissance texts, with the usual encouraging results. I was helped by a method learned from more experienced scholars, that is, I did not merely work from indexes, but systematically scanned all available printed
catalogues and unpublished inventories, and I thus came upon authors and texts which had not even been known to exist. These trips also gave me an opportunity to see a large number of Italian cities, their monuments and art collections as well as their libraries and archives, and as a tourist interested in art history, I learned to appreciate the variety of local schools and traditions, in architecture, sculpture, and painting as well as in the decorative arts and crafts, and to understand the political and cultural history of a town or region from its monuments, museums, and libraries. In the spring of 1938, Hitler visited Italy, and during his stay, all German refugees were either put into jail or had to report daily to the police. This was an omen of worse things to come. In the summer of 1938, the Fascist government issued a decree that excluded all persons of Jewish descent from academic and other public positions, and those who were not Italian citizens had to leave the country within six months. I immediately lost my position in Pisa
and had to make plans for a second emigration. I wrote to many colleagues
and friends, especially in England and the United States, and received a number of encouraging replies, yet they did not lead to a quick solution. Ludwig Bertalot, a German private scholar living in Rome, offered me a paid position as his research assistant, and from October 1938 to January 1939 I
worked for him in Rome, mainly on a volume of the catalogue of Vatican manuscripts which he was preparing, and at the same time, I continued my own work. In January of 1939, I received an invitation from Yale University, where I had several American and German friends, to come there as a teaching fellow and to give a graduate seminar on Plotinus. I left Italy, where everybody whom I knew behaved with the greatest kindness, and arrived in New
Haven in February, where I was cordially received by many old and new friends, especially by Herman Weigand and Roland Bainton, who obtained for me the hospitality of the Yale Divinity School. I had received a non-quota immigration visa on the basis of my invitation, and I enjoyed the opportunity to continue my work as a teacher, although my spoken English was at first
quite halting and improved but gradually. I also learned that my position would not last for more than a term, and thus I had to look for another job for the fall of 1939. Fortunately, I received several lecture invitations upon my arrival, and I visited many scholars at various universities whom I had met before or to whom I was introduced. I also was able to get one of my lectures
published right away. I finally was offered a position at Columbia as an associate in philosophy for one year, with a modest salary, part of which was paid by the Carl Schurz Foundation. In June 1940 I married Edith Lewinnek,
whom I had first met in Germany and who has been my companion and adviser ever since, while pursuing her own career as a specialist in rehabilitation medicine and as a faculty member of the New York University Medical School. It turned out that I was to spend the rest of my life at Columbia. For nine years I remained as an associate with an annual contract, then I was made an associate professor with tenure in 1948, a professor in 1956, a Woodbridge Professor of Philosophy in 1968, and retired in 1973. The university was my home before I became a citizen (in 1945), and over the years I have had close
relationships with many colleagues at Columbia and other institutions, not only in philosophy, but also in history and classics; in Italian, French, German, Spanish, and English literature; in Oriental studies and in bibliography; in religion, musicology, and art history; and also in political science and sociology. I have learned a lot from my colleagues, especially about Aristotelianism from John H. Randall and Ernest Moody, and about the history
of science from Lynn Thorndike, and I greatly profited from the rich resources of our library and from the competent help of its staff. I regularly taught lecture courses on late ancient philosophy and on Renaissance philosophy, mostly for graduate students in different departments, and also seminars,
often jointly with Randall and others, on some of the major philosophers from Plato to Hegel. I also taught a course on research techniques, and was involved in a large number of dissertations in many departments. Since 1945, I have taken an active part in the University Seminar on the Renaissance and in other faculty seminars at Columbia. I have lectured frequently and widely at
many colleges and universities both in this country and in Europe. For my major scholarly projects, I received support from various research councils at Columbia, from the American Philosophical Society (of which I have been a member since 1974), from the Bollingen Foundation, the American Council of Learned Societies and the National Endowment for the Humanities, as well as from the John Simon Guggenheim Foundation, the MacArthur Foundation, the Renaissance Society of America, the Institute for Advanced Study in Princeton, and the Warburg Institute in London. I served as a member of the Committee on Renaissance Studies of the ACLS, and it was with its help and support that I was able to organize, with the help of several colleagues, the Catalogue of Latin Translations and Commentaries, a cooperative international project, of which volume VII is now in press. I also published, with the support of the Warburg Institute and of several foundations, my finding list
of Renaissance manuscripts (Iter Italicum), of which volume V is now in press. This work is based on the notes I collected in Italy before the war and also on the results of many more recent trips through Europe and the United States. As a by-product of this work, I published in several editions a bibliography of printed catalogues and unpublished inventories of Latin manuscript books. I have been active in the Renaissance Society of America from the time it was founded in 1954, and also in the Medieval Academy of America. This latter association has helped me to understand the criticism directed by many medi-
evalists against some conventional views of the Renaissance and to redefine
the Renaissance and its humanism in a way that takes this criticism into account. I have also done much reviewing, especially for the Journal of Philosophy, and much editorial work, especially for the Journal of the History of Ideas, the Journal of the History of Philosophy, and for Renaissance Quarterly.
My book on the philosophy of Ficino appeared in English in 1943, the somewhat better Italian version in 1953, and the German original only in 1972.
Most of my other books are collections of lectures or articles, composed on
different occasions and for different audiences. Yet I always pursued topics that interested me, and not those that others asked me to investigate. I never
hesitated to intrude on neighboring fields, when pertinent, and I never minded hiding some of my best remarks in footnotes or digressions or in the middle of a paragraph, and thus most of them have been predictably overlooked. I did not learn early enough that to make an idea stick you have to put it into the title and conclusion or abstract of a paper, never mind what the rest of the article contains. Yet I did present my views on some major Renaissance
thinkers in a series of lectures delivered and published at Stanford, and a similar series on Hellenistic philosophy was recently given at Pisa. My studies on Ficino emphasize his metaphysics and its original as well as its ancient,
patristic, medieval, and early humanist elements. I hope I have helped to clarify the significance of Renaissance humanism, showing from its literary production and from its professional activities that it was centered in the study of grammar, rhetoric, poetry, history, and moral philosophy, and not in logic, natural philosophy, or metaphysics. Humanism did not destroy Aristotelian scholasticism, as often asserted, but coexisted with it through the sixteenth century and beyond. On the other hand, Renaissance humanism had close connections with medieval grammar and rhetoric. I called attention to several neglected texts of medieval rhetoric, and to the fact that medieval rhetoric was not limited to sermons and letters, as often claimed, but included a sizable body of secular oratory, especially in Italy with its city republics. I also showed that the school of Salerno, often presented as purely practical, made a notable contribution to theoretical and scholastic medicine from the twelfth century on, and also initiated that alliance of philosophy with medi-
cine (rather than with theology) that was to remain characteristic of the Italian tradition down to the seventeenth century. I also showed that Aristotelian scholasticism as taught at the University of Bologna had a document-
able influence on the early Tuscan poets of Dante’s generation, and that humanists, contrary to a widespread belief, did not oppose or try to abolish the vernacular, but actually cultivated and promoted it as part of a bilingual culture that is characteristic of both the Middle Ages and the Renaissance. I also emphasized that for a proper understanding of the various philosophers and other authors, we should not only study their writings, but also their life and professional activities, the curriculum of the schools and universities where they studied and taught, the place their works occupied within the contemporary classifications of the arts and sciences, the traditions of the literary genres to which their various writings belong, and finally the meaning, sources, and origins of the technical terms used by them.
I have attempted throughout my life to pursue scholarship for its own sake, not for the pursuit of personal or political goals, and I survived two emigrations because the international academic and scholarly community had a sense of solidarity and of objective standards of work. Anybody who had something to contribute was welcome, and I found friends and supporters in two foreign
countries because they liked my skills and my knowledge, not my face or gender, race or religion, national background or political opinions. My career
was based on my work as a scholar and teacher, which I pursued with the support of my university, several other institutions, learned societies, and foundations, including the ACLS. The humanities, as I understand them, were then an integral part of the curriculum of all colleges and even of the better secondary schools. A graduate student could expect to teach in college even before he got his degree. The colleges gave a general cultural background
to the majority of their students who went into business or the various professions, and a solid preparation, including foreign languages, to the future graduate student who thus acquired a familiarity with the general context of his field before he began the more specialized research for his dissertation. Moreover, there was at least some tolerance among the various disciplines, and each specialist was willing to respect the contributions made by other fields or even by other schools within his field. Also, the general public and the news media had some interest in, and a moderate respect for, the representatives of scholarship. This situation has profoundly changed for the worse since the 1960s. The general public and the news media show complete indifference or even contempt for scholarship, except when a sensation is involved, as in the case of the Leonardo manuscripts discovered in Madrid in 1967, which were widely and also wrongly reported; or when a subject is related to entertainment, tourism, or financial investment, as is the case with the history of music and the theatre,
with archaeology and art history, or with the study of manuscripts and rare books. The admission and graduation requirements of the secondary schools have been constantly lowered and have almost disappeared, and it is now quite common for students to graduate from high school without having learned the most elementary skills of literacy or arithmetic. More recently, the colleges, and even many of the most renowned colleges, have admitted students who are in need of instruction in remedial English and composition. A diploma from a high school or even from a college is considered as a mere entrance ticket for a better paying job, and nobody seems to understand any longer that such a diploma should be earned by acquiring a certain amount of knowledge and skills. The graduate schools have thus far managed to main-
tain their standards, but it remains to be seen how long they will be able to do
so if the preparation of their incoming students continues to deteriorate. Within academia, the natural sciences have been able to maintain their standards, thanks to their prestige and the acknowledged usefulness of their technological applications, though some of their less useful fields, such as pure mathematics, theoretical physics, or taxonomic biology, are receiving less support than they deserve. The social sciences, which have made excellent contributions based on historical scholarship, have to some extent been dehistoricized, tended to base their findings on the dubious evidence of statistics, questionnaires, or opinion polls, and to disregard the fundamental distinction between facts and goals or values. They have made ambitious claims in defining, predicting, and solving social and political problems, and these claims, though they have been more often wrong than right, have been taken at their face value, not only by the general public, but also by many philosophers and other academics, and have contributed to the public and academic decline in the support of the humanities. Finally, many representatives of the humanities have changed their emphasis toward contemporary history and literature, topics that should not be
neglected, to be sure, but that often do not require the rigor of detailed research, as it is needed for the study of earlier periods. Faddist theories based on sensational claims rather than on solid evidence are widely acclaimed, and
the advocacy of political, ideological, or religious causes is brazenly proclaimed to be a substitute for evidence. A widespread antihistorical bias has put historical studies on the defensive, and in the academic power game, faddists and ideologists are often preferred to serious younger scholars, not to speak of the steady loss of teaching positions in subjects that are no longer required or considered useful or interesting. It has been noticed in more than one area that the number of persons of any age, properly trained and compe-
tent to deal with certain specialties, amounts to less than a handful in the entire country. We have witnessed what amounts to a cultural revolution, comparable to the one in China if not worse, and whereas the Chinese have to some extent overcome their cultural revolution, I see many signs that ours is getting worse
all the time, and no indication that it will be overcome in the foreseeable future. One sign of our situation is the low level of our public and even of our academic discussions. The frequent disregard of facts or evidence, of rational discourse and arguments, and even of consistency, is appalling, and name-calling is often used as a substitute for a reasonable discussion. Every interest group demands immediate action in favor of its own goals, and easily resorts
to noisy demonstrations or even to violence. What we need is a careful examination of all pertinent facts and arguments, followed by a rational decision
that may be a fair compromise between the groups and interests involved. Instead of recognizing that along with many problems that we cannot solve (at least at the present) there is a solid core of knowledge to which we should hold on and which should set a limit to our arbitrary thoughts and actions, we encounter a pervasive kind of scepticism or relativism which claims that any opinion is as good or justified as any other. Every statement made before the last five years or before the latest fad is considered hopelessly antiquated, and traditional scholarship has become a term of opprobrium. I hold on to the view that our scholarship is cumulative, though not static, that many thoughts and opinions of the distant past may have some interest or validity for the present
and future, and that there is a large body of important subjects that has not even been mentioned in the literature of the last five years. Our relativists proclaim that words have no fixed meaning and that we are free to decide what each word in a past writer must have meant (like Humpty Dumpty, who decides by himself that a word means just what he chooses it to mean). I gladly admit that words change their meanings in the course of time and that old words disappear while new ones are coined. Yet we can and must rely on a firm tradition of lexicography in the various languages concerned, including the large Oxford English Dictionary, that tells us what a given word meant at a given time, in a given context, and to a given writer. We are thus able to refute obvious blunders by the straight evidence of original texts and documents and with the tested methods of historical and philological scholarship. Interpretations based only on translations or secondary literature should be rejected outright unless they are confirmed by original texts. While sticking
to problems that we can solve, we should reject the claim that all solvable problems are trivial and that all important problems are insolvable except by speculation. The world of history and philosophy is a puzzle that can be solved but slowly, by constantly adding new questions and answers and thus modifying the earlier ones. It may be argued that religious faith transcends the limits of secular knowledge, but within the range of human knowledge, the plain facts ascertained by experience and reason cannot be contradicted by an
appeal to conventional or fashionable opinions that claim to be true in a higher sense. I do not know what the future will bring, and my expectations are rather grim, not only for our education and scholarship, but also for our economic, legal, and political future. Like Cassandra, I hope I shall be wrong.
I am holding on to the methods and convictions that I have followed all my
life, and nothing that has recently been proposed to replace them has convinced me of its validity. I wish the best of luck to the many younger scholars who follow our methods and ideals and who will invariably correct our results and
make new additions to our knowledge. They are likely to have a harder struggle than we did in getting recognized in a world that has become basically hostile to scholarship and to learning. I hope I am speaking also on their behalf, since they cannot afford to say what they think. I also hope that
the ACLS will use its influence, along with that of the NEH, to support humanistic scholarship and research, to help it defend its place in the colleges and universities, and also in the schools and in the public world, and perhaps to regain some of the ground that it has lost in recent years. I have no illusions about the good old days (they had their own troubles), and I am aware of the limits imposed on philosophy, science, and scholarship in this immense world which is complex and irrational and to a large extent (contrary to what many people believe) removed from our control and manipulation. I sometimes wonder whether my life in many of its phases (as the
lives of many other human beings) has been like that of the rider on Lake Constance who barely reached the shore before the ice he had crossed melted behind his back. Yet I like to justify my fascination with the past, and especially with the
history of philosophy and of learning, by stating my belief that the past remains real even after it has disappeared from the scene. It is the task of the
historian to keep it alive, and to do justice also to the defeated and to the neglected, at least to the extent to which they deserve to be remembered.
1991 MILTON BABBITT William Shubael Conant Professor Emeritus Princeton University
I am grateful and flattered to have had my talk . . . included under the ongoing rubric of “A Life of Learning,” but in all accuracy and necessary realism I must be permitted the protective sub-rubric of “A Composer of a Certain Age,” for how might a composer justify his presence before learned representatives of learned bodies, when the very term learned has appeared and disappeared in the history of music only in the most apologetic and fugitive of roles, in such expressions as learned writing or—more specifically—
learned counterpoint, usually with the intimation of the anachronistic, the factitious, and—even—the jejune? There does appear to have been a fleeting moment or so in eighteenth-century France when the term learned was invoked to characterize a “taste” distinguished from the “general.” Apparently, compositions were deemed to be “learned” if it was thought that their understanding demanded some musical knowledge. But this elitist distinction did not, could not, survive the guillotine, and never was to be reheaded, certainly not with the subsequent and continuing triumph of what Goodman has called the Tingle-Immersion theory, which—when applied to music—demands that music be anyone’s anodyne, a non-habit-forming nepenthe. I could dig even deeper historically and dare to remind you that, in the medieval curriculum, music was a member of the quadrivium, but that curriculum, like so many demanding curricula after it, has long since been banished.
And, in any case, consider the company that music kept in the quadrivium: arithmetic, geometry, and astronomy. If that curriculum had survived, music would be burdened further with guilt by association, because—for reasons apparently more sociological than methodological—there is no characterization
that guarantees music more immediate, automatic, and ultimate derogation and dismissal than mathematical, thereby joining learned and, above all, academic.
But it is as academics that we join here. . . . I trust it does not come as a surprise, or as an unpleasant embarrassment, or as further evidence of the Greshamization of the university, to learn that there are composers in your very midst on your faculties. Apparently there still are those who remain unaware of our presence, and even more who are unaware of the significance and causes of our presence. But there is no more consequential evidence of the
intellectual, institutional reorientation of musical composition in our time and country than the fact that the overwhelming majority of our composers are university trained and/or university teachers, and that—for this and other reasons—the university has become, awarely or unawarely, directly and indirectly, the patron of and haven for not just composers, but for music in all of its serious manifestations. This state of affairs began at that crucial moment for music in this country in the mid-1930s, was interrupted by World War II, and accelerated and spread after that war.
There were isolated spots of enlightenment much earlier, but the fate of Edward MacDowell at Columbia University early in the twentieth century was a more characteristic symptom of the state of music in the academic community. MacDowell, having recently returned from musical training in Europe—the customary journey of the American composer at that time—and hardly a wild radical, either musically or otherwise, was determined to—in his own words—“teach music scientifically and technically with a view to teaching musicians who shall be competent to teach and compose.” But the new
president of Columbia University, Nicholas Murray Butler, who—in this regard at least—was slightly ahead of his time, set a precedent for future administrative attitudes toward music in the university, by opposing MacDowell, proposing instead what MacDowell described as a “coeducational department store” at Teachers College. Butler triumphed; MacDowell resigned. That was three decades before the cataclysm which carried the transformation of thinking in and about music in that sudden reversal of its former path between Europe and this country and, not entirely coincidentally, carried me
to the chief port-of-call at the end of that journey, making my musical academic life chronologically co-extensive with that decisively new musical era and the subsequent, almost immediate change in the role of the composer in academic society. For I, in very early 1934, transferred to New York University’s Washington
Square College because of a book, a book that by current standards would appear to be a modest makeshift of a book, but it was the first book written in
this country on twentieth-century music, and—indeed—it was entitled Twentieth-Century Music. The author was Marion Bauer, an American composer born in Walla Walla who had studied in France and returned to teach here and—it must be admitted—to collect and assemble snippets of musical journalism and other trifles into a book. But that book, published in 1933, displayed tantalizing musical examples from Schoenberg’s Erwartung and Pierrot Lunaire, Krenek’s piano music, late Scriabin, Casella, and other music performed little or never in this country, and difficult to obtain for study. In the book, unknown names were dropped in droves. And so a young Missis-
sippian, whose curiosity and appetite for contemporary music had been aroused by summer visits to his mother’s home city of Philadelphia, decided that if the works discussed by Marion Bauer were, as she strongly suggested, music to be reckoned with, then music’s day of reckoning must be at hand, and he wished to be there. There were other stimulations at the Washington Square College by 1934 beyond Marion Bauer’s enthusiasms and the music itself; there were Sidney Hook, William Troy, and the early James Burnham. Burnham and Wheelwright’s Philosophical Analysis had just appeared, and the periodical Symposium was being published. But, overshadowing all of that for a young composer/student, just a bit over three months before I arrived in New York, Arnold Schoenberg had arrived, from Berlin, by way of Paris, to teach in Boston, but soon to live in New York. Schoenberg was one of the first to do many things, including landing on these shores, but soon to be followed by Krenek, Hindemith, Stravinsky, Milhaud, Bartók, and others less celebrated—Rathaus, Schloss, Pisk, Wolpe—yet all of whom had contributed to that intricately tessellated territory which contemporary music had become over the preced-
ing quarter of a century. They were not all at or straining at the various and varied musical frontiers; there were even—among them—prelapsarians
(Hugo Kauder, for instance), who believed that contemporary music had gone wrong when it had gone anywhere. But almost all of these composers became college and university teachers, whereas in Europe they had taught, if at all, only in conservatories. And just that suddenly and summarily the complexly convoluted path of contemporary musical creation crossed the ocean and critically transformed our musical environment at a crucial moment in
music. The once musical innocents abroad now became the hosts to and custodians of a host of traditions, old and new. There were, on both sides of the engagement, the unavoidable shocks of new cognitions, the awareness of
the effects of deeply different informal and formal conditioning: the European musicians had heard and been shaped by what we could not hear, but had not learned in their vocational schools what we had in our universities, both in music and beyond. When I graduated from college in 1935, I chose to remain in New York, and
to study composition privately with Roger Sessions, who, though a product of American universities, had returned only recently to this country after some eight years in Europe. His compositions, here regarded as complex and even forbidding, were actually a skilled and sophisticated but highly personal prod-
uct of European compositional attitudes and thought. He had written about both Schoenberg and Schenker, who are to concern me here and already concerned me then, and also of European “music in crisis,” a crisis that he hoped to see and hear resolved in this country. Soon thereafter, Sessions began teaching at Princeton University, where I joined him on the faculty in 1938. But even during those three years of intensive private study with him, the powerful presence of Arnold Schoenberg, or to be more accurate, of Schoenberg’s music affected, even directed me, as it did many others, some in very different directions, for all that the music still was seldom heard, and Schoen-
berg himself had emigrated across the continent to California. When Schoenberg had arrived in New York, he embodied—far more than any other composer—within his creative achievement the revolutionary road that music had taken. It is too easy to say, albeit with some slight accuracy, that he was a reluctant revolutionary, a revolutionary in spite of himself, but not—surely—in spite of his music. The designation revolutionary may smack of hyperbole, even of hype; it may suggest music’s presuming to reflect the glamour of such entrenched expressions as “the revolution in physics,” “the revolution in philosophy,” but while eager to avoid any intimation of that undisciplined, interdisciplinary dilettantism which has so bedeviled music, I can find no evidence that any other field has undergone more fundamental and pervasive a conceptual transformation so affecting the field’s practitioners’ relation to their field, or to the world outside the practice. There are even those who locate the first shot of the revolution as the last movement of Schoenberg’s Second String Quartet of 1908, and even suggest that Schoenberg
himself did, in the words of the soprano in that movement: “I feel the air of another planet,” for all that the words were those of Stefan George. After all, Schoenberg selected them. The works that followed, many of them now familiar, include the Five Pieces for Orchestra, Erwartung, Pierrot Lunaire, and they and a few yet to follow soon were termed atonal, by I know not whom, and I prefer not to
know, for in no sense does the term make sense. Not only does the music employ tones, but it employs precisely the same tones, the same physical materials, that music had employed for some two centuries. In all generosity, atonal may have been intended as a mildly analytically derived term to suggest atonic or to signify a-triadic tonality, but, even so, there were infinitely many
things the music was not; what it was is better described by such terms as automorphic, contextual, self-referential, and others, all agreeing on a character-
ization of the music so context-dependent as to be highly sensitive to its statement of its initial conditions, and defining its modes of relation and progression within itself, that is, within each composition. Later, Schoenberg described his procedures of that period as “composing with tones” and “com-
posing with the tones of a motive,” which are not equivalent characterizations, the first suggesting as a referential norm a pitch-class collection, the second a registral and temporal instantiation of such a collection, but both confirmed the notion of the highly autonomous nature of the individual compositions’ structure, and both placed the composer in the position that an idea for a piece was, necessarily, the idea of a piece. Almost immediately after the appearance of Schoenberg’s Second Quartet, his Viennese students Webern and Berg created works sharing only the property of being comparably self-contained, and soon compositions by those not of this inner circle began to appear. The “paradigm shift” was on. I dare to
employ this expression, not to give my once colleague Tom Kuhn an unneeded plug, or to demonstrate that music is or was “with it,” but because the concept is, at least, suggestive in describing the subtle effect of Schoenberg’s new music. Almost immediately, there was the attempt to patch the old paradigm by attempting to describe, to “understand” the new, unfamiliar in terms of the old, familiar. But the result was only to create a picture of an incoherent, unsatisfactory familiar, inducing the normative conclusion either that this music was “nonsense,” or required a different construal. Here we are talking of discourse about the music, a theory in some sense, and I intend to return to some of the senses of music theory. But for a time, neither the music nor the observations of the music had any other widespread effect than that of puzzled wonderment or bitter antagonism. There appears to have been little or no effect on composers in this country, but Schoenberg himself was critically affected by this music, his music, which still remains in many respects fascinatingly refractory. For, at about the age of forty, this composer not only of those “problematical” but of such “traditional” early works as Verklarte Nacht, Gurrelieder, and Pelleas und Melisande suffered nearly a decade during which no considerable work was completed. Later, he said of those works imme-
diately preceding that hiatus, that he felt that he (and Webern and Berg) could not produce, by those compositional means, works of “sufficient length or complexity.” The term complexity is a particularly startling one here, if one thinks in terms of quantitative complexity, for surely, few works have as many
notes per square inch or elapsed second as the fourth and fifth of his Five Orchestral Pieces, only for example. It is clear that he was referring to that kind of structural complexity, that relational richness which tonal music manifested
in its capacity for successive subsumption, cumulative containment which musical memory demands if a work eventually is to be apprehended, entified as a unified totality, as an “all of a piece of music.” One must infer that Schoenberg failed to find such structural “complexity,” such a realization of his version of musical concinnity in compositions which, for all of their fresh and fascinating local linkages, novel rhythmic and instrumental modes, asso-
ciative harmonic structures, could not achieve such a realized unity. For a silent decade, then, he proceeded to pursue, not by word, but by musical deed, a new synthesis, a truly new conception of musical structure. At this point, perhaps I should confess that—whereas I was contracted to offer an aspect of my autobiography here—I appear to be presenting Schoenberg’s biography. But I am offering my highly autobiographical version of his biography, and without at least such a brief overview of those unprecedented developments, my own activity would appear and sound in vacuo, in a quarantined region. What Schoenberg’s works, beginning in the mid-1920s, and Webern’s and Berg’s soon thereafter, instantiated was a conception of musical structure that altered fundamentally the hierarchical positions of the primitive musical di-
mensions, beginning with the primary realization that music proceeds in time, an observation made by even so non-professional a musical analyst as
T. S. Eliot. The works that displayed such features of organization were commonly, all too commonly, termed twelve tone, or compositions in the twelve tone system. Schoenberg particularly objected to the term system since it con-
noted for him, with his rather special view of the English language, imperatives and prescriptives, as would be associated with such expressions as a system for winning, or losing, at roulette. And he did describe the conception
far better himself as “composing with twelve tones related only to one another,” or, as amended and extended by picky Americans: “composing with
pitch classes related to one another only by the series of which they are members.” Observe that the autonomous, inceptually context dependent features of those so-called atonal works are preserved, but the shared characteristics are now embodied in the word series, thus serial. For this shared mode of
pitch class formation is indeed a serial relation: irreflexive, asymmetric, and
transitive, and its compositional interpretation is usually and primarily, although not exclusively, temporal. Our colleague Leibnitz once asserted that “time is order”; from this I promise not to commit the illicit derivation that “order is time,” but most often it is so interpreted in the twelve tone case. But
music also presents order in space, and it is in these representations of the series, transformed by interval preserving operations, that the new communality resides, for Schoenberg was passionately attempting to restore a common practice, but a new common practice, in order to regain, for the composer and the listener, that interplay of the communal and the proprium, of the shared and the singular, with the attendant consequences of contingency and dependency of progression susceptible to inter-compositional regularity. When Schoenberg arrived in New York, his name was far better known than his music because we had no opportunity to hear his later works, and we were able to study only a few of his later scores, one of which, a piano work written just two years before his arrival here, had just been published, not in Austria, but in California, by the New Music Edition. It was customary for this publication to include a biographical and program note with each work, but in the case of Schoenberg’s composition the editor wrote: “Arnold Schoenberg has requested that we do not publish either biographical notes or musical explanations concerning his work, since both he and his musical viewpoint are
well-known.” Although Schoenberg remained in New York only a few months, that was certainly sufficient time for him to discover that what was well-known as “his viewpoint” was derived not from a knowledge of his music
or even his words, but from misapprehensions derived from a tradition of absurdities, originated and propagated by newspapers, magazines, and textbooks. The few of you who can recall and the more of you who are aware, I hope, of the climate of those times, the mid-1930s, will not be surprised to learn of the grotesque ideological turns taken by discussions of the so-called “twelve tone system” by concerned observers. “Was it or was it not ‘democratic?’” After all, because all twelve pitch classes were permitted and included in the series, the referential norm of such a work, the self-declared champions announced that, therefore, “all the notes were created free and equal,” “one note, one vote”; but there were those who demurred and declared the music, the “system” fascistic, because it imposed an “order,” and each work imposed “a new order” upon the pitch classes. This latter compares well in intellectual sophistication with that pronouncement of a celebrated French intellectual that language is fascistic, because it contains “subjects,” “subordinate clauses,”
and the like; and for those of you concerned with cultural lag, a Dutch
composer recently revealed that serialism is socialistic, on the basis of the same
old equivocation. It is a particularly distasteful reminder that in those countries that proclaimed themselves “socialist,” music they labeled—accurately or
otherwise—serial, atonal, or twelve tone was denounced and banned as “bourgeois modernism,” “imperialist formalism,” or . . . “degenerate Jewish music.” And those concerned with vocational lag might care to know that the more serious, or—at least—more pretentious misunderstandings and misrepresentations, offered in the form of putative “rules,” prescriptives, permissives, often accompanied by that most decisive term of dismissal, mathematical (“twelve tone” contains that recondite mathematical term twelve, for all
that these twelve “tones” are the same ones that had been employed by composers since the time of Bach) persist until today in otherwise conscientious periodicals whose primary fields are literary, or political, or cultural. My concern is less that such misleading assertions have been and are being made than that they reflect how some apparently attempt to hear this music and
misguide others in their hearing, understanding, and experiencing of the music. So, if you happen to encounter a reference to “Schoenberg’s twelve tone scale,” immediately cast the offending document into the Humean flames.
Given this congeries of conditions, one could not have expected a large audience to gather in 1937 when the Kolisch String Quartet, transplanted from
Europe, presented Schoenberg’s latest work in its first New York performance. In a small, noisy room in the 42nd Street Library, the remarkable Fourth String Quartet was played. It was an extraordinary example of the profoundly new means and innovative ways of twelve tone composition, where the range and reach of reference they made available, the richness of relatedness they made obtainable were revealed as decisively as the implications and intimations for extension to other personal realizations, to satisfy other composers’ musical dispositions. There was no issue here of replacing or displacing “tonal” music, or of teaching old notes new tricks, but of creating another music, whose compositional instances already were and were to become even more distinguished and distinct, not just on the surface, but well beneath the surface. When I began teaching at Princeton in 1938 there was little academic or pedagogical reason to flaunt my dodecaphonic involvement. The music department was new, and—strictly speaking—was not even a department but a section of the art and architecture department, and I did not wish to burden
its beleaguered chairman with the presence of one who would have been certain to be viewed as a musical recusant, particularly by those many mem-
bers of the academic community and their wives who made no effort to mute their claim to musical authority. Even so, the time came when I gave them occasion to give vent to their offended aesthetic. An innocent little string trio of mine was performed on a concert sponsored by the section of music. Well, not exactly performed: it was a three-movement work, and the three members of a fairly well-known string quartet, also recently transplanted and no true believers in the abilities of an unknown American composer, decided—first—not to play the first movement, and—second—not to play the third movement, leaving a lonely little slow movement. But such were the times and place that the modest movement created some embarrassment for my chairman, who now was revealed as harboring a no longer latent musical anarchist. So, in an attempt to demonstrate my possession of other than deviant capacities, I wrote a post-Regerian work for a cappella chorus, entitled Music for the
Mass, a setting of sections of the Ordinary of the Latin mass, which may explain partially why, in a recent volume entitled Serenading the American Eagle, the author refers to that work as Music for the Masses, in pursuance of his
thesis that in those times, no one—not even I—was above or below pursuing proletarian politics, and this when I was attempting merely to be academically politic. And Music for the Mass was awarded a prize by what would have been considered a very conservatively inclined Columbia University panel, long after MacDowell. In my few years of teaching between my beginnings and the considerable interruption by World War II, instructing in one musical syntax and composing in yet another one was less schizophrenic than beneficially—dare I say it—
symbiotic. The necessary examination and self-examination attending a venture into a new and largely untested domain, where still few composers had ventured, induced reconsiderations of aspects of music and their associated terminological categories as they had figured in traditional music and theory, where terms had been allowed glibly and uncritically to slip through and slide about in a swamp of ambiguity. For instance: “register,” “pitch class and pitch member of such a class”; properties which had been treated as independent
primitives proved to be derivable, and the compositional and perceptual susceptibilities to structuring of the four notationally independent musical dimensions: pitch, temporal, dynamic, timbral, each subject to different scalings, one of the unique and rich resources of fully conceived musical composition, demanded thorough reexamination. And there was another powerful influence on our thinking, our rethinking
about the music of the past, an influence that landed and settled in this country at about the same time as Schoenberg’s, further affirming the United
States as a musical melting and even melding pot at an unprecedented level of
both theoreticity and musical actuality. I can best broach the subject anecdotally. The pianist and composer, Eduard Steuermann, who had been closely
associated with Schoenberg in Europe as the pianist in many of the first performances of his music, settled in New York in 1939 and soon became a valuable friend. One evening, with that characteristic timidity whenever he spoke of his new country, he finally dared to say to me: “This is surely a strangely remarkable country. Back in Vienna there was this funny little man who haunted the back streets exposing his analytical graphs, which no one
understood. Webern said he understood them, but everyone knew that Webern didn’t. Now, here he is a household name.” The “funny little man” was Heinrich Schenker, and the not entirely objective, mildly depreciatory characterization of him reflected the disjunction between the musical worlds inhabited by Steuermann, and those by Schenker and his students and disciples. That Schenker was a “household name” in this country was an exaggeration, but in some New York music circles he had become already an exalted name, as in Schoenberg’s case, known far more by name than by the content of his accomplishment. His writings, covering some three decades of evolving activity, were as little understood and as difficult to obtain as Schoenberg’s music, and all were only in German, but he—too—soon was represented here by those who knew his work by having studied with him or with his pupils. The ideological antagonisms that separated the composers of the Schoenberg circle from the theorists of the Schenker circle were not imported to this country. For example, Roger Sessions, who was surely a contemporary composer, wrote a searching article on Schenker as well as on the more speculative
writings of the composers Krenek and Hindemith, and all of the articles appeared in a magazine named Modern Music, for all that for Schenker, music (or, at least or at best, great music) ended with Brahms, and he had dedicated his early, but already penetrating analysis of the Beethoven Ninth Symphony to
“the last master of German composition,” which meant—for Schenker—all composition, and that last master composer was Brahms. Schenker never altered this judgment, for all that he lived and worked for another twenty-three years. I have lived to see Schenker’s analytical method change its status from the heretic to the nearly hieratic, from the revolutionary to the received. Here the notion of a paradigm shift is pertinent, for Schenker analysis has largely displaced, replaced, and subsumed analytical theories of the past. From Steuermann’s reference to a “graph” one might assume mistakenly some quasi-mathematical procedure, but it was nothing of the sort; it was an
explanatory theory, the tracing of the pitch progression of a total work
through successively more extensive and imbedded, but generatively parallel, structural levels. For me, it was, and is, among other of its achievements, the most powerful hypothesis as to the performance of musical memory, how an appropriately equipped listener perceives, conceptualizes a triadically tonal work. Previous theories, which had been the basis of compositional instruction from the time of Bach, have consisted mainly in the form of rules abstracted from past practice in the small, in the very local, often with the added fillip of compounding generality with causality. Then would come that enormous leap to those few context-free patterns of dimensionally synchronous repetitions which were taken normatively to define musical “form.” There was no such abruption from the detail to the global in Schenker’s analytical theory. Its manifest explanatory scope and repleteness; its entailing of compositional constancies that were not revealed by other theories; its providing a framework for yet further insights not explicitly discerned by the theory; all these attributes made its eventual influence irresistible. Never before had there been even such an attempt, and therefore no such achievement. The later and continuing mountain of literature, mainly in this country, spawned by Schenker’s thought includes its applications to other compositions, its further methodological explication and refinement, revisionism, demurrers, concerns with a concealed derivation of the “should” from the “was” as post diction become dictum, as Schenker concentrated his analyses upon the few composers who constituted his pantheon, in yet but another instance of the Viennese “genius mystique.” His evaluatives never are coherently stated or even clearly inferable, nor are the bases from which he derived the prediction that no further great music could be written, with which he dismissed even the aspirations of those who shared his ideological appetites. Although today there is scarcely an elementary text that does not attempt to pay lip service, at least, to Schenker analysis (a method largely unknown half a century ago), the first generation of Schenker specialists entered the academic mainstream only slowly and against more opposition than did the composers from abroad. I was in the happy position of meeting and learning from Oswald Jonas, who was a private student of Schenker and the author of the first book expounding his method, and Ernst Oster, Jonas’ student and subsequently underground guru for many celebrated virtuosi who wished to conceal their intellectual aspirations. I remained close to both Jonas and Oster until their deaths by never discussing music written after 1897. This chronological disjunction between the music with which Schenker analysis was concerned, and the music (and soon, the musics) of post-1909 Schoenberg and others to follow did not conceal Schenker and Schoenberg’s
cultural affinities. They both sought ties to bind them to the past by convincing themselves that they only minimally mutilated that past: Schenker by invoking the theories of the eighteenth century as his true predecessors, and
Schoenberg by identifying himself with tradition by identifying tradition with himself. After all, Schenker and Schoenberg were both of Vienna, of a sort, in a competitive embrace with its past. So, when Schenker, in 1926, wrote
to Hindemith: “You would do better to have the courage to declare that contemporary music is wholly new, rather than attempt to anchor it in the past,” this may have been self-serving, serving the covertly predictive aspect of his theory, but it is not without its sense and value, particularly if one under-
stands “wholly new” as conceptually new. Yet, when we were composing “new music” in the 1950s while studying and teaching the music of the past, with a considerable component of Schenkerian thought in that teaching, we found, just as our thinking in the music of the present affected our thinking about music of the past, so did our obligatory thinking about the music of the past deeply affect our thinking in our music of the present. While construing
the structure of a total, tonal work as the ever-expanding and subsuming manifestation of parallel processes—just a few such processes which had been
adumbrated in the often routine instruction of the eighteenth century—we were aware that such processes had operated only in the pitch dimension. The
serial principle of formation, interpreted as order in time, ultimately suggested not just such intra-dimensional parallelism but inter-dimensional parallelism, with the realization that the temporal domain was (and always had been) susceptible to interval scaling, almost precisely analogous with the pitch domain. There were other, many other, leaps across the systematic boundaries, in the ways of translating means of compounding the retrospective and
the proleptic in the course of a work, reinforcing and reflecting the epistemological condition of acquiring knowledge of a composition as it unfolds in time. Musical structure, necessarily, is in the musical memory of the beholder. The listener for whom the present event erases the memory of the past events creates for himself in a genuinely epistemic, non-journalistic sense, random music, music without inter-event influences. In the 1950s discussions of these matters, these awarenesses, even these urgencies (for composers facing new and puzzling choices) took place privately, within a few classrooms, from a rare podium. There was not a single medium of printed professional communication for composers and theorists. My first article on twelve tone serialism, containing necessarily only brief discussions of such even then familiar, now “old-fashioned” concepts as combinatoriality, derivation, and generalized aggregates which I had developed
during those war years and which had and has shaped my composition since that time, could not be published until 1955, and then only in Britain, footnoted for Britain, in a short-lived British periodical. But in 1957 the Yale Journal of Music Theory was founded, and within a few years was edited by Allen Forte, whose own writings (I note, I hope significantly) were strongly influenced by Schenker’s writings and by Schoenberg’s music. So, by the time only a few years later when Perspectives of New Music began publication, the word-gates were open; articles came out of the closets; responsible, informed thinking and writing about music changed the climate of non-popular musical society. A few years ago I addressed the annual meeting of the Society for Music Theory (now a great, flourishing society) and I thanked the assembled theorists for, among their many substantial accomplishments and therapeutic achievements, having made it possible for me to stop passing as a part-time theorist, and to return to my full-time vocation as a part-time composer. This was a self-protective, as well as grateful gesture, for the profession of theorist, replacing that of those teachers of theory who enforced rules and regulations from self-replicating textbooks, has become not just academically installed but musically influential. We are now, for the first time, in that state familiar to most of the rest of you. Publication of words has so proliferated that we not only cannot read everything that is relevant, but cannot even determine what we most profitably might read, even just as voraciously selfish composers. Writing on music is by no means confined to Schenkerian or serial issues. On the contrary, as one might expect of an essentially new—or young—field, there were successive attempts to seek guidance from other fields. Information theory, structural linguistics, machine intelligence, connectionism, philosophy of science, many of the fast changes of literary criticism, all were tapped for aid. But these attempts, even when stimulating, served primarily to reveal the limitations and even incongruities of such theories and techniques, designed for other functions, in attempting to capture the multidimensional ramifications of musical relations. Although Schenker and Schoenberg were aware of each other’s presence in Vienna, neither appears to have been aware that right around the ring there was the Vienna Circle. Its letter and spirit also were transported here in the
1930s, and formed the third side of our Vienna triangle, not the specific technicalities but the flavor and aim as imparted by the words of Israel Schef-
fler: “to affirm the responsibilities of assertion, no matter what the subject matter, to grant no holidays from such responsibilities to the humanities, etc., etc.” For the first time in music’s history, there is discourse about music that takes few such holidays, and has suffered the consequences.
Those of us who were unworldly enough to be trapped into traffic with unprofessional organs of communication often were badly, even permanently, burned. I was. In the 1950s, while teaching during the summer at Tanglewood, I was asked to give an informal talk on Friday afternoon for those visitors who arrived early in preparation for the heady cultural events presented by the Boston Symphony over the weekend. It was suggested that I speak about the unreal world of the contemporary composer: his milieus, his problems, his modes of support (the major problem), and I did. The talk was overheard by the editor of a magazine impredicatively entitled High Fidelity. He asked me to write it for publication; I resisted, he insisted, I capitulated, coward that I was and still am. My title for the article was “The Composer as
Specialist,” not thereby identifying that role of the composer in which he necessarily revelled, but in which, necessarily, he found himself. The editor, without my knowledge and—therefore—my consent or assent, replaced my title by the more “provocative” one: “Who Cares If You Listen?” a title that reflects little of the letter and nothing of the spirit of the article. For all that the true source of that offensively vulgar title has been revealed many times, in many ways, even—eventually—by the offending journal itself, I still am far more likely to be known as the author of “Who Cares If You Listen?” than as the composer of music to which you may or may not care to listen. And, for all
that the article, after many anthology appearances as “Who Cares If You Listen?” finally has been anthologized in English and German under my title, as recently as last week the attribution to me of “Who Cares If You Listen?” appeared in the nation’s most self-important newspaper. In my life, the learning process was never so demanding and edifying as during my years as the master of my music’s fate, in the Columbia-Princeton Electronic Music Center, and although they did not begin until 1959 when the
Rockefeller Foundation placed its substantive blessing upon us, I had cast longing eyes and ears toward the electronic medium some twenty years earlier, when I attempted to work in the medium of the handwritten soundtrack, which had been developed in the 1920s in Europe—mainly in Germany—as the result of an awareness that originated with recording itself: that, unless you are a firm believer in musical ghosts in the talking machine, whatever was recorded of musical instruments, the voice, or any source of sound could be implanted on the disc, or on film, without such acoustical sources. This was accomplished on film by a mixture of drawing and photography; all that was missing were composers who needed the medium sufficiently to apply them-
selves to mastering a new, refractory instrument. But for most composers it appeared to be only an almost unbelievable possibility, technologically myste-
rious while providing resources that did not yet correspond to needs. So, the technology did not effect a revolution in music; the revolution in musical thought was yet to demand the technological means. My short and not particularly happy life with the handwritten soundtrack ended with World War II. Although that war enforced compositional absti-
nence upon me, I was able to think myself through a new compositional phase, a series of musical Gedanken Experimente centered about the remarkable isomorphism—not just formal but empirical and experimental—between
the temporal and pitch domains. These necessarily carried me beyond the imagined composition to the imagined performance, to—at that time—the impossible performance. For the production of pitch by the performer is a very different act from the production of successive durations, successive temporal intervals. The mental imagery involved in “measuring” a duration has subverted too many performances of rhythmically complex contemporary works, as contrasted with the semi-automatic means of pitch production by
pressing a key, or covering a hole, or depressing a valve. So, when I, as a member of the mathematics department at Princeton during a part of the wartime period, was privy to John von Neumann’s first semi-public thoughts on
the computer world to come, with its emphasis on “intelligence amplifica-
tion,” it was not stretching a point to imagine ahead to a performer of amplified intelligence in the computer, even if it reduced only to mechanical amplification, as the temporal world of the computer already was far ahead of any values one could imagine would be needed or used in music. But immediately after the war, the computer was not yet ready for the task of controlled sound production. What was available was the tape machine. Although this was basically a storage medium closely akin to the handwritten soundtrack, it was much more easily manipulable; sound from electronic and other sources could be stored on the tape which could be spliced into segments, and those segments represented precisely measurable temporal durations. For all that the medium was only too susceptible to trivial tricks with sounds and words, as the early motion picture revelled in automobiles racing backward as fast as forward, divers leaping out of the water onto the diving
board, and on and on; but there were soon works on tape by knowing composers, works that reflected musical needs that could not be satisfied in any other way. One of these needs, I must emphasize, was not the desire merely to produce “new sounds.” However unsatisfactory were and are many aspects of, for example, symphony orchestra performance—above all, those “practicalities” which make it impossible for an important part of the contemporary orchestral repertory to be performed by American orchestras—no
composer was dissatisfied with the sheer sound of the orchestra. Nor did composers turn to those technically demanding new media because they did not know musically what else to do; they knew precisely what they wished to
do and knew that it could be done precisely only by the use of electronic media.
For me, that meant not employing the tape medium, but waiting for an instrument of greater scope, flexibility, and efficiency. I had to wait over a decade, meanwhile composing works for instruments and voices that represented for me my new beginning, and those works from the late 1940s and early 1950s are still virtually the only ones quoted in the textbooks. In the mid-1950s, engineers at the David Sarnoff Laboratories of RCA somehow learned that there were composers who were tediously cutting up tape to create compositions that could not be realized by acoustical instruments and their performers. So, as a birthday gesture to General Sarnoff, they proceeded
to demonstrate what a covey of engineers and some half million dollars in material and labor could produce. It was the Mark I Electronic Sound Synthesizer, with which they created a record of electronic emulations of standard instruments playing mainly substandard music. The understandable reaction of the casual listeners was similar to that of Samuel Johnson’s to the acrobatic
dog. But when someone at RCA discovered that there were composers of whom even they at RCA had heard who could penetrate beyond the engineers’ concoctions to the potentials of such an artifact, RCA quietly constructed a far more elaborate, “universal” machine, the Mark II, and it was this that eventually was installed in the Columbia-Princeton Electronic Music Center in 1959, and which I employed to produce, after four years of research with it and on it, my first electronic work and all my other electronic works.
This enormous machine, which resembled in size and even in outward appearance the largest of the mainframe computers of its time, was nevertheless in no sense a computer; it could not crunch numbers; it had no memory (for which it probably is grateful). It was purely and entirely a sound synthe-
sizer in the most complete possible sense. It was not and could not be employed as a performance instrument; it was a programmable device, whereby every aspect of the musical event (pitch, envelope, spectrum, loudness) and the mode of succession to the next so-specified event were introduced into the machine in binary code by the operator (in my case, by me, the composer) to control the most elaborate of analog cosmos. An event could be specified at any time point, and a succession of events was simply stored on tape, eventually to be combined with any number of other so synthesized successions. The eventual music could be heard only as it issued from speakers. Any specifiable
musical event or complex of events could be made to occur at any designated time. The machine, as the most passive and extensive of media, did no “composing,” not even to the extent that the performer may be said to do so even with the most completely notated of compositions. The machine has no biases with regard to degree of musical complexity, or idiom, or style—whatever those ill-defined terms may purport to suggest. Therefore, to speak of electronic music is to speak only of music produced by electronic means, the most admittant of means, and nothing more, or less. What the synthesizer provided and posed were those vast and mysterious musical resources beyond what
could be produced by conventional instruments and the only human performer. The hand is never faster than the ear, but electronic instruments are capable of speeds, as well as of temporal discriminations, loudness and timbral
differentiations, which can far exceed any listener’s capacities. What the learned composer had to learn, and still is learning as he creates music from sonic and temporal scratch, are the limits of the new musical boundaries, the
intricate abilities of the human auditor with respect to the perception and conceptualization of every musical dimension and their compounds. With the electronic medium, the role of the composer and performer became inextricably fused, and only the loudspeaker intervenes between the human composer and the human auditor, while the composer could experience the particular pleasure of entering the studio with a composition in his head and eventually leaving the studio with a performed work in the tape in his hand. There may have been weeks, months, even years between the entrance and the exit, filled
with trials, errors, and tribulations, but also with singular satisfactions. My friends at Bell Laboratories, who wished to induce me to use the computer for musical production, insisted that I was willing to do battle with the synthesizer only because I possessed the mechanical aptitude of Thomas Edison. I
certainly did not, but I may have had the patience of Job. Logistic and ecological pressures made it necessary for me to abandon my work with the synthesizer too many years ago, and—because I was unwilling to begin again at the beginning with computer sound synthesis—I returned to exclusively non-electronic media. Of course, I had continued to write for conventional instruments and the voice during my electronic career, and I had combined the two media in a half dozen works. As I had learned much about music from my life with the synthesizer, so had I learned and continue to learn from my life with performers, and the sometimes alleged performance difficulty of my music often derives from my wish to transport the flexibilities of the electronic medium to conventional instruments and instrumental ensem-
bles. The obstructing, inhibiting element is our traditional, inappropriate,
clumsy notation which imposes the visual appearance of complexity upon easily apprehended musical phenomena. Therefore, I am multiply grateful to those performers who have overcome this, and many other obstacles to make my music heard.
As revealed thus far, my life in musical learning would appear to have begun significantly with my bright college years, but it began more importantly at the age of five in the public schools of Jackson, Mississippi, where every school day, in every one of the six grades, we received musical instruction, not with stories about Mozart the Wunderkind, or by music poured over us from a phonograph (yes Virginia, there were phonographs), but by music to be read, sung, and played, all to the end of our acquiring, at least, minimal musical literacy. Such forces of formal musical conditioning either have vanished or are being banished rapidly. For instance, I happened to have discov-
ered that whereas in 1974 there were some 2,200 music teachers for the 920,000 public school students of New York City, ten years later there were just 793 such teachers; and I dare not conjecture how even that number has declined in the past seven years, and in how many other cities, towns, and villages. Our young students are left to the merciless informal musical conditioning in which they and we are daily drowned and suffocated at the most critical moments in their musical maturation. And with musical literacy so little rewarded and so lightly regarded, there is little inducement for anyone to ascend from such musical lower depths. When I entered the academic world, it was with the hope that I, like my colleagues in other fields of creative intellection, would be permitted and—even, on occasion—encouraged to pursue the most responsibly advanced, the most informedly problematical professional ventures, and, as a teacher (particularly in a primarily arts and science university) to attempt to train professional listeners rather than amateur critics. But this task has not been reinforced by the example of many of my fellow academics, who scarcely serve as role models for musical modesty. I have documented at other times, in other places, the cavalier presumption with respect to music of a roguish gallery, including a historian of culture, a mighty computermite, a self-declared polymath, a sociologist, a linguist, a barrister, all of whom are regarded as academically respectable in some field. Time does not permit a display of these sadly laughable arrogances, and I only can hope that, did it, you would have laughed. But permit me to offer just one example which, unlike the others, does not affect expertise, only precognition. It is from Sir Ernst Gombrich, who gratuitously, without being asked, asserted, in a volume on the philosophy of Sir Karl Popper, that he was “likely to stay away (from a concert) when
a modern work is announced.” What, indeed, is a modern work in this most pluralistic and fragmented of musical times? Nothing beyond the property of chronology is likely to be shared by any two works written even in the same month, or on the same day, or even by the same composer. Consequently, I am obliged to conclude that Sir Ernst must subscribe to an academic dating service, which provides him with the chronological provenance of every announced work. I once suspected that this wealth and range of presumption was induced
by the admittedly confusing and, perhaps, even confused picture that the world of contemporary music may present to the outsider, particularly the dilettante, but I was mistaken; it appears to be music itself that brings out the worst even in the best-intentioned. A few years ago I was to be on a panel where I was to respond to a paper presented by an aesthetician. I received the paper only a very short time before the event, and found that it dealt exclusively with visual art, with not a word about music. But there were constant references to John Stuart Mill. In desperation, I clutched at that clue,
and was pleased to discover, first, this uplifting statement of intellectual probity by Mill on the occasion of an address at Saint Andrews: “It must be our aim in learning not merely to know the one thing which is to be our principal occupation as well as it can be known, but to do this and also know something of all the great subjects of human interest, taking care to know that something accurately, marking well the dividing line between what we know
accurately and what we do not.” Then, second, in his autobiography, this standard of behavior is applied to music thusly as he instructs us in the fundamentals: “The octave consists only of five tones and two semitones” (a
terminologically amateurish statement of a falsehood) “which can be put together in only a limited number of ways” (computably in error) “of which
but a small proportion are beautiful—most of them, it seems to me, must already have been discovered.” So, by applying some pre-Birkhoffian measure of beauty, Mill—in 1873—provided Gombrich with a scientific basis for ex-
tending the extension of “modern music” back to the middle of Brahms’ creative life.
If we composers required any further evidence of our position in the cultural hierarchy of our time, we would need but consult that professorially peddled “culture list” which purports to compile “the shared knowledge of literate Americans” (I overlook the only slightly concealed circularity). No living composer appears on that list. Nor do such non-living composers as Schoenberg, Webern, or Berg. As for American composers, I merely point out that the list contains Will Rogers, but not Roger Sessions; Hank Aaron,
but not Aaron Copland; Jimmy Carter, but not Elliott Carter; Babbitt (the
title), but not... . The late Paul Fromm, one of the few true musical amateurs and one of the rare private benefactors of contemporary American music, wrote: “I have a profound longing to live in a community where the significance of music is recognized as an integral part of cultural and intellectual life, where the suste-
nance and development of the music of our time is a deeply felt responsibility.” So do I.
1992
D. W. MEINIG
Maxwell Professor of Geography
Syracuse University
Had the idea of such an invitation ever crossed my mind, I would have thought the chances of being asked to give the Haskins Lecture as a good deal less likely than being struck by lightning. I found it a stunning experience, and
I cannot be sure that I have recovered sufficiently to deliver a coherent response. I can only assume that I was selected because I am one of a rare species in the United States—an historical humanistic geographer—and someone must
have suggested it might be of interest to have a look at such a creature, see how he might describe himself and hear how he got into such an obscure profession. Geographers are an endangered species in America, as, alas, attested by their status on this very campus [the University of Chicago], where one of the oldest and greatest graduate departments, founded ninety years ago, has been reduced to some sort of committee, and the few remaining geographers live out their lives without hope of local reproduction. I shall have more to say about this general situation, for while I have never personally felt endangered, no American geographer can work unaware of the losses of positions we suffered over many years and of the latent dangers of sudden raids from preying administrators who see us as awkward and vulnerable misfits who can be culled from the expensive herds of academics they try to manage.
I have always been a geographer, but it took me a while to learn that one could make a living at it. My career began when I first looked out upon a wider world from a farmhouse on a hill overlooking a small town on the eastern edge of Washington State. My arrival on this earth at that particular place was the result of the convergence (this is a geographer’s explanation of
such an event) of two quite common strands of American migration history. My paternal grandparents emigrated from a village in Saxony to Iowa in 1880, following the path of some kin. My grandfather was a cobbler and worked at that a bit, then got a laboring job on a railroad, and before long had purchased a farm. He had three sons (my father being the youngest and the only one born in America) and as they were reaching adulthood he heard that good farmland in Washington State could be had for a third of the price in Iowa, and so in 1903 he moved there and settled his family on a fine 400-acre place. My mother’s parents were born in upstate New York and what is now West Virginia, met in Minnesota, where my mother was born, and about that same year migrated to the same town in eastern Washington, where my grandfather dealt in insurance and real estate. My forebears were not pioneers, but moved to places that were developing with some prosperity a generation or two after initial colonization. That prosperity eluded almost all of them and, in time, most of my aunts and uncles and cousins joined in the next common stage in this national pattern and moved on to Seattle, Tacoma, and western Oregon. The view from that farmstead was one of smooth steeply rolling hills to the south and west and of local buttes and a line of more distant forested mountains in Idaho on the north and east. This was the eastern edge of the Palouse Country, a regionally famous grain-growing area. Physically, it is a unique terrain, famous among geomorphologists for its form and texture, and it can be beautiful in the right season, especially just before harvest. To me it was interesting in all seasons. As far back as I can remember I was fascinated by that panorama. I wanted to know the names of all those features, I wondered what lay beyond, I explored on foot for miles around, climbed all the nearby summits to gain a broader view. Two branch-line railroads ran along the edge of our farm, readily visible from the barnyard. And so I also became fascinated with trains, watching them every day, counting the cars, learning to recognize the different engines, deciphering all those mystical letters, emblems, and names of the railroad companies, poring over timetables and maps obtained from indulgent station agents. From an early age I was collecting road maps as well and avidly reading about places, mostly faraway places. Geography
was, of course, a favorite school subject, but more than that, it fired my imagination. There was something about maps, and names of places, and the way they were arranged in space; about rivers and railroads and highways and
the connections between places that enthralled me—and they still do. Whereas other children might have imaginary playmates and adventures, and write about them, I had imaginary geographies: I made up railroad systems, with names and emblems, engines and schedules, and put them on maps, with
mountains and rivers and ports, and all the places named and their relative population sizes shown by symbols. But what does one do, really, with such interests? I remember announcing at some point in my boyhood enthusiasm that I was going to be a “geographic
statistician.” That came after some hours of poring over my big Rand McNally atlas and memorizing the 1930 census population of every city, town, and hamlet in the state of Nevada. But of course there soon arose in my
own little mind the deflating question of what possible use would such a person be? If anyone wanted to know such information wouldn’t they simply
look it up in an atlas, as I had, instead of hiring me to tell them? So there I was, even at so young an age, a skilled person with a bleak future, a living data
bank no one wanted. However, larger horizons were being created by the hammer of world
events and emblazoned in the headlines of the Spokane newspaper, announced in the clipped tones of H. V. Kaltenborn, featured on the cover of Time magazine: the Italian invasion of Ethiopia; Japanese attacks on China; the bewildering chaos of the Spanish Civil War; and Nazi pressures in the Rhineland, Sudetenland, Danzig—World War II. I tried to follow it all closely in my atlas, and when I graduated from high school a few months after Pearl Harbor, I was ready to go across the mountains to our big university and begin to train for a career in the U.S. Foreign
Service. That seemed a logical combination of geography and history, of places and events, with exciting prospects for actually seeing a lot of the world.
When I look back upon my preparation for this undertaking I am rather appalled at how thin it was in all formal respects. Neither of my parents had more than an eighth-grade education. My father read the Spokane newspaper every day, but I can never remember him reading a book. My mother read a good deal, but other than her Bible, we had almost no real literature in the house. Nevertheless, they assumed that my older sister and I would make our way as far as we might want to go and did everything they knew to encourage
us. Even though I soon could look back and see that the 1930s were very stringent times, I never felt touched by the Great Depression. We might not have had electricity or running water (in that we were somewhat behind the times even locally, chiefly because my father was so fearful of debt), but I always had new books and pencils and tablets and new clothes for school. But
I cannot recall in any detail just what I was being taught, what kind of academic groundwork was being laid, what books I was reading. I remember lots of English drill on grammar but only a few excerpts of great literature. As
for classes in history, the only one that comes clearly to mind was the joke of the curriculum: while I was in high school the state of Washington suddenly decreed that every student must have a course in Washington history and government. We had no textbook, and to be led through the state of Washington’s Constitution by the music teacher was far from inspiring. Such small-town schools were not bad schools. I was given a foundation in
basic subjects, but never pressed very hard to excel. I had conscientious teachers, but no really inspiring ones. The most extraordinary person was a talented young drama teacher fresh from Seattle (hired mainly, of course, to teach typing) who generated such interest and discipline that our little school won the state one-act play contest two years in a row and the town was so
thrilled that they raised enough money (something like $300) to send us across the country by car to the national contest at Indiana University. I had a bit part, and the long journey to the Midwest and back was an important stage in my geographical education. Our teacher characteristically insisted we make
the most of it and not only plotted a route by way of such features as the Mormon Tabernacle, Royal Gorge, Mt. Rushmore, and Yellowstone, she took us to Chicago and arranged for us to stay a night at Hull House. Now, sixteen-year-olds from the country would have much preferred a modern hotel on Lakeshore Drive, and it was hard to grasp just what a “settlement house” was, but a walk through the immigrant ghetto and incredibly congested Maxwell Street market left a powerful new impression of American urban life. The deficiency of my schooling I was first to feel was the lack of foreign languages. These were not required for college entrance and apparently were taught only when there was enough interest or a teacher available. I know that French and German had been offered, but not, as I recall, to my class, and I later regarded these as a burden in my university work. The broader limitations of such a place only became apparent later and have never been a cause of
great regret. Those of us who enjoyed school and all of its activities never thought of ourselves as country bumpkins. We were well aware of a larger world, in part because of our geographical situation. Washington State College and the University of Idaho were less than twenty miles away and those campuses were familiar ground. Although it was common for students to drop out of high school, most graduated, a few each year went on to college, and I never doubted that I would. When I now think about those formative years I conclude that the weakness of my formal training was in some degree offset—especially in view of my
later work—by the experience of how lively small-town life could be. For
hundreds—probably thousands—of towns like Palouse, Washington, one has to go back at least to 1941 to find that vitality, for things changed with the war and changed rapidly—drastically—after the war. And it may seem a contradic-
tion, or at least a paradox, that the 1930s—the Great Depression—was a period of great activity in such places, at least in that part of the country, for crops were good even if the prices were low and there was an influx of people
from drought-ridden Montana and Dakota. If almost no one was making much money, a great many were trying hard to scratch out a living. In that town of 1,100 people there were fifty shops and businesses, several doctors, dentists, and lawyers, a weekly newspaper, half a dozen churches, busy farm suppliers, ten passenger trains a day, a usually packed movie theater, occasional traveling shows, evangelists, and lots of sports. Saturday night in harvest time, when all the stores stayed open, was so packed you had to come early to get a parking place. I am glad to have experienced all that. I think it has given me some real understanding and feel for what a large segment of American life was like in many regions over a considerable span of our history.
I went off to the University of Washington in 1942 because I was just seventeen, but knew that I would soon be in military service. There was, of course, much talk and plotting among all male students as to how we could get into some branch that might be exciting or at least interesting. Unlike many of my friends, I had no interest in going to sea or flying. That left the Army, and the ominous possibility of being arbitrarily assigned to cooks-and-bakers school or something equally awful. Concluding that the only thing I
knew much about was maps, I spied a course in cartography in the winter term offerings and went to the geography department to enroll. It turned out
to be an upper-division course full of naval ROTC students, but after a conference with the chairman, he agreed to let me take cartography and a prerequisite course simultaneously. And it worked. At the end of that term I
enlisted and was assigned to the Corps of Engineers as a topographic draftsman—and as soon as I completed basic training they saw that I could type and I was put in a dull office job and never had a drafting pen in my hand. I’ll not give an account of my illustrious wartime career. I never got out of
the United States. The only pertinent thing is that three years in the Army provided a much-needed maturing and did nothing to dampen my interest in the foreign service. The G.I. Bill opened up heady new prospects, and I remember that with unsullied naiveté I sat in my boring army office and sent off for bulletins from Harvard, Stanford, and Georgetown to decide which might offer the best training. After careful study I chose Georgetown because it had the most specific curriculum and because it was in Washington. I had
glimpsed some of the attractions of Washington, D.C. while in officer’s school at nearby Fort Belvoir. Naive as I was about universities, I have never regretted my choice. The School of Foreign Service was certainly an uneven place, but I had a few first-rate professors and my interest and enthusiasm never flagged. As everyone knows who was a part of it, it was a wonderful time to be at any university. Even though classes were packed, staff was short, and we went day and night, the year around, there was a maturity and seriousness about it that was quite unprecedented. One’s classmates varied in age from twenty to forty, from all walks of life, and with a great diversity of experiences. I never had a small class, but some of the lecture halls crackled with excitement: as with Carroll Quigley on “Development of Civilization” and “Shakespeare” with John Wal-
dron. After my first term I returned West to Colorado to be married, and needing extra income I got a part-time job as assistant to a remarkable academic character, Ernst H. Feilchenfeld, a Jewish refugee, doctorate from Berlin, who had taught at Oxford and Harvard before happily settling in, as he put it, “under the benevolent despotism of Jesuit Georgetown” as professor of international law and organization. He ran the Institute of World Polity, more or less out of a file cabinet, and my job was not only to take care
of his correspondence with a distinguished board of consultants scattered about the world, but to sit and listen to him talk. He was a garrulous and lonely man, and after two years with him, I was tempted to think that about fifty percent of my education at Georgetown was from Feilchenfeld and fifty percent from all the rest. So it was an immensely stimulating time to be at that unusual school in the capital of the new superpower. Many of us participated in small networks of contacts with the lower levels of various government departments and agencies. But there were dark clouds as well, and they rapidly thickened. Senator Joseph McCarthy—and many little McCarthys—were running amok. For-
eign Service officers were being pilloried as traitors, the State Department increasingly demoralized, and the whole prospect of having one’s life work bound to and constrained by such a government created a vocational crisis for me—and for many of my classmates. There were other factors, as well. One of the virtues of the School of Foreign Service was the practical segment in its curriculum: one studied accounting, business law, and consular practice as
well as history, government, and literature. Even a glimpse of the actual chores of consular work, the endless forms and regulations, responding to imploring citizens and would-be citizens, began to tarnish the glamor of my adolescent view of overseas service.
But where to turn? I floundered for a few months. I tried to think about
what I most enjoyed. Railroads? I got an introduction to some railroad officials in Washington but all they could describe for me was to become a salesman and solicit freight. Geography? Read and learn about the world? But how to make a living out of it? I have no explanation for why I was so stupid not to see what was so obvious; it finally did dawn on me that that is what professors do: read and study and talk at great length about that which most interests them—they have a great deal of freedom to do it in their own way, and they have captive audiences forced to listen to them. Once I had that belated breakthrough I had no doubt about what I wanted to be: an historical geographer. I knew of a book or two by that name, but neither I nor anyone else I talked to knew if there really was such a field. But I had spent many hours, usually fascinating hours, in history classes and had read rather widely, and I already knew enough geography that I was always visualizing a map and often thinking how much more effective the teacher or writer might be if the
narrations and explanations had been informed with maps. I had no advice whatever as to where to go to graduate school, but I knew
there was a big geography department in Seattle, where I had taken two courses as a freshman and had actually talked to the chairman; and besides that, I think we were a little homesick for the West. It was not a very good department. Shortly after my time there it was revolutionized under a new chairman and mostly new faculty and became one of the most influential centers of a “new” geography in all the Euro-American world, but it was distinctly mediocre in 1948. Within a short while I realized that I should have gone to Berkeley, but practical reasons impelled me to persevere in Seattle. The not-very-taxing geography courses provided a sound foundation, and I read widely and roamed the campus in search of interesting lectures and courses. Among the most memorable was the packed hall—standing room only—of Giovanni Costigan’s lectures on English history; what I should have sought was a solid seminar in historiography. However, I happily acknowledge my debt to one professor who took a
real interest in me and was helpful then and thereafter. Graham Lawton was an Australian, a Rhodes Scholar who had taught briefly at Berkeley. He
sought me out when he learned that I, having seen an announcement on a bulletin board shortly after my arrival, had applied for a Rhodes Scholarship. He did his best to help shape my rather exotic statement of interests (as I recall, I declared a research focus on Northwest Africa—mainly because I hadn’t found much to read on that corner of the world and was curious about it). As was not uncommon in our region, some bright fellow
from Reed College won the Rhodes, but I had gained a very supportive advisor.
I had arrived from Georgetown with a headful of Quigley and Toynbee and Mackinder and other sweeping world views, and it took a while for my geography mentors to bring me down to earth, to get my feet firmly on the ground, and eventually on my native ground, in the prosaic little Palouse Country. Graham Lawton guided me into British and American historical geography—not a large literature—and I soon tried my hand at it. What started as little more than an exercise, a convenient thesis topic, soon
developed into a much larger and self-conscious work. I wanted to put my home area into history, to see how it fitted in as part of American development. To do that one had to create a rather different version of history, one that was focused on the land and places rather than on politics and persons. I wanted to find out what the early explorers actually said about all the various localities, just where the earliest farmers and townsmen settled, spread into
other districts, and domesticated and developed the whole region with the way of life I had known in boyhood. I avidly reconnoitered the countryside, visited every locality, studied old maps and documents, read hundreds of country newspapers, plotted data from public and private records. I had a lot to learn about my native ground, but I already knew about some important matters. I knew a lot about farming and livestock raising because I had done them. Our farm was small by Palouse standards but nonetheless real—indeed, more real for my purposes than others, for my father was the last farmer in that area to use horses rather than tractors. He loved those big workhorses as much as he hated all the high-powered machinery that was already essential to successful farming. And so I grew up with them, learning at an early age how
to take care of them, harness them, and work in the field with them—and thereby I was in contact with an older—indeed, ancient—world of farming.
I found great satisfaction in that research and I wanted to share it with others. I wanted to write a book that could be read with pleasure and enlightenment by local residents who had some serious interest in their homeland. I overestimated that potential, but a sprinkling of letters over the years assures me that The Great Columbia Plain has helped a few. At the same time, I wanted to write a book that would command attention in professional circles. I wanted to help create a literature that would at once exemplify something of the character and value of the geographical approach to history and the historical approach to regional study. I was convinced that professional geography in America badly needed that kind of literature. Human geography and regional geography were too largely textbook in form,
stereotyped descriptions of a set of standard topics with rarely any historical or interpretive dimension at all. Certainly no geography book told me what I
most wanted to know about my country. I thought my approach was a valuable way of looking at a region. It answered most of the questions I had at the time, and I hoped it might encourage others to do something similar on other regions—though, in this too, I seem to have overestimated that prospect. For a while I had in mind more such studies myself, and I did, in fact, write
another book (before I completed this first attempt) from an opportunity provided by a Fulbright to Adelaide—where Graham Lawton was now head of the geography department. A surge of settlers into the dry country north of Adelaide had created Australia’s premier wheat region. Emerging at the same time, working with the same general technology, and competing for the same Liverpool market, this South Australian episode offered illuminating
comparisons with the Pacific Northwest. Regional geographers are often accused of being too focused on particularities and diversities, but any geographer’s global training should provide analogues and generalizations as well.
But I did not proceed with more historical studies of agricultural regions.
Two experiences of residence in “foreign lands” brought about a shift of focus, a change in emphasis. One of these was that year in Australia, where another branch of English-speaking pioneers had created a nation on a continental scale. “The most American” of lands beyond our shores was a likeness many Australians were ready to assert and most Americans seemed happy to accept. There were, of course, grounds for such a characterization, but I was struck more by the differences, and they helped me to see my own country in a clearer light. The thing that most impressed me from my reading, research, field studies, and general observation was the difference in the general compo-
sition of the population: the homogeneity of the Australians as compared with the kaleidoscopic diversity of the Americans. And one was more alert to
the comparison because the Australian population was just beginning to change toward the American type by the unprecedented post-war influx of emigrants from Continental Europe: Germans, Dutch, Poles, Italians, Greeks, Maltese. Their number was not really large but they were clearly injecting a new variety and vitality into Australian life. Australian commentators, novelists, and dramatists were giving attention to the many individual, familial, and social challenges of immigration, acculturation, assimilation— themes that were century-old clichés in America, and I returned with a heightened appreciation of the stimulus, the energy, the creativity, and the special
problems generated by the marvelous ethnic and religious complexities of American society.
The other so-called foreign experience was congruent with that. I began my professorial career at the University of Utah. I knew, of course, that Salt Lake City was the seat and symbol of the Mormons. We all knew of the Tabernacle Choir, and something vaguely about their peculiar history—polygamy, Brigham Young, and the Great Trek to the desert West. But I didn’t realize just what we were moving into when my wife and infant daughter and I settled into the Salt Lake Valley. We found ourselves classified in a
way we had never thought of: we were “Gentiles.” We had unwittingly moved into a dual society wherein everyone was either a Mormon or a Gentile (giving rise, of course, to the local cliché that “Utah is the only place where a Jew is a Gentile”). This binary character was a subtle but pervasive reality: two
peoples, interlocked in much of daily life, not at all visibly distinct to the casual observer, without any overt antagonism between them, each subdivided into complex varieties within—yet ever conscious of being two distinct peoples. That Mormon—Gentile dichotomy seemed to permeate everything and it gave a special interest, flavor, and edge to life in Utah. One also came to see that the local landscape, rural and urban, was different from adjacent areas.
The farm villages, the ward chapels, tabernacles, and temples, the rigid squares and the scale of those big city blocks, stamped a visible Mormon imprint on the area. And so one came to realize that the Mormon Church was not just another of the many denominations in the remarkable diversity of American religion,
but was the creator and vehicle of a distinctive people, of a highly self-conscious, coherent society that had set out to create a large region for itself in the desert West and had essentially done so, for Gentiles were a minority and
generally regarded as “others,” “outsiders,” even at times “intruders.” Nine
years in Utah taught me something new about America, heightened my consciousness of such social groups, made me feel that the historical geographer would do well to focus on the kinds of communities that were characteristic of various regions. Despite powerful pressures toward standardization and conformity, the American West was far from even an incipient uniform or united area. And so with a heightened sense of life and locality I began to examine the West as a set of social regions. I wrote an extensive essay on the creation and dynamic character of the Mormon culture region, then a small book on Texas, followed by another on New Mexico and Arizona. Each of these gave considerable attention to ecology and spatial strategies, as in my earlier books, but
the main focus was on the various peoples shaping discrete regional societies.
In this kind of human geography one was not describing simple regional patterns, fixed in form and place, but continuous geographical change. That is, changes in limits and relationships, in internal character as a result of migrations, diffusions, demographies, in economies, transportation, and other technologies, in regional attitudes and perceptions. These more interpretive writings were well-received outside geography. A number of historians seemed to find in them a fresh perspective on a general topic still dominated (twenty-five years ago) by the Turner frontier thesis.
And I must also tell you that they were the means of snaring my favorite student. He is a fictional character in James Michener’s vast volume on Texas.
I got to him in the middle of it, on page 504, and changed his life. He was already a football hero at the university, but, in Michener’s words: . . . he read a book that was so strikingly different from anything he had ever
read before that it expanded his horizons. Imperial Texas... by D. W. Meinig, a cultural geographer from Syracuse University, . . . was so ingenious in its observations and provocative in its generalizations that from the moment
Jim put it down, he knew he wanted to be such a geographer . . .
Michener sends him off to Clark University instead of to me, and I lost track of him in the further depths of that book. I’ve never heard from him, but
I take satisfaction in the fact that whatever one may think of Michener’s fiction, it is generally agreed that he gets his facts right. I had in mind to do a large book on these American Wests—I had done considerable work on California and Colorado as well—but then came another sojourn overseas and from it another shift in scale, if not in perspective. In the fall of 1973, I had a very pleasant visiting position at the University of St. Andrews in Scotland. I was expected to give one lecture a week, ten in all, on the United States; the rest of the time I could do as I pleased. In the winter we
shifted to Israel, where I repeated that course on America at The Hebrew University of Jerusalem. How to treat the United States in ten lectures made one search for a few major themes and to generalize at a broad scale in time and space. And thinking about such matters in those places, and later on as we settled into a small village in Gloucestershire, forced one to consider things from the beginning: how did Europeans reach out, make connection with, and get all those colonies started in America? Once one began to think seriously in terms of oceanic, intercontinental connections, one was caught up into a vast field of action, and inevitably American Wests became but small
pieces within a large system. One had always known that, of course, and it didn’t make the West of any less intrinsic importance than before, but it altered the balance and made seeing the West in the fuller context of nation, North America, and, indeed, an Atlantic system, the principal goal. And so I slowly got underway with a rather audacious task of writing “a geographical perspective on 500 years of American history.” I suppose my whole writing career could be seen as a geographer’s version of the search for the self—of who one is, and how that came to be, and what is the meaning of it all. For the geographer that means close attention to where one is, what that place is like, and what the summation of the localities of life
might reveal. Thus, this geographer began his search on his native ground, expanded into the next larger encompassing region, and on and on through successively larger contexts in a search for an understanding of his whole country, of what the United States of America is like and how it got to be that way.
The Haskins Lecturer is asked to reflect upon “the chance determinations” of a life of learning. I have suggested some, but two others come prominently to mind: going to Salt Lake City rather than to London in 1950; and going to Syracuse rather than to Berkeley in 1959. When I was finishing graduate course work, I needed a job; I had a family to support. I had applied for a Fulbright
to London many months earlier, but the process in those early days of the program seemed interminable. I was unable to find out anything about my status, and so in early June I accepted a position at the University of Utah and
felt I could not ethically back out when the award came through later that summer. After all my talk about foreign service it was a painful choice, and I have occasionally wondered what might have happened had I gone to Britain on the threshold of my career. The University of Utah proved to be a lively place for a beginner, starved for funds by a niggardly legislature, but home to some excellent faculty, engaged in considerable experimentation under a new dean fresh from the University of Chicago. The teaching load now seems like a killer, but I was young and energetic, involved in many things, including a TV lecture series in 1953.
In 1956 I taught a summer session at Berkeley and the next year Carl Sauer invited me to join his staff. At the time that was generally considered the best possible thing that could happen to a young historical geographer. But there were complications, at his end and mine. It turned out that the position was
not as yet firmly authorized as permanent, and by that time I was already committed to go to Australia for a year. Mr. Sauer agreed that I must go there and he would see what could be worked out for the year following. On our
return voyage from Australia, a letter awaited me in London from a new chairman, explaining that Sauer had retired and regretting that no position was available. I have always assumed that I was not the new chairman’s choice
and, of course, I was greatly disappointed at the time. But we had barely settled back in at our mountainside home when the chairman at Syracuse telephoned and invited me to come for an interview. Looking out my window at the sunshine on the snowcapped mountains looming about my backyard I very nearly said “no thanks.” I had never thought of going to Syracuse or that part of the country and had no real interest in doing so. But I did have sense enough to realize that it would cost nothing to go and have a look. In fact, it cost a good deal, for I returned in a serious quandary. I didn’t really want to
leave the West, for a variety of reasons; I had assumed I would spend my career somewhere in the Mountain West or Pacific Coast, but the prospects at
Syracuse were so much better professionally and the region so much more attractive than I had realized that, after much agonizing—and a strong nudge from my always-more-sensible wife—we did decide to go. It was a chance determination of major consequence for us. Syracuse University provided a far better working environment, the geography department was very good and kept getting better, the university was never rich in funds but it had some riches in talent, and for thirty years its leaders at every level from department chairman to chancellor have given me much help to do whatever I most wanted to do. Equally important, upstate New York was a beautiful region and an excellent location and we quickly settled in contentedly. Our relatives, all Westerners, regard us as living in exile, but those who have visited have had to acknowledge the attractions. To conclude on “chance determinations,” I would add that I was fortunate to meet at the outset of my career (in one case quite by accident) two of the
foremost scholars and teachers of historical geography, Clifford Darby of London and Andrew Clark of Wisconsin, and to receive their cordial welcome and respect as if I were already a worthy member of our small guild. That meant a lot to a beginner. Geographers work at various scales; it is expected that we can move easily and skillfully up and down the general hierarchy. My own published work has
been mainly at some sort of regional scale and my current project retains something of that emphasis, for a central purpose is to assess the United States as, simultaneously, an empire, a nation, a federation, and a varying set of regions. But my life of learning has been strongly influenced by both larger and smaller views of the world. Geography, like history, provides a strategy for thinking about large and
complex topics. Stephen Jones’s observation that “the global view is the geographer’s intellectual adventure” has always had a ring of truth to me. I began adventuring at that scale through boyhood fascination with a big atlas, learning locations, shapes, and names, and added substance to that framework through reading at progressing levels about places and peoples. It was always a minor thrill to discover some thick book on an area one knew little about—McGovern’s History of Central Asia comes to mind—and a challenge to try to make historical sense out of some complicated geographic pattern, such as the world map of languages. One was not simply accumulating facts packaged in convenient areal compartments; one was seeking concepts that helped one to make ever greater sense of the complicated natural and cultural patterns of the
world. That sort of study has a very respectable lineage, dating from the multi-volume works of Humboldt, Ritter, and Reclus, but never really got a firm hold in America. Modern single-volume versions only belatedly appeared
from the writings of the Berkeleyite geographers, Rostlund, Kniffen and Russell, Spencer and Thomas, but these remained marginal, and increasingly antithetical, to the main stream of American geography. Similar comprehensive works in anthropology, such as Linton’s The Tree of Culture, and in history, such as Ralph Turner’s two volumes on The Great Cultural Traditions, and the polemical interpretations of Lewis Mumford (especially Technics and Civilization) also nourished my appetite during my early growth. In time
I would work out my own ways of presenting the historical geography of the great world cultures to undergraduates. Helping students to make sense out of their world in such a manner has been a very satisfying experience, and I have never understood why such knowledge has been so persistently undervalued in American universities. Much the most challenging intellectual adventuring was to be found in those heavy ambitious works that asserted deeper meanings, especially
Spengler, Toynbee, and F.S.C. Northrop. One didn’t swallow them whole, for reading critiques and alternatives was part of the fare. For example, at the same time as I was devouring some of these works I was being led methodically through the dissection and analysis of “culture” and “cultures” in Kroeber’s Anthropology by the formidable Erna Gunther, a student of Boas. It was not the audacious claims and portentous conclusions of these metahistorical works that were so fascinating, it was their sweeping perspectives and
attempts to integrate an immense range of knowledge in order to grasp the wholeness and the vital springs of the great cultures and civilizations. A few months ago I mentioned to a musicologist friend of mine a book that I had read about but not yet seen. He said, “Well, the author tried to
synthesize a whole society by looking at its art, but,” my friend said, “it didn’t work, it can’t work; it was grand, but it was a failure.” (You may infer that we were talking about Schama’s The Embarrassment of Riches.) I said that I was especially interested in grand failures. I was, in fact, trying to break into the business. I was confident I could be a failure; I dreamed of being a really grand failure. The ACLS generously refers to the Haskins Lecturer as “an eminent
humanist.” You would do well to regard me as marginal on both counts. Although my kind of geography belongs in the humanities, for much of it seeks to be a form of portraiture, a depiction and interpretation honed into literature, my understanding of humans is not exactly “humanistic” in the most common modern uses of that term. Rather, it is grounded upon the old, rich, and rather severe view of man and all his works as expressed in The Book of Common Prayer. That book has been a routine part of my life for forty years.
It provides a larger scheme of things, however mysterious, that helps put one’s own work in perspective. I find nothing therein to keep me from accepting whatever real truth science may offer and a lot therein to help me keep a certain detachment from whatever the latest popular “-isms” of the academy may be. More specifically, in relation to my own specialization, it provides a
quiet but insistent warning about some of the characteristic tendencies of American society and culture, as expressed in its exaggerated emphasis upon freedom, individualism, democracy, materialism, science, and progress. By
providing wisdom and hope rather than cynicism and despair, it helps to mitigate the anger and alarm one often feels about the drift and disorder of one’s own country. Furthermore, that book and its associated rituals offer a code of conduct and a rehearsal of the follies and perversities of humankind that can have a salutary bearing upon daily life. To be reminded year after year that “thou art dust and unto dust thou shalt return” is a specific against the vanities and posturings so endemic in professional circles—and it comes with the insistent warning that none of us is immune from such temptations. Geography has sometimes been represented as a kind of moral philosophy, primarily in the sense that those who have a deep fascination for the earth needs must have a special concern for the care of the earth. An old definition of geography has been coming back into favor: the study of the Earth as the home of man—or, as we now say, of humankind. We have recently become aware that the Earth as home is in alarming condition, and geographers, like
many others, are eager to tackle urgent problems of home repair and of remodeling the way we live. I have no practical skills to put to use on such projects. I can only add my small voice to the few urging the need, as well, for
a much longer perspective on such matters, a far better understanding of how we got to where we are. And that sort of historical investigation must surely lead to a sobering meditation on the human situation on this earth. There are
mysteries there to haunt the mind. In such matters I can be no more than a faint echo of the wisdom of Carl Sauer, the only really philosophical geographer I have known, who, while working quietly over a long lifetime mostly in
remote corners of time and space, spoke and wrote eloquently about these grand themes, calling for geographers to “admit the whole span of man’s existence” to our study and to press for “an ethic and aesthetic under which
man... may indeed pass on to posterity a good earth.” For me, meditations on deeper meanings are more likely to be prompted by a walk in the country than by trying to contemplate the globe. It is this other end of the scale, that of landscape and locality, that most enlivens my sense of ethics and aesthetics. “Landscape” has always been an important—and troublesome—word in geography, referring to something more than a view, setting, or scenery. What lies before our eyes must be interpreted by what lies within our heads, and the endless complexities of that have stimulated important work. I have paid particular attention to symbolic landscapes as representations of American values and generally tried to use the landscape as a kind of archive full of clues about cultural character and historical change that one can learn to read with ever greater understanding. At the same time
landscape is always more than a set of data; it is itself an integration, a composition, and one tries to develop an ever keener appreciation of that. It is here that geography makes its most obvious connection with aesthetics, with writers and poets and painters and all those who try to capture in some way
the personality of a place, or the mystery of place in human feelings. If geography’s old claim to be an art as well as a science is as yet backed by relatively little substance, the logic and the potential are there. I was rather slow to appreciate these truths, in part at least, because I was never trained to see them and there was then little American literature on the subject. I did have the good fortune to happen upon an obscure new magazine in the 1950s called Landscape, published and edited by a J. B. Jackson, from a post office box address in Santa Fe. A few years later I arranged to meet
this modest, refreshingly unacademic man who would eventually be regarded—even revered—as the principal founder and inspiration of cultural landscape studies in America. By happy coincidence I also met Peirce Lewis of Penn State on that very same day and he has served as my principal academic
mentor in learning to read the landscape. This dimension of my life was steadily enhanced by personally exploring British landscapes with increasing
regularity. I found there a wonderfully rich literature, by scholars and specialists of many kinds and by those splendid English creatures, the devoted, gifted amateur. I got acquainted with William G. Hoskins, the foremost historian of English localities, who by talent, perseverance, and personality reached out with several sets of books and a splendid BBC television series to bring this
kind of historical geographic appreciation to a broad public. In the 1970s I devoted much of my time to landscape studies, to a lecture series, seminars, and field trips. I tried to bring together the best of what I had found in Britain
and America with the hope of stimulating some fresh work. A few of my students responded quite creatively, but although I itched to do so, I never produced a substantive study myself. The actuarial tables warned that I dare not delay my larger project, and it is one of my few regrets that I have had to give up doing something on Syracuse and Central New York. Quite by chance I was able to participate in a really vast outreach to the reading public. In the 1980s a former student of mine, John B. Garver, Jr.,
served as Chief Cartographer at the National Geographic Society, and he invited me to guide the preparation of a set of maps depicting the historical regional development of the United States. Seventeen large sheets, each containing a set of maps, were issued with the magazine over a span of five years. Each distributed to 10,600,000 subscribers around the world, it must have been my most effective teaching even if only a very small percent were
ever studied carefully. (When I see these maps in bins at used bookshops for fifty cents apiece I’m always tempted to buy them—they are such bargains.)
I am a peculiar geographer in that I almost never travel with a camera. That is surely a limitation, even a flaw, but I have tried to compensate. I carry the images of thousands of places in my head, all partial and impressionistic, of course, but obtained with a cultivated “eye for country,” to use an old saying. Perhaps I got both the eye and the preference from my father, and my resistance to technology makes me as archaic and crippled in my time as he was in his. My colleagues aptly sum me up as the man with the quill pen in an age of word processors. Travel is, of course, an important part of a geographer’s learning. Though I have traveled fairly extensively, I have not deliberately set out to see as much of the world as possible, as some geographers do, but I find it uncomfortable
to write about areas I have not seen and over the years I have used every opportunity—meetings, guest lectures, vacations—to obtain at least a passing acquaintance with every part of the United States and adjacent Canada. What
few research grants I have sought have been used in some degree for such
reconnaissance, thereby continuing in modest personal form the famous role of the geographer as explorer. I mention Canada deliberately because it has come to have an important place in my life of learning. As a geographer, I must regard Canada as an essential part of the context of the United States. And I refer not just to its physical presence on our northern border and the many practical interactions
between the two countries, but, as well, to the presence of a companion empire, federation, nation, and set of regions that can provide invaluable comparisons with our own. I regard the common indifference and ignorance of Canada by Americans as arrogant and stupid. To learn and ponder the fact
that the basic foundation of Canadian nationalism is the desire not to be American ought to be an instructive experience for all thoughtful Americans. This is not the place to expand upon this topic; I only wish to declare that I feel much the richer for having gotten acquainted with a good deal of Canadian territory and literature. I have been especially interested in writings on nationalism and regionalism, technology and social philosophy, and I have found the ideas of George Grant and W. L. Morton particularly instructive and congenial. Although I have pursued my own interests with relatively little attention to what was exciting many of my colleagues, I nevertheless claim to speak for geography in a quite literal sense: for “ge-o-graphy,” “earth writing,” “earth drawing,” the task of depicting the actual character and qualities of the whole surface of the globe—at various scales, and at various levels of abstraction. Such a field does not fit comfortably into modern academic structures, and has suffered for it. To the not uncommon question “is geography a physical or a social science?” almost all geographers would answer “both.” That in itself can
become an annoyance to tidy administrators (as at the University of Utah, where in my day geography was in the College of Mineral Industries as one of the “earth sciences”). My answer to such a question has always been “both, and more.” That is to say, while much of our work is a form of physical or social science, the larger purpose is of a quite different character. I accept the old Kantian concept that geography, like history and unlike the sciences, is not the study of any particular kind of thing, but a particular way of studying almost anything. Geography is a point of view, a way of looking at things. If one focuses on how all kinds of things exist together spatially, in areas, with a special emphasis on context and coherence, one is working as a geographer. The ultimate purpose is more synthetic than analytic. Of course, no one can master all that exists together in any area. Every geographer must be selective,
and we follow the usual divisions and identify ourselves as social geographers,
economic geographers, biogeographers, or whatever. The great temptation for administrators is to dissolve geography departments and allocate their residual members to these various disciplines. Such taxonomic logic is not only arbitrary and intellectually suspect, it is deeply destructive. It denies the legitimacy of a venerable field and the coherence vital to its nurture. It implicitly declares to the student that there is nothing there worth devoting one’s life to. If my remarks have taken on a polemical tone it is because such matters have been an ever present part of my life and because my life of learning has always extended far beyond my formal life in the university. I hope I have conveyed to you that geography has been more to me than a professional field. We are odd creatures. Geography is my vocation, in an older, deeper sense of that word: vocation as an inner calling—not what I do for a living, but what I do with my life. The born-geographer lives geography every day. It is the way one makes sense out of one’s world, near and far, and it is the means
of appreciating the immediate world—of whatever lies before one’s eyes. Every scene, every place—one’s daily walk to work as well as one’s traverse of unfamiliar ground—can be an inexhaustible source of interest and pleasure—
and pain, for there is plenty to deplore in what people have done to their surroundings. It is difficult to convey the intensity and fullness of such a thing. To such a person geography is not simply a profession, it is a never-ending, life-enriching experience. I have no idea how widespread this aptitude and hunger for geography are. There are relatively few geographers in total, and a considerable number who call themselves such are of a narrower technical kind who would not really understand what I am talking about—indeed, will be embarrassed by what I
have had to say. I have no doubt that there are others who never think of themselves as geographers who are also responding to the vitalizing attractions of such interests. There are some encouraging signs that the crisis of social science and new confrontations with a complex world may cause the value of professional geography to become more recognized in America. One
hopes that thereby not only will the number of persons with the requisite skills for productive work be enlarged, but that the prospect of becoming a geographer will become much more widely apparent so that the young natural-born geographers among us can be nurtured to the full wherever and whenever they may appear. I was one of the lucky ones. Like most American geographers of my time, I
only belatedly discovered that there was such a profession, but I did so just in
time to make the most of it. It has been such a richly satisfying thing that when I reflect upon my life, in the way that your kind invitation has encouraged me to do, it seems as if from the moment I first looked out in wonder across the hills of Palouse I have lived happily ever after.
1993 ANNEMARIE SCHIMMEL Professor Emerita of Indo-Muslim Culture
Harvard University; and Honorary Professor of Arabic and Islamic Studies, University of Bonn
Once upon a time there lived a little girl in Erfurt, a beautiful town in central
Germany—a town that boasted a number of Gothic cathedrals and was a center of horticulture. The great medieval mystic Meister Eckhart had preached there; Luther had taken his vow to become a monk there and spent years in the Augustine monastery in its walls; and Goethe had met Napoleon in Erfurt, for the town’s distance from the centers of classical German literature, Weimar and Jena, was only a few hours by horseback or coach. The little girl loved reading and drawing but hated outdoor activities. As she was the only child, born rather late in her parents’ lives, they surrounded her with measureless love and care. Her father, hailing from central Germany,
not far from the Erzgebirge, was an employee in the post and telegraph service; her mother, however, had grown up in the north not far from the Dutch border, daughter of a family with a centuries-long tradition of seafaring. The father was mild and gentle, and his love of mystical literature from all religions complemented the religious bent of the mother, grown up in the
rigid tradition of northern German protestantism, but also endowed with strong psychic faculties as is not rare in people living close to the unpredictable ocean. To spend the summer vacations in grandmother’s village was wonderful: the stories of relatives who had performed dangerous voyages around Cape Horn or to India, of grandfather losing his frail clipper near Rio
Grande del Sul after more than a hundred days of sailing with precious goods—all these stories were in the air.
Mother’s younger sister was later to weave them into a novel and to capture the life in the coastal area in numerous radio plays. Both parents loved poetry, and the father used to read aloud German and, later, French classical literature to us on Sunday afternoons. The little girl owned a book of fairy tales, printed in 1827, and at age seven she enjoyed correcting what appeared to her as spelling mistakes, that is, the old-fashioned orthography before the language reform of 1900, thus preparing herself as it were for the innumerable page proofs she would have to read later in life. In the book there was one story which she almost knew by heart—a story not found in any book she would read in her entire life. It was called “Padmanaba and Hasan” and told of the visit of an Indian sage to Damascus, where he introduces an Arab boy into the mysteries of spiritual life and guides him to the subterranean hall where the mightiest king’s catafalque is exposed amidst incredible jewelry. Over it was written: “People are asleep, and when
they die they awake.” Ten years later, when the little girl was eighteen, she realized that this was a hadith, a word ascribed to the Prophet Muhammad and dearly loved by the mystics and poets in the Islamic world. She enjoyed school, especially languages such as French and Latin, and shocked her teacher by writing her first essay in high school, entitled “A Letter to My Doll” about the Boxer rebellion in China. She tried to copy little texts in foreign characters from a small publication of the British Bible Society, entitled God's Word in Many Languages, and loved poetry. One of her favorite poets was Friedrich Rückert (1788-1866), the ingenious Orientalist-poet, whose versions of Persian and Arabic literature impressed her deeply. Her most ardent wish was to learn more about Oriental culture, and when she was fifteen she found a teacher of Arabic. After a week she was absolutely infatuated with her studies, for her teacher not only introduced her to Arabic grammar, but also to Islamic history and culture. Weeks were counted only from Thursday to Thursday (that was the day of the Arabic class), although she had to keep this to herself. For who among her classmates would have understood, who of her relatives and acquaintances would have appreciated a girl’s learning a Semitic language at a time when nationalism and political fanaticism filled the air? Somewhat later the girl skipped two levels to finish high school at sixteen. Alas, she had to attain seven years of English in six months so that the grade in English was the lowest in her otherwise brilliant gradesheet—that is probably
the reason why the Good Lord found it necessary later to send her to Harvard to improve her skills a bit. Before joining the university we had to undergo the trial of Arbeitsdienst, a
forced labor service during which we lived in the countryside to serve as unpaid maids and agricultural help in poor areas, and I learned such useful things as cleaning pigsties and harvesting beets—and desperately tried to keep up my Arabic. This stubborn clinging to my ideals resulted in the fact that I
was probably the only girl in my age group who was not automatically transferred into the Nazi party as was customary when one reached the age of eighteen.
It was in the camp that we heard the news of the Second World War breaking out, and our leader proudly told us that we could now stay much longer than the usual six months to serve our herrlicher Führer. My nonexistent love for the Führer certainly did not increase at hearing this news. My father had been transferred to Berlin on the very first day of the war. Soon my resourceful mother found out that I could be released from the Arbeitsdienst provided I studied natural sciences. Why not? After all, I loved physics and immediately imagined that I would work later in the history of Islamic science, especially mineralogy. After reaching Berlin and registering in the Faculty of Arts and Sciences, I also continued Arabic and took courses in
Islamic Art, and by the end of the first trimester (the semesters had been shortened), at Christmas 1939, Professor Kühnel, the doyen of historians of Islamic art, smilingly encouraged me to forget science and to concentrate upon Islamic studies by promising that I would become his assistant after completing my doctorate. This, however, remained a dream. After I had finished my Ph.D., in November 1941, I joined the Foreign Service as a translator, for from the museum, which was not important for war activities, I would
have been drafted into the army. But forty years later my initial dream was fulfilled when I was invited to join the Metropolitan Museum on a part-time basis to do what Kühnel had hoped—that is, to work on Islamic calligraphy, a
field which I also taught during my Harvard years. To study in wartime Berlin was—at least for me—like living far away from
the stark realities of political life. My professors were the most outstanding representatives of their respective fields. Importantly for me, we had a woman professor, Annemarie von Gabain (d. 1993), to whose introduction into Turcology I owe much and whom I considered my “elder sister,” my apa. And while Richard Hartmann taught us the patient historico-critical approach to classical Arabic and Ottoman Turkish, Hans Heinrich Schaeder, a true
genius, carried us to the farthest shores of history, nay of culture in general. Discovering my interest in Maulana Rumi (kindled by Rückert’s free translations of his poems), he suggested that I read R. A. Nicholson’s Selected Poems from the Divan-i Shams-i Tabriz (which I copied by hand) as well as Louis
Massignon’s studies on the martyr-mystic Hallaj (executed in 922 in Baghdad)—and three months later, Christmas 1940, I surprised him with a set
of German verse translations from Rumi and Hallaj, which I feel are still valuable. After the war it was Schaeder who introduced me to the work of T. S. Eliot, and instead of spending a brief visit in Göttingen with discussions about Persian poetry, we read the Four Quartets, just arrived on his desk. As a
corollary he suggested that I should read John Donne, whose poetry fascinated me so much that twenty years later I published a collection of my German verse translations of his work because his style seemed so close to that of my beloved Persian poets. Both Schaeder and Kiihnel were married to academic women who generously encouraged me in my work. This certainly contributed to the fact that I never felt I was a stranger in the academic world and took it for granted that
women had the same role to play in the academic community as did men. Six terms of study were, however, by no means a quiet time of learning: during every vacation we had to work in a factory, ten hours a day, and I would return home, often with my hands bleeding, to write my dissertation on Mamluk history. I learned much about the women’s hard life in the factory
and was grateful for the understanding they showed to the stranger whose work was meant to guarantee some days of paid leave for a few of them. After finishing my studies, I worked not only in the Foreign Office, but also prepared the great index for a sixteenth-century Arabic chronicle of some 1,500
pages, which appeared—still during the war—in Istanbul. The dark clouds of war became more and more terrifying; the bombing stronger—I remember walking for four hours through burning streets in search of a lost colleague, of giving shelter to friends who had lost everything, and reading about the worsening political situation in the telegrams we had to
decipher in our office. Yet, I remained, in my spare time, faithful to my Mamluk officials about whom I was writing my Habilitationsschrift. I submitted it on April 1, 1945, the day when our office was sent to central Germany for security reasons. In a small Saxonian village we were captured by the Americans, spent a week in a subterranean prison, and were transported to Marburg
on the day of the armistice to remain interned during the summer in a student’s house. It was the best thing that could happen to us: at least we had a roof and regular, though, of course, minute rations of food, and we soon
arranged something like a camp university, teaching and learning to adapt to life in a strange little community. One day an important visitor came to look after us. It was Friedrich Heiler, the famed historian of religions, then Dean of the Faculty of Arts in the yet to be re-opened old University of Marburg. He spoke about Nathan Söderblom, the leader of the Ecumenical Movement, Archbishop of Sweden, and
historian of religion (d. 1931). Although I had the impression that the learned speaker barely noticed me during the discussion, two months later, when the internment was drawing to a close, he called me at home. Would I like to stay in Marburg? They were in need of a professor of Arabic and Islamic studies as the former chairholder was such a terrible Nazi. I was barely prepared, but as I
had—along with some Persian and Arabic text—a copy of my Habilitationsschrift in the one suitcase I could bring with me, I agreed. After three months with my aunts in northern Germany I delivered my inaugural address on January 12, 1946, not even twenty-four years old. It was quite an event in the conservative little town of Marburg, and the only woman on the Faculty, Luise Berthold, specialist in medieval German, congratulated me with the words: “My dear child, remember one thing—men are our enemies!” Despite her warning, I enjoyed teaching immensely. No one can imagine how happy both teachers and students were during those years—no more war; freedom to speak, to read books of which we had not known anything; listening to inspiring lectures of returning emigrants; and although we had barely anything to eat, we ate and drank knowledge. Every class—be it Arabic, Persian, or Turkish, or first ventures into the history of Islamic literature and art—was an adventure, especially since quite a few of my students, returning from the war, were senior to me. Besides, I became closely attached to Heiler and worked with him on history of religions, supplementing his classes with Islamic materials and learning much about the phenomenological approach to religion, and about Church history and its intricacies. In addition, I enjoyed the German Mass which Heiler used to celebrate on Sundays in the small chapel in his house. However, it was also the time of learning of the atrocities that had been taking place during our childhood and youth, atrocities which seemed too shocking to be true—and of which most of us had been unaware.
In my discovery of new areas of knowledge, I was supported by my mother, who had joined me in May 1946 after my father had been killed in the battle of Berlin. He was one of the numerous elderly men who, without even knowing how to handle a shotgun (and there were six guns for twenty-five people!) were sent against the Russians as “the main defense line.”
One interesting aspect of my life in Marburg was that Friedrich Heiler was one of the first to realize the importance of women’s contribution to religion
and scholarship. His seminars and his book, Die Frau in den Religionen, tackled the problem long before it became an issue in the clerical and academic world. We jokingly called him “the patron saint of women professors.” In this capacity he warmly advocated the role of women as ministers of the church and a Swedish champion of the cause, Märta Tamm-Götlind, visited him in 1948. She invited me to come to Sweden in 1949, and, after many “external”
difficulties, I went to spend two weeks with her on a small island on Sweden’s West Coast to polish up my Swedish which, at that point, was purely theological. Days in the beautiful setting of Sigtuna, north of Stockholm, followed and for a whole month I enjoyed Uppsala. I was fortunate enough to meet the great masters of Oriental studies, such as H. S. Nyberg and Zetterstéen as well as the numerous historians of religion, in the first place Geo Widengren. But the high point was the connection with gamla arkebiskopinna, Archbishop
Söderblom’s widow, Anna, who received the young colleague of her husband’s friend with affectionate warmth. I enjoyed every minute of my stay and felt thoroughly spoilt—but how could I foresee that thirty-five years later the
Faculty of Divinity in this very place would confer upon me an honorary degree? I confess that I was proud, at that occasion, to be able to express part of my vote of thanks on behalf of the foreign recipients of degrees in Swedish, a language filled with precious memories, and I again enjoyed the fragrance of
the lilacs around the Domkyrkan and tried to read the lines that the ravens around the church spire kept on writing on the crystalline blue sky. For a modern student of Oriental languages, it seems unbelievable that we
never saw an Arab, let alone studied in an Arab country. But for post-war Germans even the smallest excursion into a neighboring country was a major
event. One event of this sort was my participation in the first International Conference for History of Religion in 1950 in Amsterdam, where I saw and heard the giants in that field. Among them was Louis Massignon, a figure that seemed to consist of white light, with barely any trace of a material body—a mystic, but a mystic who fought relentlessly for the underprivileged, for the Algerian Muslims, and who incorporated passion and love. Years later he talked to me in an overcrowded elevator in Tokyo about the secrets of the mystical rose, unaware of the noisy human beings around us.
Amsterdam opened my eyes to the numerous possible ways to interpret religion in its essence and its manifestations, philological, historical, theological, sociological, and shortly afterwards I obtained a doctorate in the History
of Religion from the Faculty of Divinity in Marburg. Yet, the Protestant
church of the province Hessen very soon prohibited the faculty from offering such a degree because its ideals did not tally with the church’s attitude toward the
study of non-Christian religions. And was there not the danger that a non-Protestant might receive a degree from a Protestant faculty? A brief visit to Switzerland in the spring of 1951 brought me in touch with the philosopher Rudolf Pannwitz, whose fascinating thought system—much too little known even in German-speaking countries—helped me appreciate better the philosophy of Muhammad Iqbal, the Indo-Muslim poet-philosopher. For the first time I also met Fritz Meier, the best authority on the study of Sufism—an admired model and, later, a wonderful friend to this day. A decisive event took place in 1952: my first visit to Turkey. I had received a
small grant to study manuscripts on Islamic prayer life in Turkish libraries, and fell immediately in love with Istanbul and with the wonderful hospitality of Turkish friends from whom I learned so much about Islamic culture as well as of Turkey’s classical past. At the end of my first stay I had still enough money left to fulfill my heart’s wish: I flew to Konya to visit the Mausoleum of
Maulana Jalaluddin Rumi who died in this place in 1273. After reading and translating his ecstatic verse for so many years, I simply had to go, and Konya, then a small sleepy town, did not disappoint me (as it does now, surrounded by rows of high-rise apartment buildings that seem to bar off spirituality). A thunderstorm at night transformed the greyish streets and little gardens into a veritable paradise; the roads were filled with the heavy fragrance of igde (musk
willow), and I understood why Rumi’s poetry is permeated with spring songs. It is not a topos or a worn-out image based on a Koranic reference to the resurrection—rather, he knew that the thunder was indeed like the sound of Israfil’s trumpet which announces the resurrection of the seemingly dead bodies. And did not the trees now don green silken robes, fresh from Paradise?
I loved Turkey so much that I returned next fall without a grant. In retrospect, these two stays look like a time of perfect ecstasy, and my major joy—besides the library work—was to discover Istanbul on foot. The librarian of the delightful Aya Sofya library took me around after work, reciting
a poem at each corner, so that I experienced the city through poetry. And often did I sit with the well-known poets of the country to discuss with them problems of modern literature—problems of a people that had been deprived of its time-honored Arabic alphabet in 1928 and was trying to shed its historical fetters. During my second stay, new friends helped me to gain access to another part of Turkish culture, to the best traditions of Turkish Sufism. There were
successful businessmen who yet would spend night after night in silent meditation, and there was Samiha Ayverdi, the towering figure among mystics and writers, author of numerous books and articles in which she conjures up the
traditional life. In her house I was introduced to the culture of Ottoman Turkey, and she and her family opened my eyes to the eternal beauty of Islamic fine arts, in particular calligraphy. I loved to listen to her discourses which went on in long, swinging sentences, while the sky over the Bosphorus seemed to be covered with clouds of roses. A few weeks ago, in March 1993, she passed away on the eve of the Feast of Fastbreaking, three days after I had kissed her frail hands for the last time. After experiencing such generous friendship by people from all walks of life, Germany appeared cold and unfriendly to me, and the prediction of my old colleague in Marburg seemed to be much more true than at the beginning of my career—there were enough people who did not like a young woman who, to add to this in itself negative aspect, had published a book of verse translations of Oriental poetry, not to mention a volume of German verse in Persian style and who was—even worse!—fascinated by the mystical dimensions of Islam instead of relying solely on the hard external facts, be it history
or philology. Therefore I more than gladly accepted the offer of Ankara University to join the recently created Faculty of Islamic Theology and to teach, in Turkish, history of religions although I was a Christian woman. The
five years that followed were beautiful, hard, and instructive. My mother joined me for many months every year and shared my love of the Anatolian landscape through which we traveled on long, dusty roads—the poetry of
Yunus Emre, the medieval Turkish bard, was my company. The years in Ankara gave me the chance to visit villages and small towns, to observe “the piety of the old women,” and to discuss questions of religious truth with Sufis and laypeople, to learn much about Islamic customs and practices. At the same time I had numerous friends who espoused Ataturk’s ideals, and I saw how a gap was widening, year by year, between the two faces of contemporary Turkey. Its result is a superficial Americanization of those who have forsaken their moorings in the Islamic-Turkish tradition, and a hardening
stance of those who, as a reaction to such a development, seek help in a legalistic “fundamentalism.” Of course, we visited Konya time and again. Shortly after our arrival I was asked to give a paper during the first public celebration of Rumi’s anniversary on December 17, 1954. It was the first time that the old dervishes could get
together for the sama‘, the mystical concert and the whirling dance, after Ataturk had banned the mystical fraternities in 1925 and prohibited their
activities. There they were—we saw them first in an old private mansion, whirling like big white butterflies and listened to the enchanting music. Thus, Rumi became even more alive and stayed with me as an unfailing source of inspiration and consolation to this day. Now, however, dervish dance and whirling have in most cases degenerated into a folkloristic play or a tourist attraction,
just as much as those who claim to translate Rumi’s poems into a Western language usually cling to a few, often misunderstood, concepts which would make the great mystical thinker-poet shudder. But who can still undergo the
hard training of 1,001 days, during which Rumi’s Persian works would be studied while music, whirling, and meditation slowly “cooked” the dervish until he was spiritually matured? Having reached a certain impasse in my scholarly work after five years, I decided to return to Marburg, not exactly welcomed by my colleagues. But in the meantime another strand appeared in my life’s fabric. Ever since
I became a student I had admired the work of Muhammad Iqbal, the IndoMuslim poet (1877-1938) who is regarded as Pakistan’s spiritual father and in whose poetical work Eastern and Western ideals, personified by Rumi and Goethe, are blended in a fascinating way. After Pakistan came into existence
in 1947, I was able to procure some studies about him, and a wonderful coincidence brought me, thanks to Rudolf Pannwitz, in touch with an old German poet who once had translated some of Iqbal’s poems from an English version into German verse. He had sent them to Lahore where they are now
on display in the Iqbal Museum, and as he could not read the two Persian works Iqbal offered him as a token of gratitude, Hanns Meinke gave them to me. It was the poet’s Payam-i mashriq, his answer to Goethe’s West-östlicher
Divan, and the Javidnama, the soul’s journey through the seven spheres. I could not help translating the latter work into German verse, and my enthusiasm was so great that I talked incessantly about his wonderful, reformist and yet deeply mystical thought so that my Turkish friends urged me to translate the epic into Turkish—not in verse, to be sure, but with a commentary. This led to my invitation to Pakistan at the beginning of 1958. There it was not only Iqbal’s memory and the repercussions of his work that I found; I became interested in the different languages and literatures of
the country and fell simply in love with Sindhi, the language of the lower Indus Valley. To read the mystical songs of Shah Abdul Latif (d. 1752) and his
successors proved a never-ending spiritual adventure, for classical Islamic thought, mystical trends, the admixture of Indian bhakti elements, and especially the concept of the woman as the representative of the soul in her quest for the Eternal Beloved fascinated me for years. I often remembered the wise
old Padmanaba of my childhood tale who introduced the young Arab into the mysteries of Sufism, for from the walls of the innumerable saints’ tombs in the countryside resounded the word: “People are asleep, and when they die they awake.” And the undulating cadences of Sindhi music led to a deeper love for Indian and, in general, Oriental music. Pakistan remained my main field of work after I left Turkey. Numerous journeys have led me there in the following years to this day, and I came to
know the different nooks and corners of the vast country—not only the steppes of Sind, dotted with little mausoleums, but also, at a much later stage,
the mountains in the north, and I often wonder what was the high point of some thirty visits to Pakistan. Was it the radiant morning in Islamabad when I was awarded the Hilal-i Pakistan, the highest civil distinction of the country, in a ceremony in which the Aga Khan also participated? Was it the drive to the Khunjrab Pass of 15,000 feet at the Chinese border? Or the flight along the
Nanga Parbat into the gorges of the young Indus? Or was it the incredible hospitality of the people even in the poorest village, the gentle gesture of an unknown guard who hurried to bring a glass of water for the honored guest from Germany? Or was it perhaps the flight in a small helicopter across southern Baluchistan to Las Bela and then to the sacred cave of Hinglaj in the Makran mountains, a Kali sanctuary which we finally reached on camelback? I
watched the political changes; had long talks with Mr. Bhutto and with General Zia ul-Haqq; saw the industrialization grow; the old patterns of life slowly disappear; tensions between the different factions intensify; ministers and heads of states changing or being killed. But the variegated cultural trends and the friendship of so many people (who usually knew me from my frequent appearances on TV) made me very much feel at home in Pakistan. My fascination with Pakistan—and the whole subcontinent—was supported in a quite unexpected way. In 1960, before being called to the University of Bonn to teach Islamic studies and the related languages, I had helped organize the International Congress for the History of Religion in Marburg. Five years later, American colleagues invited me to assist them a bit in organizing the next conference in Claremont, California. It was my first visit to the
United States. I enjoyed it, taking in everything from Disneyland to the Grand Canyon as well as New York, which never ceases to excite me. The conference itself clearly showed the historical approach to religious studies of the majority of Europeans and a more dynamic attitude advocated by a number of North American scholars. But more confusing for me than this somewhat worrying tension between schools of thought was Wilfred Cantwell Smith’s question whether I would consider coming to Harvard to teach Indo-
Muslim Culture. It was the famous Minute-Rice Chair which a wealthy Indian Muslim, infatuated with the Urdu poetry of Mir (d. 1810) and Ghalib (d. 1869), had dreamt of in the hope that his favorite poets would be translated into English to enchant the West as much as Fitzgerald’s renderings of Omar Khayyam’s Rubaiyat had done more than a century ago. No, I said; I was not interested at all—Urdu was not my field. And America? I had never thought of settling there. At that point I had still another reason to refuse the offer or, at least, to hesitate to accept it: after moving to Bonn in 1961, I edited from 1963 onward an Arabic cultural magazine with Albert Theile, one of the most ingenious creators of high-class cultural publications. Our Fikrun wa Fann was often praised as the most beautiful journal printed in Germany, and as I was not only responsible for the Arabic texts but in part also for the composition, I learned how to make a classical layout with scissors and glue until a perfect piece of work was achieved. In connection with our selection of articles, authors, and illustrations, we had to visit numerous museums, theaters, and ballets, and my horizon widened thanks to the lovely work which enabled me to indulge in my artistic interests and thus in a certain way supplemented my academic teaching. To leave my journal? No! And yet, who could resist a call from Harvard? I finally accepted, all the more since I did not see any chance for further promotion in Germany—as my chairman remarked: “Miss Schimmel, if you were a man, you would get a chair!” My contract with Harvard began in July 1966, but I used the first months to
buy books in India and Pakistan. Coming from Iran, I stopped in Afghanistan, whose natural beauty captured my heart—was not the sapphire lake of Band-i Amir taken out of a childhood dream? Later I was to return several times to this country with its hospitable people, traveling from Sistan to Balkh, from Ghazni to Herat, and each place was fraught with memories of Islamic history, resounding with Persian verse. I stayed again in Lahore and then proceeded to India, which in the following years was to become more
and more familiar to me—not only the north with its Moghul heritage, but perhaps even more the south. I found in the old royal cities of the Deccan—Gulbarga, Bidar, Bijapur, Aurangabad, and Golconda-Hyderabad—so many things that bore witness to the vast but little-known literary and artistic heritage of that area that again a new world unfolded, a world which I tried to open up to my students at Harvard and which enabled me to be of some help when Cary Welch prepared the glorious "INDIA!" exhibition in 1985 at the Metropolitan Museum.
In March 1967 I arrived at Harvard, only to experience a terrible blizzard on my very first morning. Nobody had ever told me that such events were quite common, just as nobody ever bothered to introduce me to the secrets of Harvard administration: the mysterious proceedings that ruled grades, term papers, admission meetings, the difference between graduates and undergraduates, and so on. How could one, acquainted with a completely different academic system (that held true for both Germany and Turkey), know all these things? The first semester was hard: not only was I made to teach an introduction to Islamic history besides Persian, Urdu, and quite a few other
subjects, but I also spent every spare moment in the bowels of Widener Library making the first list of the hundreds and hundreds of Urdu books which by then had arrived from the subcontinent. While we had only six or seven Urdu publications when I first checked the catalogue, Widener now boasts one of the finest collections of Urdu and Sindhi in the United States. "Harvard is the loneliest place on earth." Thus an American colleague had told me, and it was only thanks to my wonderful students that I survived those first years—students from India and Pakistan, from the Carolinas and from the West Coast, from Iran and from the Arab world, Jesuits and Muslims as well as Buddhists. They were my children, and they supported me when I went through phases of despair, and trying to help them solve some of their problems (not only scholarly ones but personal ones as well) helped me overcome some of my own problems. And as I had seen Istanbul through the eyes of poets, so I learned something about "the Cambridge ladies who live in furnished souls" through e. e. cummings' verse. My problem was that I had to teach in a language not my own, and while I had thoroughly loved teaching in Turkish, I always remembered my near-failure in English in high school, even though I had already published quite a few books in English. And worse: in Germany I could use the magnificent poetical translations of Oriental poetry made from 1810 onward, and when there was none available I translated a poem myself into verse. Here, I was like
a mute, unable to get these treasures across to my students. Or so I thought. After Harvard had offered me tenure in 1970, I grew more secure, and the arrangement to teach one semester with a double teaching load while spending most of the fall in Germany and in the subcontinent was, I think, beneficial both for my research and for my students. That such an arrangement was accepted by the university was largely due to the efforts of the trustee of the Minute-Rice money, Mr. James R. Cherry, whose friendship and wise counsel I have enjoyed ever since I came to this country. In the course of time, especially after moving into Eliot House, I felt more and more a veritable
member of the Harvard community, meeting colleagues from different fields of specialization through the Senior Common Room—something the member of a small, exotic department really needs in order to develop a sensitivity to the problems facing a major elite university.
Strangely enough, with my life on three continents, my literary output kept on growing. The United States compelled me to publish in English, which meant reaching a much wider readership than previously, when I wrote
mainly in German. I also enjoyed the chance to learn more about North America, since numerous conferences led me to most of the major campuses. Everywhere I found friends. UCLA was an almost regular site, where I attended many of the Levi-della-Vida conferences and was honored myself, quite unexpectedly, by receiving the Levi-della-Vida medal in 1987. There was Salt
Lake City and the stunning beauty of southern Utah; there was Eugene (Oregon) and Dallas; Chapel Hill and Toronto and many more; and there was Chicago with its fine group of historians of religion who included me among the editors of Mircea Eliade's prestigious Encyclopedia of Religion. It is fitting to mention here the ACLS lectures in History of Religion in the spring
of 1980, which took me from Tennessee and Duke to Edmonton, Alberta. I think I broke the record with the sheer number of my lectures on various aspects of mystical poetry in Islam, published in the book As Through a Veil.
The time on the other side of the ocean was filled with lecture tours to Switzerland and Scandinavia, to Prague and to Australia, to Egypt and Yemen, not to forget my participation in the festivities on the occasion of the 2,500-year celebration of Iran in 1971.
Often people ask me whether such a life between classes, typewriter, and ever so many lectures on a variety of topics is not exhausting. It may be at certain moments, but the joy one experiences when meeting so many interesting people and indulging in lively discussions after the lectures—over breakfast, lunch, and dinner—is certainly rejuvenating, for it fills the mind with fresh
ideas, and even the most stupid question of an untutored journalist or an inquisitive high school student may tell you that you could have tackled a certain problem more skillfully, or defined a formulation more lucidly. To be
sure, the constantly repeated question: "How is it that you as a woman became interested in Islam, of all things?" makes me increasingly impatient and even angry! The circle of my scholarly life, and that is almost coterminous with my life
in general, expanded. The fact that my American cousin Paul Schimmel (named after my father, who never knew of his existence) teaches at MIT and
was elected on the same day as I to the American Academy of Arts and
Sciences was and still is a great source of joy for me, and I feel proud of him and his loving family, with the two girls deeply interested in Islamic culture. It was certainly an experience to watch not only the development of my students (some of whom by now are retired ambassadors or grey-bearded
professors), but also to observe how spiritual seeds that existed long ago matured into wonderful flowers and fruits. When I learned how to handle the phenomenological approach to religion, which seems to facilitate the understanding of the external manifestations of religions and slowly guides the seeker into the heart of each religion, I became, and still am, convinced that such an approach can lead to much-needed tolerance without losing oneself in sweeping, dangerous "syncretistic" views that blur all differences. But could I have ever dreamt in those early years that one day I would be elected (in 1980) President of the International Association for the History of Religions, the first woman and the first Islamologist to occupy this office? Or could anyone have foreseen that I would be invited to deliver, in 1992, the prestigious Gifford Lectures at Edinburgh, the dream of every historian of religion, theologian, or philosopher? When I read in my second semester of Persian, at age seventeen, the Safarnama of the great medieval Ismaili philosopher Nasir-i Khusraw (d. after 1071), could I have imagined that some of my best students at Harvard would be members of the Ismaili community or that I was to become closely associated with the Institute of Ismaili Studies in London, where I like to teach summer courses and for which I translated (and now—thank heaven!—into English verse) poems from the pen of this very Nasir-i Khusraw?
And when I, near despairing in the Arbeitsdienst before entering the university, wrote a letter to the imam of the Berlin mosque asking him whether he
could find a family in Lahore with whom I could spend some time to learn Urdu (which, of course, was a purely utopian idea!)—who could have foreseen that more than forty years later, in 1982, one of the most beautiful alleys in
Lahore would bear my name? My entire life, lived in widening circles, as Rilke puts it, was a constant process of learning. To be sure, learning and re-learning history, as it happened several times in my life, made me somewhat weary of the constant shift of focus or of perspective in the political life of the countries with which I was associated. Perhaps, looking at the Islamic (and not only Islamic!) societies in modern times, one should keep in mind the ingenious insight into the patterns of ebb and flood of the tides of history as expressed by the fourteenth-
century North African historian Ibn Khaldun in his Muqaddima, parts of which I translated in my early days—and one tends (at least I do) to look out
for the unchanging power behind the fluctuating surface of the ocean of events.
My parents, wise as they were, taught me this in different ways. Without
my father’s understanding of the very heart of religion and without my mother’s ever deepening wisdom, her infinite patience with a somewhat unusual daughter, and her never-failing support, my life would have been quite different. A village girl who never had been to high school but was completely self-taught, my mother read the manuscripts and proofs of all my German books and articles and acted, as she loved to say, as the “people’s voice” and thus taught me to write with non-specialist readers in mind. But she also tried to check my tendency to enter too deeply into dreams of mystical love for, being supersensitive herself, she was afraid lest I lose my sobriety and my critical mind. Although it seems that the time of learning might now draw to a close, I still understand that every moment—even the most unpleasant one—teaches me something and that every experience should be incorporated into my life to enrich it. For there is no end to learning as there is no end to life, and when Iqbal says in a daring formulation: “Heaven is no holiday!” he expresses the
view, dear to Goethe and other thinkers, that even eternal life will be a constant process of growing, that is, of learning—learning in whatever mysterious way something about the unfathomable mysteries of the Divine, which manifests Itself under various signs. Suffering, too, is part of it; and the most difficult task in life is to learn patience. Learning is, to me, transforming knowledge and experience into wisdom and love, to mature—as, according to Oriental lore, the ordinary pebble can turn into a ruby provided it patiently takes into itself the rays of the sun, shedding its own blood in a supreme sacrifice. Perhaps a few lines I once wrote after visiting Maulana Rumi's mausoleum in Konya can express what learning means to me:

Never will you reach that silver mountain
which appears, like a cloud of joy, in the evening light.
Never can you cross that lake of salt
which treacherously smiles at you in the morning mist.
Every step on this road takes you farther away
from home, from flowers, from spring.
Sometimes the shade of a cloud will dance on the road,
sometimes you rest in a ruined caravansary,
seeking the Truth from the blackish tresses of smoke,
sometimes you walk a few steps with a kindred soul
only to lose him again.
You go and go, torn by the wind, burnt by the sun,
and the shepherd's flute tells you "the Path in blood,"
until you cry no more,
until the lake of salt is only your dried-up tears
which mirror the mountain of joy
that is closer to you than your heart.