Against the Stream: Reflections of an Unconventional Demographer [1 ed.] 0765802228, 9780765802224


English. 154 [162] pages. 2004.


Against the Stream
Reflections of an Unconventional Demographer

William Petersen

Digitized by the Internet Archive in 2023 with funding from Kahle/Austin Foundation

https://archive.org/details/againststreamref0000pete

Also by William Petersen

Planned Migration: The Social Determinants of the Dutch-Canadian Movement (1955)
University Adult Education: A Guide to Policy (1960; with Renee Petersen)
Population (1961, 1969, 1975)
The Realities of World Communism (1963; editor and contributor)
Nevada's Changing Population (1963; with Lionel S. Lewis)
The Politics of Population (1964, 1970)
Japanese Americans: Oppression and Success (1971)
Readings in Population (1972; editor)
Malthus (1979)
The Background to Ethnic Conflict (1979; editor)
Dictionary of Demography (1985-86; with Renee Petersen; 5 volumes)
Ethnicity Counts (1997)
Malthus: Founder of Modern Demography (1999)
From Birth to Death: A Consumer's Guide to Population Studies (2000)
From Persons to People: Further Studies in the Politics of Population (2003)

Against the Stream
Reflections of an Unconventional Demographer

William Petersen

Transaction Publishers New Brunswick (U.S.A.) and London (U.K.)

Copyright © 2004 by Transaction Publishers, New Brunswick, New Jersey. All rights reserved under International and Pan-American Copyright Conventions. No part of this book may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without prior permission in writing from the publisher. All inquiries should be addressed to Transaction Publishers, Rutgers—The State University, 35 Berrue Circle, Piscataway, New Jersey 08854-8042.

This book is printed on acid-free paper that meets the American National Standard for Permanence of Paper for Printed Library Materials.

Library of Congress Catalog Number: 2003066168 ISBN: 0-7658-0222-8 Printed in the United States of America

Library of Congress Cataloging-in-Publication Data

Petersen, William.
Against the stream : reflections of an unconventional demographer / William Petersen.
p. cm.
Includes bibliographical references and index.
ISBN 0-7658-0222-8 (alk. paper)
1. Social problems. 2. Demography. I. Title.
HN17.5.P485 2004
361.1—dc22
2003066168

For Renee, again

Contents

Introduction
On Utopias
Two Utopians Plan Towns
Population and Its Sustenance
A Founder of Sociology Blunders
An Anomaly in Western Europe
Canadian-American Relations
Revolting Berkeley Students
Student Rebels and Juvenile Delinquents
The Anti-Urban Bias in the Protestant Ethos
The Roots of Christian Holidays
From the History of English to Current Usage
Too Much of a Good Thing
On the Cause of Death
Index


Introduction

Concerning the topics discussed in this book, the principal point I try to make is that the view expressed is unorthodox. In my professional career as a demographer and sociologist, I have often been out of step with the generalities of the two disciplines. I believe that I was right and my fellows were wrong, and in any case the reader is offered a fresh slant on matters about which he may already know the conventional opinion.

During the past several generations, it has been dogma in both popular and scholarly writings that the growth of the world's population threatens to exhaust the resources on which the people's lives depend. In the process, this excess fertility, the canon continues, is contaminating the environment that sustains all life. This contention about the environment can be exemplified by an issue that has gained prominence over the past decade or two, the supposed warming of the earth's atmosphere. Like virtually everyone else, I lack the training needed to judge the arguments of experts who contend on both sides, but the other points that the doomsayers make suggest that their supposedly scientific stand may have a considerable infusion of politics. According to their familiar thesis, it is industrial


countries, and particularly the United States, that are fouling the human nest; and this Luddite, anti-industrial position is part of the standard environmentalist homily. The main professional task of demographers, it has been alleged, is to foster a decline in fertility, particularly in less developed countries. Contrary arguments are forcefully expressed, for example, in the several books of Julian Simon, and he also felt it appropriate to title his book about himself A Life Against the Grain: The Autobiography of an Unconventional Economist. In attempts to get a hearing for his minority view, he tried all sorts of maneuvers, with the result that those holding to official tenets labeled Simon not only a poor economist but a "buffoon." And when a Danish statistician, Bjørn Lomborg, read his works and himself wrote an overwhelmingly documented defense of Simon's conclusions, The Skeptical Environmentalist: Measuring the Real State of the World, he was unable to alter the dominant view about population and the environment.

As one who supported the judgment that the world is not on the point of manmade destruction, I also was out of the mainstream. But however sanguine I have been about the material progress of mankind, I have been unable to join the happy throng of all-out optimists. Twice in the twentieth century, international conflicts spread to become "world wars," and the word "genocide" had to be coined to represent a new phenomenon, mass murder by advanced technological means. The spread of "terrorism" marks a general breakdown of the civility that once governed at least the places where I was. To live in such times and ignore such trends is common, but hardly to


be recommended. I find the contrast between over-all "optimists" and over-all "pessimists" to be fatuous. I have been in neither camp, and my stance put me in a minority of a minority.

In addition to demography, I have had a scholarly interest in ethnicity, and in this context also I was something of an odd man out. Most of the academics who study ethnicity start from a concern about their own tribal heritage; three of the most prominent wrote first about Jews, about the Irish, and about the Italians before they expanded their research to general theses. But my background is hardly more than a joke. My father was born on Föhr, an island in the North Sea off the coast of northern Germany, and his native language was Frisian. This is an appellation that most Americans associate with cows, but there is such a language and the people who speak it have had a long but, in recent times, a not very glorious history. When the Roman general Drusus crossed the Rhine in 12 BCE, he encountered a tribe called the Frisii, whom he quickly subjugated. Historians do not agree about their origin: they may have been a Germanic tribe, or perhaps not. Their tribal name may have derived from a Germanic root meaning "friends" or "free men" or "edge dwellers" or "curly-haired ones," or, as still another alternative, from a non-Germanic source. During three hundred years the Frisii supplied the Roman army with produce from the cattle they raised along the coastal area from present-day Belgium up to Denmark. Between the fifth and ninth centuries, they dominated trade from southern France to Finland, and what we know as the North Sea was called the Mare Frisicum. In the fifth century, when Angles, Saxons, and Jutes invaded England, they went through Frisian territory and undoubtedly were accompanied by a number of Frisians—who appear, however, in none of the histories of early Britain that I have consulted.

The only present discrimination against Frisians that I can complain of is that few people have ever heard of them. Even so, I have found it convenient and pleasant to be of Frisian descent. Such an outlandish background can always act as a restorative when conversation lags, and with the blood of so undistinguished a people, I have been immune from jingoism. I have, however, none of the pride or indeed even the interest in my forebears from which to start my work on how ancestry affects members of other ethnicities.

I am an American by birth, and I share the common view that I am lucky to live in the best country the world has ever seen. However, this association has also been tarnished by my upbringing in Jersey City, ruled in those days by one of the nation's most corrupt political machines. My political orientation then should have led me to join with the millions supporting the New Deal, but I was put off by the fact that our unsavory mayor, Frank Hague, was an honored member of the Democratic National Committee, enjoying the full support of Washington's reformists. The patriotism I was taught in the civics class of my high school always had to compete with the reality of moral squalor in which I lived.

In short, in a number of crucial respects I happened to have been placed in an ambivalent zone between the politically correct and its opponents. After a commonplace adolescence of commitment to radical causes, I have been less often engaged in social conflicts than a supporter of third camps or an interested observer on the sidelines. The essays that follow exemplify this recurrent position "against the stream." Books on the


methodology of social disciplines recommend that research be conducted by a neutral being, the odd type that, with respect to many of the issues that I discuss here, I have become.

The commonplace faith in a utopia is not one I share, as the first two chapters make clear. I am a "sociologist" by vocation, but the secular worship of such a pioneer of the discipline as Emile Durkheim is in my opinion a false orientation, as I try to show in a review of his best known book. Perhaps the most routine cliché about population is that the food to feed the growing number of people is running out, and in the next essay I submit that simplistic thesis to a thorough review. I spent several years in Belgium, which I found to be a far different country from the image of most Americans or, indeed, most Belgians. I was a member of the Berkeley faculty during the riotous years of the 1960s, and my review of those events is not in accord with the conventional accounts, whether then or now. The very common prejudice of the middle class against hoi polloi is exemplified by how most people, but again excluding me, regard the behavior of college students. The political and cultural theses promulgated by mainline Christian churches are very often composites, made up of official positions of the urban headquarters and more traditional views of the rural membership. Christian holidays, which virtually everyone sees as the holy days of that faith, are also the deposits of borrowings from earlier religions. Everyone who respects the English language has trouble in formulating a suitable guide, and I suggest that the way the language developed in its continual change gives us important clues to what is "correct." The United States has become something of a paradise for gourmands and fressers; I contrast the two poles in the competition between eating and staying (or, more likely, becoming) slim. The cause of death, as seen by both laymen and most medical professionals, is something of a puzzle, and I attempt to clarify some of the basic questions.

This miscellany has one overall characteristic—that the point of view broached challenges the conventional politically correct.

On Utopias

Sir Thomas More (1478-1535), an English statesman, diplomat, and noted humanist, refused to subscribe to the Act of Supremacy, the law by which Henry VIII—in order to validate his successive marriages—substituted his own religious authority for that of the Pope. More was charged with treason, imprisoned, and eventually beheaded. In memory of his martyrdom the Roman Catholic Church canonized him in 1935.

Though he was a man of many talents, Thomas More is remembered today mainly as the author of Utopia. The book describes a country called Utopia ("no place," suggesting Eutopia, "a good place"). Its people lived alternately in the cities and in a surrounding agricultural belt, so that all but the top stratum both farmed and, for a portion of their lives, carried out urban tasks. In either sector, "you can scarce find five hundred men or women" who were capable of working and not engaged in some useful task. Every household had slaves, recruited from criminals or the poor of neighboring countries. The outstanding trait of Utopia, as also of most other imagined paradises, was a perpetual and unchanging serenity.


If we understand by "utopianism" a doctrine that propounds the creation of an impractical social system with supposedly excellent features, it is a common faith. All sorts of communities have been defined as models, with accompanying directions about the route to the realization of each version of perfection.

One common type of literary utopia is an eden from which mankind was expelled. The word eden came via Late Latin from the Hebrew word for "delight"; hence "a place of delight." A Sumerian version dates from a millennium or so before the familiar story in Genesis, to which it is strikingly similar. In the Sumerians' magical land of Dilmun, lions killed no lambs, wild dogs ate no goats, disease and death were unknown, fields were eternally fertile, water was plentiful, and everyone lived in peaceful bliss. However, one of the gods living in this paradise sampled plants forbidden to him, and this transgression was punished by converting him and his progeny into the inferior species of humans.

In recent times, paradise has often been portrayed as the domicile of the Noble Savage. In his book Property and Freedom, Richard Pipes has a fascinating passage describing how one version of this myth came into being. He begins with Columbus's own account (which reads as though quoted from many others) of the land he discovered: "It is perpetual spring, the nightingale sings, the flowers bloom, the trees are green, the rivers wind, the mountains are high, and the inhabitants are innocent and happy." This was not quite what later voyagers found; for example, the Caribs that Columbus had encountered raised captive children to be eaten. As the reality of the population of the Americas became evident, the locus of Europeans' paradise gradually shifted


to the South Pacific—particularly Tahiti, where all had an equal right to the abundant food, women were comely and accessible, and life was good.

Edens have had an extraordinarily good press. Lewis Mumford is characteristic in the enthusiasm that permeates his Story of Utopias. Man lives in both a material and a spiritual world, and "if the physical environment is the earth," he held, "the world of ideas corresponds to the heavens." As the heading of one chapter of The Road to Serfdom, Friedrich Hayek quoted a passage from the German poet Friedrich Hölderlin as a response: "What has always made the state a hell on earth has been precisely that man has tried to make it his heaven."

Not only utopians but residents of the real world, whether engaged in business, government, or any other activity, try to predict how the conditions of their work will change over the coming years. They demand expert advice on how many people they will have to serve; but these data, as given by census bureaus of all advanced countries, have proved to be thoroughly unreliable. And forecasts of population growth are only one of the extrapolations of social trends that statesmen and entrepreneurs use as a foundation for choosing between possible paths to the future. But the reasonable practice of prognosticating tomorrow's problems in order to prepare to cope with them is not the same as constructing an image of the whole of a perfect future society. Any policy points to tomorrow, which is linked to today by setting a course in a particular direction. According to quotations in the Oxford English Dictionary, "policy" denotes either "prudent, expedient, or advantageous procedure" or, on the other hand, "political


cunning, craftiness, dissimulation." The word derives from the Greek polis, "city," which is also the root of a weird list, everything from "police" to "cosmopolitan." The negative connotation is strong in the aura around the related words "politics" and especially "politician." This range of insinuation concerning words denoting policies and politicians suggests that, in the estimate of subsequent generations, many of the paths taken went agley.

Many social processes the world over have been "planned." Five-year plans spread from the Soviet Union not only to other Communist states but to such diverse countries as India and Brazil. In the United States, though featured in Communist propaganda as the last capitalist redoubt, the whole social-economic structure was altered by the government's response to the depression of the 1930s and the war of 1941-45. Though in part this seeming ubiquity of "planning" has meant only that a stylish word was applied to various modes of thinking and behavior, it is also true that the functions taken over by the modern state have extended far beyond traditional political economy to virtually everything in society.

In modern times the most familiar example of utopianism has been socialism, the type of society in which the state has become all-dominant. As the Oxford English Dictionary remarks, the history of the word socialism is "somewhat obscure." The English word may have been borrowed from the French socialisme, or it may have been a neologism used in discussions of the commune that Robert Owen founded in the 1830s. From that small beginning it snowballed into a variety of distinct and often quarreling denominations. The


miasmas from the concept began with the visionary ideals of various social theorists; continued to the "scientific" version of Marx and Engels, who taught that socialism is destined by the force of history to succeed capitalism; then to the rise of socialist parties in Western Europe, with typically no more than reformist programs; to the totalitarianism of Communist Russia, which metastasized around the world.

What do these various types of society have in common? At least as a goal, some remnant of the word's etymology is usually retained: not only socialism but also the words social, sociable, associate, and so on all stem from the Latin socius, a companion; thus, a comrade. A socialist society is also generally understood as one in which the means of production and distribution are owned by the whole community. This "ownership," however, has had nothing to do with control. More realistically, a socialist society is one in which private property and each person's legal prerogatives associated with it have been all but eliminated and the whole economy and society are controlled by a small sector of the population, called bureaucrats or commissars.

It is remarkable that the ideal worlds created by so many diverse minds are similar in two essential respects: they have of course little or none of the wonderful variability, the inconsistency, of unplanned and therefore changing human communities; and they are almost static, with change permitted only as further steps toward the ultimate goal.


Two Utopians Plan Towns

Here I want to consider how depictions of the ideal society have influenced the real world, and to do so by focusing on schemes that seemingly helped bring about the reforms they recommended. It is ordinarily difficult to trace the lines of leverage from plan to actuality or to estimate their weight, but this is less true of two famous utopias presented in books by Edward Bellamy and Ebenezer Howard. Though both men achieved fame far beyond anything one might have anticipated from either their modest beginnings or the schemes they offered, the books are worth reviewing in some detail because they have had a significant influence on American and British societies.

Edward Bellamy

Bellamy was born in 1850 in what is now Chicopee, Massachusetts, the son of a Baptist minister. After lackluster pursuits in law and journalism, he began the career for which he is remembered, the composition of quasi-literary representations of a future world from which all social blemishes had been eliminated. After a number of earnest short stories and novels, he


spent two years working on the book for which he is remembered, Looking Backward: 2000-1887. A sufferer from insomnia, the hero of the novel had an underground bedroom that afforded him absolute quiet, and on one occasion he slept there for 113 years, 3 months, and 11 days (the precision is in curious contrast to the whimsy of the major theme), to awaken in the home of the Leetes: a physician, his wife, and their good-looking daughter. Through the rest of the book, Dr. Leete describes the splendid city of Boston, part of the vast improvement that had taken place over these 113 years not only in the United States but in Europe and various other parts of the world, the crescendo of a general "era of unexampled intellectual splendor."

In this new era, all the nation's properties are consolidated into one state-run enterprise, directed by "a single syndicate representing the people in the interest of the common good." Everyone is an employee of the state, the sole employer, and each person chooses whatever occupation he prefers; the administrators adjust the benefits and disadvantages of each calling so as to bring into balance the number needed in the economy with that of new entrants to this sector of the work force. "File-leaders" and "captains" of the "Industrial Army" hold the rest of the population up to the highest standard of performance and permit no lagging. Workers are graded: any who are unsatisfactory are sent to do less pleasant jobs, and what might be termed Stakhanovites receive special badges of, successively, iron, silver, and gold. On the other hand, "all who do their best are equally deserving," and "the lame, the blind, the sick, and the impotent" have the same income as the most efficient. All


retire at age 45, not to pointless leisure but to a revivification in a fresh pursuit of artistic or scientific achievements.

Household tasks—laundry, cooking, whatever—are carried out at communal facilities. In place of trade, there is a direct and resolutely equal distribution from national warehouses, with the consequent abolition of both money and bankers, as well as of lawyers. Every store has precisely the same assortment of all goods, so that shopping takes very little time or effort. Individuals are permitted to accumulate a small credit from their monthly allowances, but it is expected that any substantial personal surplus shall be returned to the common stock. No one need fear for either himself or his children: the nation guarantees the nurture, education, and comfort of every citizen from birth to death.

Though some details that the author described hardly fit into the model, overall he intended to sponsor a centralized state instituted by the peaceful elimination of capitalism, in which religion and small-town values would cultivate a genuine individualism. In contrast to this total transformation of society's social-economic structure, little is changed from the technology of 1888, though indeed a waterproof covering is extended over sidewalks when it rains so that pedestrians need no umbrellas. From the moment he wakes from his long sleep and asks, "How came I here?" the Romantic Hero is a caricature. The daughter of his host, with whom from Friday to Sunday he falls in love, turns out to be a direct descendant of his earlier fiancée; "womanly compassion surely never wore a guise more lovely." And finally, as though abashed by permitting his imagination to roam so far from sober reality, Bellamy ended the


book with the still more fantastic notion that the whole of the long exposition of this flawless society, replete with all sorts of details, was only a dream.

In their review of various utopias, Glenn Negley and J. Max Patrick ranked Looking Backward at the top. "It is doubtful that any single utopia, including the classics, has had so great an impact." It sold over a million copies in the first few years, is still in print, and has been translated into more than twenty languages. Something like a hundred other volumes appeared in imitation of Bellamy's book, including two by William Dean Howells, a leading novelist of that day.

In his subsequent career as an all-round reformer, Bellamy helped found a Nationalist Party with many local "clubs," a newspaper, and a magazine. Over the years, intellectuals as prominent as John Dewey, Eugene Victor Debs, and Norman Thomas acknowledged their debt to Looking Backward, which can reasonably be described as a founding document of American socialism.

Ebenezer Howard

If the glowing reputation of Edward Bellamy is hard to explain, that of Ebenezer Howard, the founder and leader of England's Garden City movement, is virtually beyond belief. Born in 1850, the son of a small shopkeeper, Howard started work as a clerk at age 15 and, over the next six years, drifted from one trivial job to another. Then he migrated to the United States, acquired a homestead plot in Nebraska, tried to farm it, and ended up a dismal failure. Back in England, he set up a business partnership, which went bankrupt. Having learned shorthand, he became a parliamentary reporter and reached his apparent peak with a vocation that brought him an adequate income but nothing more. A nonconformist churchman, he moved in the solemn circles of religious enthusiasts, whose overlapping plans on how to conquer poverty and urban squalor were often based on novel tax schemes or the nationalization of land. The most significant literary influence on his thinking was Looking Backward, which Howard tells us he "swallowed whole." His conception of "the Garden City" was described in a book published in 1898, only ten years after Bellamy's.

Garden City was envisioned as a community with two interacting sectors—a town of 30,000 built on 1,000 acres and a surrounding agricultural belt of 5,000 acres with a population of 2,000. The farmers have their market at their doorstep, and the nightsoil of the urbanites can be used to fertilize the adjoining land. As he described it, the plan was starkly schematic, with a discussion only of the few details that a man of limited horizons might think important. The number of retail shops was limited in order to avoid wasteful competition, but the town's basic economy was covered by a mere listing of "factories, warehouses, dairies, markets, coal yards, timber yards, etc. all fronting on the circle railway which encompasses the whole town." Garden City was to be built on agricultural land, which could be purchased at bargain prices, and the costs of social welfare could be cut by offering private charitable agencies the opportunity of setting themselves up "in an open healthy district." Thus, the average resident of Garden City would pay (at 1898 prices) only about £2 a year in both rent and taxes.


In Howard's vision, Garden City was not an alternative to metropolitan living but ultimately a substitute for it. "There should be an earnest attempt," he wrote, "to organize a migratory movement of population from our overcrowded centers to sparsely settled rural districts," a traffic that would soon begin to feed itself. For as some abandon them, cities' "ground values will fall enormously," necessitating an increase in municipal taxes that will induce still more to move out. Ultimately, cities will be almost emptied, and the remnants can be torn down and their sites converted into parks.

Initially, no component of the developing Labor Party, neither trade unionists nor London intellectuals, responded favorably to the notions of this ardent visionary. George Bernard Shaw, who was well disposed toward him, regarded Howard as an "elderly nobody, whom the Stock Exchange would have dismissed as a negligible crank." Apart from Shaw and the noted city planner Raymond Unwin, Fabians were hostile or, more often, indifferent. Howard's early support came from such other fantasists as Lewis Mumford (who wrote an introduction to Garden Cities), Patrick Geddes, and Frederic Osborn, and from the association that they and their fellows founded, the Town and Country Planning Association.

Two Garden Cities were built more or less according to Howard's specifications. During its first year, the company formed to build the first one, Letchworth, was unable to sell enough shares to cover even the cost of the site; the construction had to be paid out of mortgages, which entailed large outlays for interest and administration. Since cottages had therefore to be leased at rents far higher than had been anticipated, the poorer


workers who were supposed to be the main beneficiaries could be accommodated only after the district council offered the credit facilities of a public authority. The complete commercialization of this first Garden City was averted only by public acquisition. The financing of Welwyn, the second of Britain's utopian communities, was also based less on business investment than on the support of well-to-do ideologues, and this project had also to be rescued from insolvency by a grant from the government. Contrast this history with the fanciful account in Lewis Mumford's book:

As a direct result of the plans advocated by Mr. Howard, a flourishing garden city called Letchworth has come into existence; which in turn has propagated another garden city, called Wellwyn [sic]; and at the same time has, by example, paved the way for numerous garden villages and garden suburbs in various parts of Europe and in America.

True, in under half a century this scheme of Ebenezer Howard blossomed into full-fledged government policy. Both the Town Planning Act of 1909 and the several laws that followed it were based on the premise that the salvation of the city was to be found in suburbanization. As William Ashworth wrote in The Genesis of Modern British Town Planning, "It was as though people...had decided that they could be reconciled to an unavoidable urban existence only if...towns were made as untownlike as possible." Cities were then, and are even more so now, the home of most Britons, and it is amazing that Britain's governments chose to follow a blueprint based on the premise that urban problems are insoluble within the framework of the metropolis. The institution of a large-scale program of anti-city planning is a strange fate for a country whose greatness developed and resided almost entirely in its metropolitan centers.


Suburbs

Until the rise of modern suburbs, urban and rural were generally viewed as sharply contrasting modes of life. "Civilization" was fostered, as the word's etymology suggests, in cities; the residents of the countryside were "boors" (from Dutch boeren, peasants) or "vulgarians" (from Latin vulgus, the common people). Apart from the landowning gentry, the elite lived in the city center and the most prosperous among them in houses around the main square. When Western societies began to build suburbs, most of them were designed for the well-to-do. By this residential distribution, it has been mainly the poor that live near their working places, while the middle and upper classes spend a good portion of their lives in hours of "commuting" (the Oxford English Dictionary defines "commute" as an Americanism). With the fathers resident only part of the day, suburbs are essentially the actual homes of women and their children, and there is a large shelf of books analyzing the deleterious consequences of this arrangement.

Whatever criticisms one can make of the schemes of Bellamy and Howard, they were, in one sense, successes, for both in the United States and England the planned communities they championed came into a quasi-existence. Letchworth and Welwyn are actual English towns, and there are dozens of counterparts in both Britain and America. It is difficult, however, to differentiate the two books' influence from the many other factors generating the suburbanization taking place. Universities were instituting departments of city and regional planning, and college towns were among the first urban centers to absorb some of the schemes into their layouts and economies. By now the administrators of most American cities include "planners"; and though their duties and authority vary greatly, their very existence is an important change from earlier times. In 1986, thirty-four academics founded a nonprofit society to promote teaching, research, publications, and public education on the history of city planning in the United States. This Society for American City and Regional Planning History (SACRPH) has published a semiannual Planning History Studies.

As the planning of communities came into being, business developers found it useful to assimilate some of the recommendations into their own commercial plans. One example of this sort of commercial-ideological combination is Columbia, Maryland. In 1963, the Rouse Company announced that it had bought some 14,000 acres, or one-tenth of that state's Howard County, and invited the residents to acquire one of the homes it would build. James Rouse, president of the company, wanted to construct an ideal community, and he assembled fourteen nationally known experts into a work group, which spent six months developing a plan. They set four guiding principles:

1. The city shall meet all the usual needs of its residents, including not only housing but also jobs, recreation, educational and cultural institutions, health care, and so on. Today, there are some 2,800 businesses, including more than 500 stores and restaurants serving the 32,000 residential units.

2. The city shall have a strong infusion of nature. Over 4,700 acres were set aside for parks, playgrounds, natural areas, and pathways.

3. The city shall, through its physical and institutional features, inspire and develop the best in man. Columbia set up a system of participatory government, founded a Columbia Foundation that furnishes seed money for nonprofit organizations, and in general fostered a spirit of fellowship.

4. The city shall be run at a profit, which has been done.

Within a few months 149 properties were sold at an average price of $1,500 an acre. In other words, it was the well-to-do rather than, as ostensibly in Letchworth, the urban poor that came to live in this community. The company also established the Columbia Archives, which houses, among other materials, papers relating to Rouse's long career as a businessman and a developer. It serves as both a historical source and a novel advertising medium. Apart from developing Columbia, Rouse built the country's first enclosed shopping center in Baltimore, served on President Eisenhower's Task Force on Housing, and laid the basis for the national housing legislation of 1990. After his retirement in 1981, he and his wife established the Enterprise Foundation, which set as its goal the provision of affordable housing to the nation's low-income population. He was awarded the Presidential Medal of Freedom as a hero who helped "heal the torn out heart" of America's cities.

As this example suggests, eventually Bellamy's scheme was partly realized not by the transformation to a socialist regime that he envisaged, but rather by the efforts of developers to cater to the desires of potential homebuyers. Columbia was both a successful planned community and a typical product of developers' aspirations to make money. Earlier, the fashioning of urban communities had been sought mostly through zoning laws. Originally zones were set to isolate such establishments as slaughterhouses from residential areas, but the political control went far beyond such a sensible regulation. Areas of the metropolis designated as residential could incorporate no commercial shops or businesses, and eventually homes were grouped by their costs, thus segregating social classes from one another. It is ironical that one of the early obstacles to the building of Columbia that James Rouse had to overcome was the ordinances that prohibited the mixed land uses featured in his scheme. Only after he managed to subvert these laws was he able, for instance, to combine expensive one-family homes with subsidized apartments. The utopian scheme of one generation had become the urban problem of the next one.

3 Population and Its Sustenance

Since Malthus's day, views on how population relates to its sustenance have alternated periodically between opposed extremes, typically with only a slight dependence on the real balance between people and food. During only two generations a double reversal took place in opinion on whether our civilization is faced with disaster because of too many, or too few, people. The successive moods can be exemplified, for the 1920s,

by Paul Mombert's Die Gefahr einer Übervölkerung für Deutschland; for the 1930s, by Enid Charles's The Menace of Underpopulation; and for the postwar decades, by the Club of Rome's The Limits to Growth. That these three works were published in Germany, Britain, and the United States suggests, correctly, that the cycle was international, followed in every developed country by demographers, state officials, and the general public. Each of the contradictory positions was bolstered, of course, by whatever evidence could be found or manufactured, but the changes themselves deprive all three positions of much plausibility; for population grows by generations, and fundamental
changes in its food supply ordinarily take place no faster. If they would take the time to read the no less fervent pronouncements of yesterday and the day before, today's enthusiasts might learn that the alternation of dogma was largely self-generated: the exaggerations of one decade stimulated a counterhyperbole fifteen or twenty years later. The tension was often raised by three devices: (1) population projections, (2) utopian comparisons, and (3) demographic universalism. If we are to remain within hailing distance of an empirical orientation, we must begin by discounting all three.

1. In both the antinatalist and the pronatalist phase, policy was recommended less to mitigate the actual population's deleterious effects, real or supposed, than to cope with its extrapolation either to "standing room only" (the title of a book by Edward A. Ross, a prominent American sociologist of the 1920s) or to the death of the last person of European stock (a spelling out of Oswald Spengler's "decline of the West" implicit, or occasionally explicit, in projections made in the 1930s). Obviously, a projection over a sufficient period of no matter how tiny an increase or decrease must end in the calamity that doomsayers predict, but of all guesses about the future, the least likely is generally that any social trend will continue unchanged. By now responsible demographers know that their number includes no prophets. To those worried about the "incipient decline" of Western populations, the baby boom came as a shock. Then came the all but universal expectation that the baby boom would have an "echo"—that is, a high birth rate when the large cohorts began to have children of their own. The computer facilitated
the extrapolation of ups and downs by technicians "concerned" about the "limits of growth" but quite unconcerned about the limits of their expertise. Eventually at least the official bureaus of the various countries learned to use so much forbearance that their projections proclaimed themselves as guesses. As an example, the 1972 projections by the U.S. Bureau of the Census offered a range for 1980, only eight years after the date of the estimate, between 222 and 231 million; for the year 2000 it was between 251 and 300 million, and for the year 2020 between 265 and 392 million.

2. Utopian comparisons. If data suggest that the food supply has not been falling or even has been rising, those determined to draw pessimistic conclusions have often responded with the assertion, certainly valid, that many of the world's inhabitants still remained malnourished. The nutritional needs of the human species, it should be stressed, are not well established. Under the leadership of the physician Ancel Keys, in the 1950s some American nutritionists argued that the quasi-official tables of caloric requirements were far higher than the body's actual demands. In the new millennium plaints of food shortages are made against a background of obesity epidemics in several Western countries. In any case, what any people eats is ordinarily determined not only by ecology but by culture. For example, the recurrent efforts to popularize soybeans, a cheap and efficient vegetable protein, have failed just as recurrently.

3. What I have labeled demographic universalism. All social phenomena involve people, and it is sometimes concluded that therefore population causes everything, and the larger the number of people the worse its effect. It is routine, for instance, to blame pollution
on large concentrations of population, which at most is a minor factor. The pre-Columbian Indians who lived in holes that they dug in a cliff at Mesa Verde filled the valley before them with their garbage, traces of which are discernible even centuries later. Whether an industrial city remains polluted depends not on its population but on engineering, economics, and politics; among others, Pittsburgh and London transformed themselves without sending off a single one of their inhabitants.

Having cleared the decks of some excess baggage, let us proceed to our destination. I shall argue that in a narrow ecological perspective the population-food balance has not worsened, that in those terms the prospects are reasonably good, and that the concentration on such factors as the alleged shortage of agricultural land diverts us from more serious—because more intractable—problems.

The Trend in Food Production

We do not know how much food the world produces. For sizable portions—including the most populous nation, China—no useful records are available, and some of the data that exist must be taken as very rough estimates. In most underdeveloped countries much of the food is grown on small family plots, whose produce seldom finds its way into statistics. This means that the figures compiled, mainly from market transactions, are often considerably lower than what is grown and consumed. Each year the Food and Agriculture Organization compiles the national statistics, such as they are. In a more complete review of recent reports, it would be appropriate to submit these data to a technical assessment. Here I want only to present some conclusions.

In 1955 and 1965, marking the tenth and twentieth anniversaries of FAO's founding, the annual reviews were supplemented with summary appraisals. During the first postwar decade, the production of agricultural food rose enough to repair the war damage and to make up for the population increase. During the next ten years, the production of food continued to improve, more than keeping up with the larger number of consumers. From the early 1950s to the late 1960s world production rose by 14 percent, or by 1 percent per capita, ranging from an increase of 27 percent from very low levels in the Soviet Union and Eastern Europe to the sole decline, of 4 percent, in Africa. Ever since Stalin's forced collectivization of farms in the Soviet Union, agriculture was a recurrent problem, and the statistics may well have been falsified. In Africa data on both production and population are especially poor, and one can hardly know whether the slight decline was spurious or not.

The reason for the overall favorable record is, of course, no mystery. Where the new varieties of food grains were successful, they overwhelmed the most optimistic anticipations. In West Pakistan, as a prime instance, a new race of wheat doubled the yield on a small experimental plot in 1964-65, and five years later it was growing on 6.5 million acres. Moreover, a more efficient use of land was adopted, so that in all less developed countries it was proposed to increase the land under crops from 210 million acres in 1965 to only 223 million in 1985. Growing populations would require roughly the same acreage, or even a bit less as marginal land was retired from agricultural use.


According to the latest FAO report available, The State of Food and Agriculture 2001, Afghan farmers harvested the largest crop since the agency started keeping records, though it was judged that that poor and troubled country would still require food assistance. The same was true of Haiti, where the inefficiency of a corrupt government kept 3.8 million people hungry. Worldwide it was estimated that in 1996-98 some 826 million people were "undernourished," with a shortfall of 100 to 400 calories a day or a lack of such essential nutrients as iron, vitamin A, and iodine. It was anticipated that the number of malnourished people would fall by about half by 2015. Even so, the FAO called on rich nations to increase their food assistance, for "not all countries can face the cost" of bringing their farming up to date.

Continuing Food Shortages

The development of the superplants initially swung opinion toward high optimism, but when it became obvious that not all the problems had been solved, it swung back again. There has been a genuine and highly significant advance in technology, but those awaiting utopia were disappointed. The high-yield plants demand large amounts of fertilizers and water, and the increase in production is largest on farms big enough to warrant mechanization. It was the relatively well-off farmers who were best able to adopt the new plants, and it was they who profited from them. Some of those who started with small or marginal plots were pushed down into the landless proletariat; thus, the areas of India that succeeded

most in raising agricultural production suffered most from rising antagonism between landlords (in the process of
becoming market-oriented entrepreneurs) and both tenants and laborers. Three revolutionary parties reaped the political harvest sown with the superplants' seeds.

When OPEC greatly increased the price of oil during the 1970s, the result was a disaster for countries like India, newly dependent on a large supply of fertilizers. The consequent food shortage, following immediately on the novel problem of finding export markets for the food surpluses, was much more serious than the inconveniences that Americans were subjected to. The cause of the sudden change, one should emphasize, was not ecological but rather political.

Political afflictions can be illustrated also by the six African states immediately south of the Sahara. The desert is spreading and threatening to engulf their 22 million people. Even in this ecological context, other factors contributed to the disastrous food shortages. In several of the countries a continuous civil war generated sizable refugee movements, and these destitute masses were prime targets for the cholera and measles that, at least initially, caused more deaths than malnutrition. Chad and Niger have no railroads, and during the spring rains the roads are impassable. Even so, large amounts of food were shipped into the area; their disposition can be illustrated by one egregious example.

Under the leadership of President Ngarta Tombalbaye, Chad was trying to imitate Communist China's cultural revolution. All persons with "bourgeois" contacts—thus, teachers, clergymen, bankers, and businessmen, for instance—were sent off into the bush for six weeks or more to undergo an initiation rite called Yondo. Reluctant recruits were beaten and burnt to purge them of their Western ways and convert them to "Chadtude" (coined from négritude to denote the goal of "authenticity"). When high school students beat up a French teacher, France withdrew all the teachers it had installed there. In short, precisely the small minority best able to cope with the emergency were attacked and subjected to systematic degradation. Finally, President Tombalbaye was killed in an army coup. As reported in the Economist of April 1975:

His survival as head of state for fifteen years was due more to his ruthless suppression of his opponents than to any personal popularity, and his regime had become increasingly ineffective and eccentric... Among the president's victims [had been] the commander of Chad's army.... His arrest was linked with a plot allegedly organized by Mrs. Guembang, a former leader of the women's section of the president's own party. When Mrs. Guembang came before a special tribunal, the prosecutor alleged that she had tried to engineer the president's death by burying a black sheep alive.

That Tombalbaye's regime was "eccentric" does not mean that it can be called an aberration. Amin of Uganda, Bokassa of the Central African Republic, and Macías of Equatorial Guinea, for instance, maintained themselves in power by even more ruthless means.

In the mid-1980s there was a devastating famine in Ethiopia. To gain a preliminary understanding of its causes, the best background reading would not be a work on desiccation south of the Sahara but Robert Conquest's The Harvest of Sorrow, a harrowing account of how in the 1930s the Soviet regime successfully planned to starve more than five million Ukrainian peasants to death. Communist Ethiopia had to deal in smaller numbers but, as well as it could, it followed that Stalinist model.

Like these African countries, others suffering from temporary food shortages could generally depend on emergency aid from reserve stocks in the United States.


A conference hosted by the Agricultural Research Service of the U.S. Department of Agriculture was attended by some 200 scientists from 27 countries. They reported a consensus that, of the crops harvested in less developed countries, between 30 and 40 percent were lost through waste, erosion, spoilage, disease, pests, and improper storage. Even a generation ago techniques existed to cut these losses to about 5 percent.

Conclusions

From the first to the seventh edition of An Essay on the Principle of Population, its author moved from an ecological to a sociological perspective, thus from a narrow focus on biological needs to an appreciation of the cultural-social-economic complexities that those needs connote, and—most remarkably—from an unrelenting pessimism to a cautious optimism. Even so, the simple relation between population and food is commonly called the "Malthusian dilemma." Those who use the phrase show their ignorance of Malthus's mature work, and they typically would not welcome such an emendation of his immature simplicity. Now as then, the dramatic positive check of mass starvation makes for better copy.

The major breakthrough in the technique of food production has probably not run its course. Whether we consider farming procedures or the supply of usable land, the problem has been solved both for the present and for at least the next several decades. To focus on these factors, as many who discuss the issue continue to do, means that our attention is diverted from the political, cultural, and social-economic factors that cause food shortages, though these factors are now far more important.

4 A Founder of Sociology Blunders

Emile Durkheim (1858-1917) is listed in every reference book with Max Weber and Karl Marx as a principal founder of modern sociology. Of the three pioneers, Marx was preeminently a politician, secondarily an economist, and his analysis of society per se was usually limited to an almost mechanical passage on class conflict. Weber was first of all a historian, distinguished because his view of history went far beyond the routine accounts of rulers and their politics. Of the three, Durkheim was the sociologist. With his axiom that "social facts must always be explained by other social facts," he made it a distinction of his works to ignore both personal attributes and psychological theories in explaining happenings in the transpersonal sector. And of the four books that make up his major contribution, Suicide (1897) is perhaps the most significant.

Suicide is caused, in Durkheim's view, not by a person's tragedy, or his mental breakdown, or his total frustration, but by "social currents." At first sight, imitation might seem to be the motive power of these "social currents." However, though contagion certainly exists from one individual to another, Durkheim did not
consider person-to-person contacts influential enough to affect the rate. Of the many discussions of Durkheim by American and European sociologists, none that I know of have subjected his curious notion of "social currents" to the thoroughgoing critique that so great a departure from empirical analysis would seem to warrant.

There was a consistency of national statistics over time: the rank of European countries' suicide rates remained almost constant during three periods of the nineteenth century. The incidence of suicide was consistently lowest in winter and highest in summer; Durkheim noted in italics that "not one country is an exception to this law," because of the annual cycle of the length of day, for most suicides take place in daytime, "when social life is most intense."

In Durkheim's view, suicide is a normal phenomenon in the sense that it exists in all countries, but it is also an evil that can be combated. His solution would be to establish outside the state new centers of communal coherence. Decentralization of the state's authority, a reform that had often been recommended, would deter the impulse to kill oneself only if it "simultaneously produce[d] a greater concentration of social energies" through "occupational decentralization": each corporation would acquire a supplementary function as a moral center.

From the suicide data of Europe, it was apparent that the Protestant countries generally had the highest rates and the Catholic ones lower rates. (Norway, Sweden, and England—Protestant countries with relatively low rates—he explained away on an ad hoc basis.) Both faiths condemn taking one's own life with equal fervor, but Protestantism permits a greater freedom to individual thought and thus is left with fewer and weaker common beliefs and practices than Catholicism. He drew two conclusions: first, that Catholic rates are lower than Protestant because Catholicism has a more intense collective life, which moderates any personal motive for suicide; and, second, that suicide rates increase with knowledge, which tends to alienate the educated person from his religious community.

This sums up his analysis of what he termed "egoistic suicide." He also posited two other types, "altruistic," to characterize a person who kills himself out of a sense of duty to a social superior who has died, and "anomic," to characterize the suicide of a person suffering from a weakness or absence of social norms, the condition that he termed anomie.

On Counting Suicides

A crucial problem is that the whole argument of the book rests on the statistics that Durkheim collected from whatever sources he could find, with no mention of how accurate an index they were of what he was examining. Almost all more recent analysts emphasize that official data about suicide are untrustworthy. One reason is that the difference between a suicide, a murder, and an accident is often indeterminate, sometimes even to the persons involved. Suicides are so defined by a coroner, who may out of sympathy with the bereaved persons deliberately ignore the evidence presented to him. Such a compassionate gesture, moreover, may be offered more frequently to Catholics, who generally are more concerned (irrespective of official doctrine) about covering over the sin of killing oneself. As a 1967 paper in the American Sociological Review remarked, "some reasonable men have rejected the use of 'public,' or
even any form of statistical, data for the study of suicide.” In the view of such critics, depending exclusively on such statistics and ignoring case studies, as Durkheim did, is likely to result in dubious conclusions.

One might expect that statistics on suicides in the armed services would be comparatively free of the faults of those collected in the looser civilian world. For that reason a 1979 article in Military Medicine is especially striking. The rate per 100,000 persons, as calculated from the same data, ranged from 8.4 (Office of the Surgeon General) through 11.0 (Office of the Adjutant General) to 16.3 (the author's independent estimate).

For whatever they are worth, statistics on recorded suicides in the United States are compiled by the National Center for Health Statistics. For 1999 the national rate was 10.7 per 100,000 population, and the states ranged from 22.3 for Nevada down to 5.8 for Washington. One wonders what Durkheim, who routinely contrasted the rates of various geographical areas, would make of this list. That Nevada, whose economy has largely depended on gambling, migratory divorce, and prostitution, should rank highest is not surprising, but the rest of the list would be impossible to explain by citing greater or lesser social coherence.

One of the most striking contrasts in the American statistics is between the male rate of 17.6 and the female one of 4.1, supporting Durkheim's curiously worded remark that "we know how great is woman's immunity to suicide." Similarly, the American rates by age category generally follow Durkheim's summation that "suicide increases regularly until old age." However, the rate for Americans aged 15-19 was higher than
it would have been in this progression, presumably reflecting the fact that adolescence is often a stressful period. Durkheim did not try to link the sharp and presumably universal contrasts by sex and age to his thesis that social coherence results in a relative immunity.

Catholics vs. Protestants

The principal conclusion of the book, that Catholic rates are lower than Protestant because Catholics constitute a more compact moral community, may be so, but there are several faults in the argument as Durkheim presented it. At the turn of the nineteenth century, France was nominally a Catholic country, but many Frenchmen were rather lax in their observance of their church’s rituals. In works about European Catholicism of that period, Ireland or the Netherlands is typically contrasted with France, where members of the church were likely to be more casual. The anticlerical bloc prominent since the establishment of the republic was rejuvenated by the Dreyfus affair, which was the dominant issue in French politics around the time that Suicide was accumulating its initial acclaim. Prelates of the Catholic Church generally agreed that Dreyfus was a traitor, but the opposed Dreyfusards included many of the church’s faithful members. Nor can one easily characterize European Protestants as a unit. Those in France were Huguenots, originally rigid Calvinists and perhaps still, around 1900, retaining some of their cohesive fervor. Lutherans were probably diverse, possibly more secular in Scandinavia than in Germany. Over all, statistics on religions were hardly better than those on suicide, and combining the two did not improve the validity of either set.


It is remarkable that Durkheim had so little to say about Judaism, for the contrast in suicide rates is great between Orthodox Jews and those more susceptible to modernist or secular influences. He merely mentions that the rate of the more traditional Jews was lower than that of Catholics up to about the 1870s, after which Jews "began to lose their ancient immunity." The contrast that he merely mentioned in passing is more striking than that between Catholics and Protestants.

That sociologists honor Durkheim is to be expected, for his whole scholarly oeuvre served to transfer phenomena that until then had been studied with psychological theories to the subject matter of their discipline. This constitutes his books' originality, but also their most striking deficiency. Not content with the reasonable thesis that sociologists have a legitimate role in the analysis of such phenomena as self-destruction, he tried to establish a monopoly for them. Indeed, that he made a point of excluding any psychic influences from his analysis was something of a provocation. Would it not have been more reasonable to analyze the phenomenon with both sociological and psychological concepts, since there is no contradiction between them? Maurice Halbwachs, Durkheim's student and friend, wrote a supplement to his teacher's work, Les Causes du Suicide (1930), in which he explained suicide as the consequence of both social and psychopathological stimuli.

5

An Anomaly in Western Europe

As with many so-called New Nations in Asia or Africa, Belgium’s precise limits and indeed its very existence derive less from a unified history or a cultural coherence than from manipulation by greater powers. The borders with France, Germany, and the Netherlands are completely artificial, with no natural feature to mark the political demarcation. The most important cultural division, the boundary between Romance languages to the south and Germanic languages to the north, which has remained more or less fixed for one and a half millennia, runs through the middle of the country. From 1795 to 1815, when the area was incorporated as part of France, an effort was made to convert it into a francophone province. Then the architects of the post-Napoleonic settlement, afraid that revolutionary France might rise from its grave at Waterloo, fashioned a buffer state on its border: present-day Netherlands and Belgium, united under Willem I of the House of Orange. He established Dutch as the official language throughout his kingdom, eventually alienating all his Belgian subjects. After an almost bloodless revolution, independent Belgium was founded in 1830.


Though Dutch-speaking Flemings constituted about two-thirds of the population of the new state, French was again its main official language, used in all governmental contexts and also by both the mercantile bourgeoisie and the Church prelates. Elementary schools were established in Dutch, but all middle and higher education was in French. It was routine among upper-class Flemings to speak Dutch to the servants and French among themselves, and with the gradual democratization of Belgian society, this link between language and social class spread throughout the whole people.

The Flemish Movement

In the first decades of the nineteenth century, self-conscious Flemings coalesced into grouplets led, for example, by a professor of literature, a minor government official, a novelist. They fostered a solidarity of the language community, attacking the opposed theme of Belgian nationalism defined as French-speaking. Their program can be best discerned in Antwerp, the cultural capital of Flanders and a city second only to Brussels in size, one of Europe’s great ports with a wide network of canals and dockways. In its dozen museums past glories are thoroughly recorded. The exhibits in the Archive and Museum of Flemish Culture, for example, are not of folklore and painting but rather of Flemish writers, musicians, and artists whose works were the collective base of an independent regional culture. On display, for instance, was Lodewijk de Raet’s book on the vervlaamsching (the conversion to Dutch) of the University of Ghent. A yellowing newspaper clipping quotes one-time Prime Minister Charles Rogier: “The best thing that Flemish girls can do is to learn French as quickly as possible, so that they can become houseworkers in Wallonia, where because of their industry and cleanliness they are much in demand.” Even music is presented to show the contrast: a score handwritten in Dutch but printed in French. Many of those honored in the exhibits were bohemians during their own lifetimes, prominent for their defiance of the cramped norms of Catholic Flanders.

The smaller museums and exhibits also define their purpose in regional rather than national terms. Thus, a partly social, partly commercial exhibit of stamps and coins in the little town of Sint-Niklaas included a series of posters on each village in the Waasland (an area of eastern Flanders), giving its history, the etymology of its name, and a photograph of one public building.

The question of what to call the language spoken by members of the Flemish movement was not to be answered simply. The linguist J. L. Pauwels wrote a fascinating essay, “The Difficulties in Giving Our Language a Name,” which is well worth summarizing. The Old High German phrase diu diutisca zunga means “the language of the people,” as opposed to Latin, the language of scholars. In one form or another this adjective, diutisca (“of the people”), is found in all Germanic languages, but with different meanings. Two variations evolved in the Low Countries: dietsch in Flanders and duitsch in Brabant and Holland. During the sixteenth and seventeenth centuries, a new terminology developed to distinguish the precursor of modern German, which had undergone a considerable evolution, from dialects or languages that had not done so: Hooglandsch Duitsch, “Upland Germanic,” or eventually Hoogduitsch, was contrasted with Nederlandsch Duitsch, “Lowland Germanic,” collapsed into Nederduitsch or Nederlandsch.

These designations (written with modern Dutch spelling) eventually acquired narrower meanings: Duits, “German,” the language of Duitsland, “Germany”; Nederduits, “Low German,” the composite designation of languages and dialects that are differentiated from Hoogduits, “High German”; and Nederlands, “Dutch,” the language of Nederland, “the Netherlands.” Nederlands, however, is in fact the language of a people that transgresses the boundaries of Nederland, and the double meaning of the word—one geographic and the other linguistic—has often resulted in confusion. Because of the commercial and cultural dominance of the province of Holland in the Netherlands, its dialect, Hollands, came to be the official language of the whole area.

These complexities were compounded when the various designations were translated into other languages. In a list giving the number of American tourists that had visited various countries, the U.S. Department of State—no less—cited Holland and the Netherlands as different countries. In Dutch, Nederland (singular) is the country of de Nederlanden (plural); in English the first term is ignored and the second becomes, in transliteration, “the Netherlands” and, in translation, “the Low Countries.” In French, the translation les Pays-Bas (plural) ordinarily means, on the contrary, the single country, but before “Benelux” was coined, it was occasionally used to designate the region. In English and American slang, the word “Dutch” has a number of opprobrious connotations dating from the Anglo-Dutch wars of the seventeenth century. The word reminds Europeans of its cognates, deutsch and Duits, “German,” and indeed sometimes still means that, as in “Pennsylvania Dutch,” the designation of immigrants from southern Germany.

Faced with this confusing choice, what words should proponents of the Flemish movement have used to designate themselves and their language? Nothing quite fits all occasions, and the selection of one or another alternative was seldom entirely neutral. An old term, Diets, survives in such phrases as de Dietse kunst, “the art of the Low Countries,” and at one time it was proposed that the movement adopt de Dietse taal, “the language of the Low Countries.” But during the 1930s a number of pro-Nazi organizations used the word to designate themselves, and this political coloration is ineradicable. When I asked in a bookstore for works on particular subjects in “Vlaams,” the young clerk sternly informed me that there is no such language. In its usual narrow denotation, Vlaams, “Flemish,” is one of the several Germanic dialects spoken in northern Belgium. All these dialects together were given a composite name, flamenco by the Spanish and flameng by the French. Thus, when the inhabitants of Brabant or Belgian Limburg call their language “Flemish,” they are continuing a usage first imposed under the Spanish and French occupations. Some in what everyone calls de Vlaamse beweging, “the Flemish movement,” insist that the language be called Nederlands.

After 1945 the struggle of the Flemish to achieve equality with the French-speaking Walloons finally attained a mass base and significant successes. Earlier laws had been passed permitting the use of Dutch in criminal cases, public administration, and education above the primary level. These initial advances were modest, especially since in practice the new regulations were often ignored. During both world wars the German occupation forces favored what they termed their Teutonic brothers over the Walloons, and for a period the Flemish movement lost momentum because of its alleged collaboration. But when the constitution was amended to put the two languages on a par, the goals of the pioneer struggles began to be realized.

What Does “Bilingual” Mean?

So long as French was the dominant medium of cultural, administrative, and commercial communication, a Fleming could rise above his father’s level mainly by acquiring a facile and accentless French. For those Flemings who moved into the upper middle class, in other words, their assimilation to the French culture of Wallonia was both a means of upward mobility and the first sign of its achievement. The few upper-class francophones who lived in Flanders were gradually depleted except in the Brussels area, which remained the site of continuing contention. Such persons were generally bilingual, but they acquired a certain prestige by reporting themselves simply as “francophones.” According to a 1970 survey by two sociologists, P. Kluft and F. van der Vorst, only 17 percent of the residents of Brussels and its environs, or far fewer than most persons had estimated, reported their principal current language to be Dutch. Spokesmen for the Flemish cause attacked the survey in the press and in Parliament, and the two authors, who had shifted to market research, refused even to discuss their earlier work with me.

The last census that included a question on language had been in 1947. Alternative sources of language data might be compilations, respectively, of French and Dutch identity cards, civil marriage ceremonies, or army service; but in all these statistics the person’s choice was distorted. Most officials in the Brussels area were francophones, and unless a person made a point of demanding a Dutch identity card, he was often given one in French. Even Max Lamberty, a well-known leader of the Flemish movement, had a “carte d’identité” rather than an “identiteitskaart.” Similarly, civil marriage ceremonies could be performed in either language at the couple’s option, but in many cases the bride and groom spoke a dialect rather than proper Dutch and the burgomaster could speak well only in French. Thus, many chose to get married in French, in any case the more chic option. The Belgian army was divided into two components, and Flemish soldiers often selected a francophone unit in order to perfect their school French.

In fact, the dimensions of the two language communities in Brussels and the surrounding suburbs are indeterminate. Depending on whether the response refers to the language the respondent spoke as a child or the one he currently uses more (with neither figure given very exactly), the statistics have varied significantly. Paradoxically, as the two languages became the criteria for more and more political decisions, data on their use became ever harder to assemble and analyze. If the disputes there could be settled, the long dissonance could be ended. But an ideologically based movement does not necessarily wither away when its purpose has been achieved; on the contrary, among some adherents the very success heightens the passion.

A Cultural Council for Flanders, founded in 1959, used public funds in a dozen ways to enhance Flemish self-consciousness. Commissions were established to review such diverse issues as judicial questions; fine arts; pedagogy at the secondary and college levels; movies, radio, and television; literature and libraries; the theater and music; and, most generally, cultureel vormingswerk, the task of molding persons through their culture. Moreover, as more Flemings moved into the middle class, their self-identity changed.

After the Second World War, the Flemish economy prospered. Amsterdam and especially Rotterdam were partly in ruins, but Antwerp had been successfully protected from Nazi depredations by the Belgian resistance movement, and the port became the nucleus of a new industrial region. This boom was in contrast to a declining Walloon economy, based as it was on nearly depleted mines and obsolescent factories. The long slump that the Belgian economy suffered in the 1950s did not alter the relative advantage that Flanders had begun to attain over Wallonia. A Ministry of Employment and Social Care, established in 1960, drew up a national plan by region: Flanders, Wallonia, and Brabant (the province containing Brussels), with each receiving a separate economic council as well as public-private corporations that fostered investment. In spite of the continuing concentration of financial administration in the capital, the potential that Flanders had gained could now be realized through the official effort to reestablish prosperity through economic decentralization. According to data gathered by the National Bureau of Statistics, average real income rose by 183 percent from 1963 to 1971. Many families changed their consumption pattern to a middle-class norm; such prior luxuries as home comforts, an automobile, amusements of various types, and travel abroad became common, and Flanders certainly did not lag behind Wallonia in this national improvement of daily life.

In sum, during one or two generations the Flemish rose from a subordinate status to parity with Walloons. Various dialects were forged into a unified language, the medium of several first-rate universities. In the dispute between the two communities, the point was often made that French is a world language and Dutch is not, but this does not mean that the Flemings were culturally deprived. In 1972-73, the Flemish publishers association surveyed about 150 families a month, which spent an average of 1,800 francs per year to buy eleven books, as well as borrowing twenty-one others. Of those purchased, 92 percent were in Dutch; those in French, English, or German were mostly technical works. Some of the popular books reflected the widespread search for roots; others were art books, works on history. One series had thirty booklets on what the conscientious Flemish visitor should be sure to see in thirty Flemish towns.

Problems with Two Languages

Most sizable towns and even some villages have two names, one French and the other Dutch. On highway signs, which for good practical reasons should be short and easily read, the rules are rather complex. In monolingual areas the names of the towns are given in the language of the area, but in the environs of Brussels both are given, with the Flemish name first for towns in Flanders and the French one first for those in Wallonia. Thus, on the road from Brussels to Liège, which happens to meander back and forth across the language border, one starts out with two kinds of bilingual signs (Liège-Luik or Luik-Liège as directions to the single city), and along the way the direction is only to Liège for some kilometers, then only to Luik, then again only to Liège. A tourist seeking the way to Namur, for instance, has to follow the road to Namen, and one reaches Malines by going to Mechelen, or Waremme by taking the route to Borgworm. Newspapers have reported that mail addressed with the French name to smaller Flemish towns is returned to the sender with a request for a better address. Ordinary letters sent to “Louvain” arrived with no trouble, but special delivery mail (handled by the telegraph office, in that city more Flemish-minded) could go astray unless it was sent to “Leuven.” “Mistakes” of this type became a routine part of nationalist ammunition.

The most significant victory of the Flemish movement was that instruction in secondary schools and colleges was no longer exclusively in French. Flemings used to be bilingual; now they were taught French relatively late as a second language or even, at the option of the parents, English instead. Walloon schools followed the converse pattern, and reputedly their knowledge of Dutch was slight because of a lack both of good teachers and of students’ motivation. Belgium became a bilingual country in which a serviceable knowledge of both languages was more and more anomalous. In The Ecology of Language, the distinguished sociolinguist Einar Haugen wrote that “any pair of languages functioning within the same cultural framework inevitably approach one another.” From the experience in Belgium, one should rephrase the generalization: any pair of languages functioning within the same cultural framework tend to approach one another unless they become the indicators of competitive subcultures, in which case the differences are more likely to remain constant or even to increase.

Belgium-Netherlands

If the Dutch spoken in the Netherlands and in Flanders is in fact one, then why have these two areas not joined together into Groot-Nederland, “Greater Netherlands”? The best discussion of this puzzle is in the works of Pieter Geyl. As he pointed out, the historical division along the language border had been distorted by a series of accidents, beginning with the fact that the army of Philip II of Spain was stopped at the Rhine and Meuse and ending with the foreign invention of Belgium as a buffer state against revolutionary France. Several of the ten or so little magazines of pre-1914 years seemed to be seeking a rapprochement between Flanders and the Netherlands through their binational editorial boards. However, very few on either side of the border favored such a merger. In the Netherlands political and social institutions have long been divided along religious lines, and if the Flemish Catholics had joined their coreligionists in the Netherlands, the always delicate political balance in the life of the Dutch would have been fractured. In the past the prime advocates of cultural unity were usually Catholic partisans, so that every Dutch non-Catholic was likely to oppose any step toward unity. Later, the religious issue continued to be significant, but in a different way. For several decades Dutch Catholicism became the most unruly sector of the world church, and Flemings were generally far from enthusiastic about their neighbors’ innovations in ritual and doctrine.

In the long struggle of Flemish nationalists to raise their region to cultural parity with Wallonia, they tried to substitute for the various dialects spoken in Flanders what is called A.B.N. (Algemeen Beschaafd Nederlands, “General Cultured Dutch”). A nationalist movement typically seeks roots in the speech of the common people, but more and more Flemish scholars consulted with their Dutch colleagues, for instance concerning the continual reform of Dutch spelling. Both sectors typically used the long-established language of the Netherlands, rather than “Southern Dutch,” which differed from speech in the Netherlands about as much as cultured American does from cultured English. Words and expressions imported from French were especially avoided in Flanders, while in the Netherlands an occasional French phrase suggested a pleasantly cosmopolitan chic. Apart from such minor differences, the cultured language in both areas has become one. In other respects they remain two countries.

6

Canadian-American Relations

English-speaking Canadians and the anglophones who live just to the south of them are in many respects indistinguishable. It is a common saying that the boundary that separates the two countries is the world’s longest undefended border. People and commodities move north or south with few impediments. Yet the amicable association at the surface does not go very deep. If we ask why this is so, the first response is obviously the disparity in size and power between the two nations. The population of the United States is roughly ten times that of Canada, and the two economies differ by more or less the same proportion. Many Canadians resent their seeming status as an offshoot of their very much larger neighbor.

Canada’s history linked it with Britain, its geography with the United States. Still today most of Canada’s 3.8 million square miles is close to uninhabited; almost all Canadians live within a hundred miles of the southern border. Moreover, all the continent’s natural features run north and south. The Maritime Provinces are like extensions of New England; southern Ontario and Quebec are a peninsula thrust into the center of American industry; the plains, and to the west of them the Rockies, stretch from Mexico to the Arctic; the Pacific coastal regions of the two countries are similar in landscape and economy.

When an Englishman hears a Canadian speak, he often identifies him as an American; when an American hears him, he may perceive him as an Englishman. In The Cambridge Encyclopedia of the English Language, the intermediate linguistic position of Canadian English was exemplified with several photographed signs, such as one that advertised a sale on “tires” (rather than England’s “tyres”) offered by “Sammy’s Service Centre” (rather than the American “center”). There is often a variation within Canada; one survey showed that Ontario high school students generally spelled the word “colour,” while those in Alberta spelled it “color.” There is also a double source in vocabulary: British “tap” (for faucet), “railway” (for railroad), and “braces” (for suspenders) coexist with American “gas” (U.K.: petrol), “sidewalk” (U.K.: pavement), and “wrench” (U.K.: spanner). Some words are pronounced as in England, some as in the United States, some with different pronunciations dominant in different regions of Canada.

What is Canada?

Every history of Canada has a large section on how arduous an achievement it was to persuade the far-flung sectors of British North America to unite. Lord Durham wrote in his famous 1839 report on the British colonies, “I found two nations warring in the bosom of a single state.” Nevertheless, following his proposal, what were later named Quebec and Ontario were united into a new entity, the nucleus of the subsequent nation. In 1867, when Ontario suggested that this core be expanded, all the future provinces opposed confederation; it took skillful politicking and money to induce them, one by one, to more or less blend in. In the opinion of George F. G. Stanley, historian at the Royal Military College, “The union of 1867 had, in large measure, been the response of British North America to the threat of absorption by the United States.” In sum, as the Canadian historian Arthur Lower put it, “The Dominion of Canada was to reach the Pacific without the inner concept Canada having been born.”

Hostility between Americans and Canadians was aggravated in continual disputes over the two countries’ boundaries—between Maine and New Brunswick in 1783, leading to the so-called Aroostook “war”; following President Polk’s famous slogan in 1846 of “54-40 or fight,” by which he laid claim to the Oregon territory; in the controversy in 1900 about Canada’s border with Alaska, with a settlement that excluded Canada from any inlets along the Alaskan panhandle; and so on.

Over the whole of the country’s history the vast distances separating Canada’s thinly populated areas encouraged the persistence of local patriotisms. The Canadian historian Alan A. Brookes quoted a comment, “Large parts of the countryside and the smaller towns [of Nova Scotia] still bear the hallmarks of the original settlers—traditions, beliefs, prejudices, and habits of life and work,” and noted that the observation applies to other areas of Canada as well.

Of Canada’s efforts to bring about a greater unity across its wide expanse, perhaps the most important was the construction of two rail lines to the Pacific. In 1896, at the end of a long economic depression, the Liberals were returned to power, and they welcomed it energetically. The transcontinental road of the Canadian Pacific Railway, which had been built by the Conservative Party with great difficulty and scandalous generosity, was duplicated not once but twice; and the two new systems, bankrupt before their main lines were completed, were later combined into the Canadian National Railways.

Migration between United States and Canada

Migration in either direction between Canada and the United States has not generally improved relations. We can start at the beginning, with the founding of the United States. After the American war of independence ended with victory for the rebels, nine of the thirteen new states passed bills exiling the most prominent of those who had remained loyal to the British crown; all the states imposed double or triple taxes on those identified as traitors and confiscated their property. A commission that Britain established to cover the losses that Loyalists had suffered reviewed 4,118 claims and authorized almost £3.3 million in total compensation. An estimated 50,000 of those who had supported the Crown migrated to British North America, where they were given free farms. So many settled in one area of Nova Scotia that it was organized as a new province, New Brunswick. There and elsewhere these migrants became the core of a hyper-British population with a visceral hostility toward the United States.

That this scrap of history is still relevant is suggested by a recent novel by one of Canada’s best-known writers, Robertson Davies. In Murther & Walking Spirits (1991), one of the hero’s (and the author’s) forebears was a British officer stationed in New York in the 1770s. One Sunday he listens appreciatively to a sermon condemning Boston’s rebels: “Their pretensions are noble, but their trade is treachery.” The taxes they complain of are “just imposts, meant to pay the cost of protecting the Colonies against many enemies.” In the war that the rebels started, the officer died “a soldier’s death,” and his wealthy widow and her family paddled a canoe up the Hudson, finding refuge north of the border. She “was my great-great-great-great-grandmother. Here she was risen from the waters into the land which was to be mine.”

The War of 1812 was a continuation, after a lapse of several decades, of the American Revolution. When an American force of poorly equipped and ill-trained volunteers and militia attempted to invade Canada, the three-pronged drive met with three reversals. The effect on Canadian-American relations, one can presume, was to reinforce any antipathy that had been established earlier—and to show patriotic Canadians that they could retain their independence.

With lines through an almost empty countryside, both rail companies set up highly competitive colonization bureaus in order to develop business along their routes. Their efforts overlapped with the one that the new minister of the interior, Clifford Sifton, had launched. If Canada was to get out of its doldrums, Sifton asserted, it had to settle the empty West with producing farmers. Prodigious amounts were spent in campaigns to stimulate migration from Europe to the prairie provinces, but many supposed immigrants turned out to be transmigrants, using the subsidized facilities to get across the Atlantic and then going on to the United States.

How seriously migration to the United States countered the expensive effort to encourage immigration to Canada cannot be stated precisely, for the data are either poor or altogether lacking. The Canadian-born American demographer Nathan Keyfitz remarked that his reconstruction of the past migration was “intermediate between pure fact and pure speculation.” For the century 1851-1950 he estimated that there were 7,233,000 immigrants and 6,632,000 emigrants, leaving a net immigration of only 601,000. A subsequent estimate by the Canadian Craig McKie covered a longer period, 1851 to 1991, and it offered an only slightly more favorable picture: 12.5 million immigrants and 7.9 million emigrants, both Canadian-born and immigrant, mostly to the United States. By this estimate, over 140 years of costly effort to promote the growth of its population, Canada succeeded in importing and retaining 4.6 million, or an average of 32,800 per year.

During the war in Vietnam there was a migration of American draft dodgers to Canada, one of whom, Kenneth Fred Emerick, based his conclusions mainly on a survey that he administered to twenty-one “military resisters” (or deserters) and twelve “draft resisters.” From these sparse data, he concluded that the respondents were of reputable middle-class backgrounds, who as they saw it had a choice between accepting military service in a war they abhorred, going to prison, living underground in the United States, or going to Canada as “political refugees.” Forbidden to work or to apply for welfare, they lived mostly on small sums from their families or friends at home. Once registered as landed immigrants and thus able to apply for work, they generally got menial jobs, with wages from 10 to 30 percent below those for comparable work in the States.
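The net balances behind the Keyfitz and McKie estimates are simple arithmetic, and they can be checked directly. A minimal sketch (the figures are those quoted above; the variable names are mine, and the per-year average uses the 140-year span named in the text — note that the raw quotient is about 32,857, which the text rounds to 32,800):

```python
# Keyfitz's estimate for the century 1851-1950, figures as quoted.
keyfitz_immigrants = 7_233_000
keyfitz_emigrants = 6_632_000
keyfitz_net = keyfitz_immigrants - keyfitz_emigrants  # net immigration: 601,000

# McKie's estimate for 1851-1991, figures as quoted.
mckie_immigrants = 12_500_000
mckie_emigrants = 7_900_000
mckie_net = mckie_immigrants - mckie_emigrants  # 4.6 million retained

years = 140  # the "140 years of costly effort" in the text
average_per_year = mckie_net / years

print(keyfitz_net, mckie_net, round(average_per_year))
```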


Some Canadians were sympathetic and sided with those opposed to the war in Vietnam, with the consequence that any antipathy they felt toward the United States was reinforced. Others found the American draft dodgers and deserters offensive; as Emerick remarked, “discussions with Canadians revealed some reluctance to hire any American due to their undependability.” The consequence in either case was the same: any anti-American sentiments already existent among Canadians were strengthened.

In the post-1945 years many Europeans and Asians sought an escape from hardship and austerity; and from the other side the declining fertility and aging of Canadians motivated the government to seek more immigrants. In contrast with the earlier focus first on Britons, then on other Europeans, Canada welcomed Asians. Today one resident of Vancouver in six is Chinese, and almost as many originated from other countries of Asia. In its effort to treat all minorities equally, the federal Department of Canadian Heritage (!) has published a book, Multiculturalism: Respect, Equality, Diversity. 12th Annual Report on the Operation of the Canadian Multiculturalism Act (1999-2000).

Two Economies, or One?

American firms repeatedly found it profitable to set up auxiliary plants in Canada, if only because that opened up the market in all the British dominions. The vast investment in the automobile industry, metal processing, chemical industries, and the manufacture of electrical equipment obviously benefited Canada. In the 1950s close to 100,000 new jobs were created, with wages higher than in the rest of the Canadian economy.


But, as Professor William Kilbourn at York University remarked, “Many people regarded the growing integration of the Canadian and American economies in the post-war period as a big step towards the inevitable disappearance of Canadian independence.” He mentions as one important source of discord Americans’ anti-Communism: “at the height of Joseph McCarthy’s reign of terror,” the Senate’s investigative committee “indulged in repeated acts of public vilification,” leading in 1957 to the suicide of “one of Canada’s most respected public servants, Herbert Norman.”

In accordance with guidelines set by Ottawa, many of the post-1945 immigrants have had professional or industrial skills—and again a substantial portion of the expensively sponsored immigration was canceled by emigration. Canada’s economy has been relatively stagnant. A symbol of economic woes is the comparative value of the two countries’ dollars: since 1971, when the currencies were set afloat, the Canadian dollar has fallen by a third. At the turn of the century, one Canadian dollar was worth 66 U.S. cents. Between 1961 and 1998, according to an estimate by Canada’s prestigious Fraser Institute, the average Canadian family’s taxes increased by 139 percent, and as a consequence an estimated fifth of the country’s GDP was produced in Canada’s underground economy. According to the Wall Street Journal (February 24, 2000), “The new consensus in Canada is that high taxes are smothering the economy and preventing the nation’s young people from getting jobs.” Skilled immigrants to the United States not only get higher salaries, but these are especially advantageous when reckoned net after the lower taxes.


In 1987, when the U.S. State Department announced a lottery for 10,000 permanent resident visas open to citizens of thirty-six countries, nearly 80,000 Canadians applied in the first week.

Toronto has some 3,100 firms involved in information technology and telecommunications, and this base could expand into another Silicon Valley—except that the requisite personnel are lacking. A new American law establishing a “temporary” working status (TN-1) thus relieved employers of the prior necessity of proving that offering a job to a foreigner would not deprive an American of work. TN-1 status is valid for a year and can be renewed indefinitely, facilitating the influx of many ostensibly temporary recruits. From the 2,677 Canadians admitted with TN-1 status in 1989, the number grew to almost 27,000 in 1996.

Under a page-one banner headline, “Canada Bleeding MDs, Nurses to U.S.,” the Toronto National Post described a brain drain: the sum of doctors, university professors, computer engineers, and so on who have received a good education in their country and have now left it. The lure of the south, all but irresistible to many individuals, is worrisome to Ottawa. The government has responded to complaints by noting that for seven consecutive years the United Nations chose Canada as the best country in the whole world in which to live.

Canada’s Ills and Possible Remedies

The final chapter of George Woodcock’s book, similar to the summing up in one work after another, is titled “Canada Identity Crisis.” In writing The Canadians, he remarked, he avoided using the word nation, for he was not certain that his “country” is one.


One of the most pessimistic forecasts of Canada’s future is by Lansing Lamont, an American journalist who has long worked in Canada: Breakup, “a wake-up call to a country in mortal danger.” He argued that Quebec’s recurrent flirtation with secession is both more important than it is often thought to be and also is only the most prominent of disruptive trends. There is now an organization based in Ottawa, Canadians for Language Fairness, which answers Quebec’s separatists with a counteroffensive—for example, by collecting cases of competent monolingual persons who were fired in compliance with the law setting bilingual standards. When “multiculturalism” was set as national policy, this encouraged other minorities, particularly Canadian Indians, to seek their own special advantages.

A work edited by R. Kent Weaver, a senior fellow at the Brookings Institution, with contributions by professors from Queen’s University, the University of Montreal, and Harvard, is titled The Collapse of Canada? It is a more detailed review of two of the same themes, the apparently irreconcilable differences between Canadians and Canadiens and the constant conflict between the center (Ontario and Quebec, where some 60 percent of the population lives) and the outposts both east and west. The several attempts to draw up a new Canadian constitution, in order—among other concerns—to set guidelines for the incessant squabbles between Ottawa and the provinces, have all failed. In 1990, for instance, one more effort was made to fashion a new constitution, the so-called Meech Lake Accord. The clause that generated the most hostile reaction was the proposal that Quebec be recognized as “a distinct society” within Canada. Public opinion outside Quebec ran strongly against the accord, and within Quebec support was reported to be lukewarm. The required unanimous approval of all provinces failed by two votes. One reason for such fiascoes is that federal and provincial governments have created institutional niches for contrasting views about how to meld state and market, individual and community, and region and country. Thus, full-scale constitutional reform would require a thorough restructuring of virtually all the country’s social relations, a solution that no one thinks is feasible.

United States in the World

Canadian-American relations have been affected also by the changing status of Canada’s larger neighbor. The end of the cold war with the victory of the West and the total defeat of Communism has had, from the point of view of democrats, one undesirable consequence. In a unipolar world everyone tends to unite against the dominant single power. Peter Rodman of the Nixon Center has written a book, Uneasy Giant: The Challenge to American Predominance, of which he gave a summary in the Summer 2000 issue of National Review. There he wrote:

Most of the world’s other major powers—even our friends—have made it a central theme of their foreign policies to build counterweights to American power. In fact, their efforts in this direction constitute one of the main trends in international politics today.

In view of such a consensus, it is not surprising that anti-American sentiments in Canada, always one component of its national identity, seem to have sharpened.


In the spring of 2000 there appeared an advertisement for the Molson brewery that featured a handsome Nova Scotia actor in a plaid shirt, giving voice to Canadians’ resentment about Americans’ ignorance of their next-door neighbor: “I’m not a lumberjack or a fur trader. I don’t live in an igloo, eat blubber or own a dogsled. I have a prime minister, not a president....I believe in peacekeeping, not policing; diversity, not assimilation. I speak English and French, not American....Canada is the second-largest land mass, the first nation of hockey, and the best part of North America. My name is Joe and I am Canadian.”

As Steven Pearlstein of the Washington Post commented:

Annoyed by the depreciation of their dollar, frustrated that their kids are moving to New York and San Jose, fearful about the arrival of privatized, “American-style” health care, and outraged that a Canadian team hasn’t won the Stanley Cup in ages, Canadians are finally fighting back. Their weapon? What else, a beer commercial.

The Toronto advertising firm that put the ad together based it on its research among young Canadians (potential customers for beer), who were far less diffident than their forebears had been in proclaiming their moral superiority to the United States. The program was astoundingly successful, and Molson followed it with a Website identified with the key slogan, “I am Canadian” (www.iam.ca), which provides unlimited free replays and hosts a chat room where others can enlarge on what they think it means to be a Canadian. The whole affair is characteristic of the national sentiment it symbolizes—a joke with a serious underpinning.

In many ways anglophone Canadians and Americans are—to repeat—all but indistinguishable, and with the massive presence of United States business, politics, and culture in Canadian life, a Canadian is likely to establish his identity by emphasizing the not-American part of himself and his nation. According to almost a consensus, today the principal unifying sentiment of a country lacking oneness is hostility to the United States; to be Canadian means mainly to be different from Americans and possibly also hostile to them, thus continuing the view that two centuries of uneasy relations between the two countries had helped establish.


7

Revolting Berkeley Students

Sometimes it is useful to review a “happening” that, when it occurred, was seemingly properly diagnosed but in fact was generally misunderstood. Let us begin with a simple account of the main events on the campus at Berkeley, a review of the record that is not in dispute. I was a professor there at the time, and my report on the riots and the reactions to them is an account of what I witnessed—and tried to counter.

In the 1950s an extremist grouplet tried to gain political advantage, and one of its representatives, running for office in a student organization, declared that he would not serve unless the whole of the slate of which he was a part was also elected. From this incident a successor radical student organization called itself “Slate.” It ran on a platform demanding that the student government “should take stands on national and international issues.” However, the public university’s charter stipulated that it and its official subdivisions must remain “free from political influences”; this restriction was intended, quite reasonably in my opinion, to keep a tax-funded institution immune from direct competition between political parties. Resoundingly defeated


in this initial attempt to politicize the campus, Slate continued to press for a leftist program, but with little success until the mid-1960s. On September 10, 1964, Slate distributed a letter from “A former student” calling on undergraduates to “organize and split this campus wide open,...to perform civil disobedience at a couple of major University public ceremonies....If such a revolt were conducted with unrelenting toughness and courage, it could spread to other campuses across the country.” Also included was such counsel to students as “you must cheat to keep up.” This statement came before any of the events that, allegedly, later sparked the protest.

First, this Slate leaflet called for illegal action; it was only later that the dean of students announced that a 26-foot strip of land at the entrance of the campus was University property and would therefore, except for occasions specified by the chancellor, be subject to the existent policy prohibiting the use of its facilities “for the purpose of soliciting party membership or supporting or opposing particular candidates or propositions in local, state, or national elections.” The ruling did not apply, however, to the strip of the city sidewalk between the campus boundary and the curb, where such activities, not under the control of the University, were authorized by the city.

In response, Slate hired a lawyer to consider legal action against the University, and representatives of eighteen student organizations met with the dean of students to protest the rule limiting certain types of political solicitation to off-campus localities. The dean accepted most of the students’ proposals, specifying only that solicitations be directed at other students rather than Berkeley citizens, and a few days later the chancellor


made further concessions. Even so, several students manned tables in the zone that was out of bounds, and Mario Savio (who would become nationally famous as a principal leader of the rioters) announced that this action was, in his words, deliberately “in violation of University edicts to the contrary. We realize we may be subject to expulsion.” As a gesture of protest, eight students stayed in the administration building overnight, and in response to their invitation to raise the stakes, they were suspended. Then the demand became not only that the University abstain from disciplinary action against “the Berkeley eight” but that no sanctions be imposed on any other students participating in whatever demonstrations might be brewed in the future. That evening several student organizations coalesced into what they called the “Free Speech Movement,” or FSM.

The following day, when a nonstudent, Jack Weinberg, manned a table to solicit political support for some cause or other, he was arrested for trespassing and placed in a police car to transport him to the headquarters of the campus security force. The car was surrounded by several hundred students lying prone around it, and Savio climbed on top of the car to use it as a soapbox. Weinberg was held in the immobile car for thirty-two hours, being fed by his comrades and given a bottle in which to urinate. Part of the mob reentered the administration building and prevented two deans from leaving.

Who Were the Rebels?

Students have both adults’ and children’s roles. They are able to vote and to marry, and some do. But most depend on their parents for a substantial portion of their living, and many are still in the process of deciding what to do with their lives. Physically they are at their prime, often with more energy than there are outlets into which to release it; and the pleasant climate typical of Berkeley invited them to find an outdoor setting for whatever extracurricular activities they chose. In short, students are often rebels in search of a cause, which may be serious or, if none such is available, farcical. The events

at Berkeley had both elements, but most journalists accepted uncritically the students’ whoops and screams as a call for the reform not only of higher education but of American society.

In the usual account of this turmoil, the media reported it as a confrontation between “the students” and “the administration.” In fact, many of the demonstrators were not students but some of the so-called street people who lived in Berkeley; and, on the other hand, many students were at least initially opposed to the antics of the radicals and their supporters. Of the more than 27,000 students at Berkeley, some 200 to 300 incited the rebellion.

The so-called Free Speech Movement, as even the dullest student must have recognized, had nothing to do with free speech. One of the few subjects not at issue was the right of students to speak on campus in support or denunciation of anything whatever. Some months before the rioting began, a representative of a leftist student group asked me to speak at a gathering at which smoking marijuana was to be recommended as a healthful practice. I declined, but the meeting took place, on campus, without me. The intended joke that FSM stood for “Free Sex Movement” fell flat, for free sex was already not only advocated but practiced. A bulletin board on campus listing apartments for rent included several items that shocked an old fogy like me: several male students sharing an apartment offered rent-free rooming to a female in exchange for servicing all of them. Similarly, the supposedly ironic “Filthy Speech Movement” was an accurate description of many of its participants. In a university lecture hall a self-proclaimed anarchist gave detailed advice on how to avoid the draft by mimicking various illnesses; a self-proclaimed Communist used University facilities not only to denounce the war in Vietnam but—illegally—to collect money for the Vietcong. A student, using the University amplifier, described in explicit detail his participation in a mass homosexual-heterosexual orgy, a liberating experience that he recommended to his fellows. All these happenings occurred before the administration ruled merely about the place of such meetings, the issue that ostensibly provoked the demand for “free speech.”

Preposterous as it may sound, the issue was power. One of the radical leaders, Bettina Aptheker (daughter of a national leader of the Communist Party, an offspring quite immune to any conflict of generations), gave the motivation quite clearly: “We want to be able to bargain collectively with the Regents and say, ‘Baby, you give in or we strike.’” A favorite aphorism of Lenin, which he borrowed from Napoleon’s famous word of advice to politicians: “On s’engage, et puis on voit,” was roughly translated on one of the radicals’ picket signs as “Strike now, analyze later.” The issue eventually became a very different one from free speech: Do college students have the right to determine, at their own discretion and without sanctions from a tax-supported institution, whether they will obey


any particular law? This was not a proposition that the general body of the students would answer positively—unless they were induced to do so by the administration and faculty.

The Berkeley Faculty

It was my impression that the vast majority of undergraduates were willing to follow the lead of their professors. Consider a freshman, with whom I spoke at length. Torn between antipathy to the FSM and sympathy with her zealous acquaintances, in the end she decided not to join the three-day strike that the leftists had called. Her English instructor, however, dismissed his class. Her professor of psychology, after giving a speech on academic freedom so platitudinous that every local television station quoted it for the rest of the day, dismissed his class. Some of the courses taught by teaching assistants, who were fearful of sanctions if they broke the law prohibiting strikes by state employees, had their classes dismissed by tenured faculty, who thus conspired with their graduate students on a maneuver to break the law with impunity.

In a decisive step on the road to near-anarchy, the faculty Senate passed a resolution, by a vote of 824 to 115, not only bowing to the FSM’s demands but, in effect, pledging to fight on the rioters’ side against the Board of Regents. After this, opposition to the FSM among the general body of students collapsed. Philip Selznick, a professor of sociology, noted with approval that “over the years a number of faculty members were actively interested in liberalized rules and, in cooperation with student leaders and President Kerr, helped to bring them about.” In a statement allegedly voicing the


views of a majority of Berkeley professors, he proudly pointed to the faculty’s role in the long quasi-conspiratorial preparation for the culmination in the 1960s.

I had been in the habit of having lunch once a week with a young professor in the English department. During this hectic week, he proposed that we go not to our usual place but to a restaurant a good distance from campus—a suggestion that I was happy to accept, for I thought he merely wanted to distance himself from the incessant disorder. The next week, however, he told me that his wife had insisted that we discontinue seeing each other, for I was becoming known as an opponent of the radicals. “After all,” he apologized, “we have two kids, and I don’t have tenure yet.” He was a bit ashamed, and I was saddened by his cowardice—unfortunately, all too common a reaction among nontenured faculty.

Why should the faculty, by more than seven to one, have endorsed the demagogic ultimata of a few hundred students and their allies among the street people? Some professors may have been confused, for in the welter of rumors and disinformation they might have believed that the students were actually being deprived of their legitimate rights. Many in the faculty, learnedly citing precedents from medieval times, seconded the FSM’s claim that the Berkeley police had no right to come on campus, a sanctuary from which law-enforcement officers were barred. In support of such nonsense, we were fed sizable quotations from Milton’s Areopagitica and Thoreau’s On Civil Disobedience, but nothing from Lenin or Trotsky, who were far more relevant in any interpretation of the zealots’ behavior. Some of the faculty had flirted with Communism in the 1930s or supported


Wallace in the 1940s, and their middle-aged blood coursed faster in this reenactment of the struggle of Good against Evil. Remarkably, some faculty supported the student rioters because they were demonstrating against the University administration, which was often at odds with professors. Several recalled that the faculty Senate had voted against a parking fee, which nevertheless the chancellor’s office installed. On the other hand, some faculty members were acting in support of the administration, thus following—in their quite reasonable interpretation—Kerr’s implicit appeal to yield to all demands, no matter how ridiculous.

The University Administration

Apart from the stance of the student body or of the faculty, the control of the campus was unarguably the responsibility of the administration. If at any point in the developing turmoil President Kerr had used his legitimate authority to restore order, events would have flowed from this act, rather than from his vacillation and incompetence. As president of the university, Kerr had a decisive influence on all the persons that eventually were involved: the governor of the state, members of the California legislature, prominent alumni, journalists by the score, many private citizens, and hundreds of policemen and state troopers.

The University had undergone an important change in 1958, when Clark Kerr was promoted from chancellor of the Berkeley campus to president of the statewide institution. As one of many professors who were trying, however unsuccessfully, to contain the rebellion, I was one of the members of the faculty called


several times to Kerr’s office to discuss any proposals we might have. My main emphasis was that no student, any more than any other citizen, has a right to break the law, and I was dismayed by the flabby responses I got. Kerr told me he was a Quaker and his fundamental beliefs induced him to meet the rebelling students halfway. Indeed, as he accommodated to the pseudo-issues in the Berkeley subversion, his recurrent passivity, compromises, and waffling encouraged its excesses. All too often, his successive directives on permissible student activities were rejected, then amended several times, and finally not enforced.

In the evening of the second day of rioting a number of the FSM leaders met with Kerr and Chancellor Strong and exited with a statement by which these administrators admitted that Savio and his comrades had won. Several of the issues in dispute were to be decided later by a specially appointed committee, but in its final clause the agreement gave way to the radicals’ principal demand: “The President of the University,” Savio proclaimed, “has already declared his willingness to support deeding certain University property...to the City or to the [student organization].”

To celebrate their victory, some 3,000 persons held a “peaceful mass pilgrimage-demonstration” across the street from the building in which the University’s Board of Regents was meeting. With this encouragement, the Board accepted Kerr’s proposal that the “sole and total penalty” of six students be suspension to date, and that the two ringleaders, Mario Savio and Art Goldberg, be placed on probation for the rest of the current semester. The Regents held, in addition, that all students and student organizations must “obey the laws of the State and


community”; and to protest this outrageous mandate the FSM organized another sit-in of the administration building. After long deliberations the faculty committee recommended that the activities heretofore prohibited on campus be permitted; Savio issued a statement to announce that “it is gratifying” that the students’ demands were supported by faculty representatives.

Chancellor Strong set up the Study Committee called for in Kerr’s pact with the students. When its proposed membership was announced, with two persons to be selected by the demonstrators, the radicals threatened further illegal activities unless they were given a bigger role. A new committee was instituted, with six additional members representing the faculty Senate, the University president, and the Free Speech Movement. The local branch of the American Civil Liberties Union threatened to take the University to court if the students’ “political rights” were not granted.

Berkeley as Prototype

Students acquired much of the power they sought, not only at Berkeley but eventually through much of American higher education. The calamitous decline in standards, the absurd extensions of the curriculum to frivolous nonsense, the dismay of scholars over the lack of control of their profession—these disasters started as reverberations from the Berkeley turmoil.

Like the Communist fronts of the 1930s, the core of the FSM occasionally used legitimate issues to manipulate a larger mass. Almost every commentator repeated the mantra that “only” a minority of the students were radicals. True, but what minority? Those directing the demonstrations did not bother to hide their associations


with the Young Socialist Alliance, the official Trotskyist youth group; the Independent Socialist Club, organized when a Trotskyist employee of the university split off the left wing of the democratic socialists; the DuBois Club, a half-disguised youth group of the Communist Party. Other leftist organizations included the Campus Women for Peace and the Student Committee for Travel to Cuba. Three were more heterogeneous: Slate, the oldest extremist coalition on campus; CORE and SNCC, nationwide civil-rights movements with little central direction, which on the Berkeley campus were more “progressive” than the national bodies.

Politicization of the campus did not cease with the end of the crisis. Some months after peace was reestablished, I happened to be the outside member at an examination of a doctoral candidate in geography. In this routine assignment, the role of the professor from another department is not to pose questions relating to an unfamiliar discipline but to see that the procedure follows elementary rules of fairness—ordinarily not an onerous task. In this case, I was at first puzzled and then disturbed by the manifest hostility of two members of the examining committee, and during a lunch break I sought out the candidate and tried to find out what was going on. It turned out that he was an A student and a well regarded teaching assistant, but that during the strike he had refused to dismiss the class he was teaching. In retaliation, some members of the department would have destroyed his career but for the accident that I happened to be there to watch them. As it was, I had to fight to institute minimum equity, and then—after I made it clear that I would take the issue to the graduate dean and if necessary to the news media—


the harassed young man was passed without further questioning.

One aftermath of the revolting students at Berkeley was, in a local perspective, a decline in academic standards at what had been arguably the best public university in the United States. A general desire of students is that they be permitted to live an easier life, and once they had been granted greater power, some of them used it to remove bothersome hurdles in both undergraduate and graduate requirements. In several instances that I knew about, it took no more than a one-hour sit-in, or even merely the threat of one, to delete this or that prerequisite or guideline.

I was personally involved in one such incident. The Sociology Department assigned me to supervise the doctoral candidates who were to take a test, then required, to demonstrate their competence in a foreign language. Using a bilingual dictionary, students had to translate into English a passage of some 300 words in one hour. Anyone with a nodding acquaintance with, in this case, French could hardly fail. Nevertheless, I found it impossible to pass about half of those taking the examination. A representative of the graduate students protested to the department chairman, who accepted the arguments that the test was meaningless (indeed!), that all knowledge is available in English, and that as long as this archaic obligation remained in force all students should be passed as a matter of course. Without even consulting me, the chairman posted the grades, with everyone passing.

In a book by Stanley Aronowitz, who introduced himself as “a political activist of the New Left of the 1960s and 1970s,” he cited the Berkeley radicals as prototypes of the student movement he would like to see return to American academia:

Students [at Berkeley] were not only legally disenfranchised [stet] of political speech and assembly, but were unable to control the conditions of their own education. The curriculum was in the hands of the professors and the administration [!] By the university rules, students were powerless to thwart the technocratic intention of the administration and the authoritarian classroom dominated by the professoriate. Consequently, from the demand for free political speech about questions such as war and peace and civil rights for blacks, the movement spread rapidly to considerations of whether students were moral subjects, whether they could have a significant voice in determining their own education—its curricula, pedagogy, modes of evaluation.

That a significant deterioration of American culture started in the 1960s has become a commonplace, as has the recognition that student riots were an important catalyst. It is less well known how important an impetus came from a single campus. As David Horowitz remarked in his 1997 memoir, Radical Son, of the young men and women who initiated the “New Left” with the Port Huron Statement of 1962, “the crucible of the movement was Berkeley itself.” The principal architect of the Statement was Berkeley’s Tom Hayden (now a member of the California legislature and a former husband of the notorious Jane Fonda), “an angry man who seemed in perpetual search of enemies.”

On November 6, 1996, when Mario Savio died, there was of course a memorial service on the Berkeley campus, a regathering of some of the now middle-aged ex-revolutionaries who assembled to relive their adolescent fantasies. The reaction of the nation’s media was astonishing: typically they saw Savio’s demise as the passing of a hero. In the Book Review of the New York Times (December 15, 1996)—of all places—the editor


of the Threepenny Review was afforded a full page to pass on her near-worship of a 1960s demagogue. She had met him only once, and he was not a writer (“he never published a book”), but he spoke “with passion and imagination”:

He brought innocence to the world—a pure, ingenuous, trusting sense of righteousness and compassion—that the world was ill equipped to handle....He did not set out to be a martyr, certainly, but he was willing to lose in the name of justice; there were more important things to him than winning.

In 2000, Berkeley alumnus Stephen Silberstein gave a grant of $3.5 million to memorialize the campus movement of the 1960s. The gift was used to fund an on-line archive of movement memorabilia and also a Free Speech Movement Café, with a quotation from its hero on the wall: “Put your bodies upon the gears and upon the wheels” of the establishment. Berkeley student Cesar Cruz, who worked there, was quoted as hoping that the café will fill “the political vacuum between the administration of today and the students of today. That would be great.” Savio and his comrades won, not only at Berkeley but over a much larger arena. The massive change in American society began on campuses and then spread to other centers of restlessness, and the metastasis eventually reached the farthest corners. In the New Republic (August 1, 1997), Michael Kelly (then the editor but since fired for expressing views like the one quoted below) used an outrageous front-page story in the New York Times to make the point:

Revolting Berkeley Students

75

When the subject of the left is raised these days, the general liberal response is to assert that the left scarcely exists....The left is not nowhere, it is, in a bland and vague way, everywhere. As the aging sandalistas have accrued power and raised children, their values have become the values of the age.... What was once radical is now normal. What was once left is now establishmentarian center.

What was once left became the norm in Clintonite Washington—which was also embellished with a personal crudity that went beyond even Berkeley’s precedent.


11
From the History of English to Current Usage

A living language expresses the fact that it is alive by undergoing constant change. Those who love the English language and try to protect it against barbaric inroads are usually prone to object to all changes whatsoever. It has usually been the herd of English speakers who initiate changes, while the better educated fight to maintain every detail of tradition. Mere defense against innovation is seldom successful and always foolish. One must accept or reject intrusions on the basis of appropriate criteria, but few guardians of established standards have shown any concern about whether modifications have rendered the text laudably simpler, or freshly original, or valuable in any other manner. Classical scholars have perennially objected to a usage in English because, for one who knows Greek or Latin (or French or Urdu), the derivation is false. Indeed, what should one do with the foreign words that

keep washing ashore? Speakers of English have generally welcomed them as additions to the native stock, and the newcomers should also be granted prompt citizenship. “The slowness with which the naturalization of the words [the various forms in Britain of “naive”] has proceeded,” as Fowler remarks in Modern English Usage, “is curious and regrettable.” He recommends following the American usage, “naive” spelled without a dieresis, and relegating alternative forms to a supplementary discussion. As another example, according to a supposition that the Oxford English Dictionary quotes, the word “restaurant” was coined in Paris in 1765, and the first citation in English is dated 1827. Yet both that work and the American Heritage Dictionary give an approximation of the French nasal as an alternative pronunciation, which indeed the pretentious often prefer. “Cafe” dates in English from 1816, but the OED still terms it “strictly a French term,” which is spelled “café” both there and as the preferred alternative in Heritage. Plural forms taken over from other languages usually eventually disappear, but often only after a long struggle. Like almost everyone else, I am quite comfortable with “agenda,” a singular noun with a plural (“agendas”) that once would have left a purist sleepless. But I find it impossible still to accept “a data,” with datum sent to join agendum in Never-Never Land. Such preferences are not made by logic alone. Words enter standard English not only from other languages but also from the jargons of technical fields. If these are accepted at all, eventually they lose any feeling of foreignness. Most educated persons accept “hectic” in its lay sense, ignoring its origin as a medical term. “By and large” started as a term in seamanship, but Heritage has no reference to that source.


The Perils of Sociologese

Perhaps the largest and often the most objectionable transfer is now from several of the social disciplines. Works on these subjects are typically written in a strange language, related to standard English but with its own vocabulary, style, and—sometimes—grammar. The reason for this arcane jargon, we are told, is that every layman thinks he knows all about the topics that the professionals analyze. Almost everyone is born into a family, and most persons form another one when they reach maturity. How better can a family sociologist distinguish his distinctive knowledge than by using a private form of communication? As the philosopher Charles Peirce once wrote, “Few persons care to study logic, because everybody conceives himself to be proficient enough in the art of reasoning already.” In either instance, the rationale is nonsense. The main reason that sociologese exists is that most who use it cannot write standard English. The leading American sociologist when I was a graduate student was a professor at Harvard, Talcott Parsons, whose ponderous works took ponderous hours to decipher. When C. Wright Mills took the trouble to translate a considerable passage into the common language, what had been almost impenetrable was converted into utter banality. However current English is assembled, the product is often regarded as a universal language. The notion that all ideas are now available in our own native tongue and that therefore no anglophone need take the trouble to learn any other language was given an outlandish twist at the 1994 United Nations population conference in Cairo. The 113-page Program of Action, as written by the feminist organizers, was a minefield of Americanisms, for which equivalent expressions in the conference’s five other official languages neither existed nor could be successfully coined. As Peter Waldman noted in the Wall Street Journal (September 13, 1994):

“Family leave” has stumped nearly everybody. The Arabic translation describes spouses taking leave of each other after a birth. The Russian draft has the whole family going on vacation. “We’ve translated ‘family leave’ literally,” explains a Chinese translator, “but nobody reading this will know what it means.”

While I was in high school, the policy of requiring Latin in the precollegiate curriculum was abandoned. When I was a student at Columbia, it was decided not to require undergraduates to learn a foreign language. Today professors in the various social disciplines write learnedly of this or that society without the ability to read a single word of its literature, newspapers, or government documents—and, not infrequently, with less than a thorough command even of English. While I was at Berkeley as a professor, each time that the university announced a list of new PhDs, news reports would quote a few excerpts from the incomprehensible and often semiliterate summaries of doctoral dissertations. To avoid this periodic embarrassment, my wife Renee was hired as an administrator in the graduate division to correct the English used in these abstracts, all of which had been written by successful candidates for the advanced degree and accepted by their faculty sponsors. Renee was an immigrant whose formal training in the language had been limited to one high school course in Cuba, and yet she knew more about how to write English than many of the Berkeley faculty.


From Earlier English to Current Usage

Many linguists are opposed to any prescription: whatever usage a native speaker comes up with is a legitimate part of an evolving language. Others are pedants, using their knowledge to warn against a change in a word’s meaning not congruent with its source. I find neither position defensible. In order to judge how reasonable the present rules of usage are, one should have some knowledge of the past trend. The historical record of Western languages (except for a few anomalies such as Lithuanian and Icelandic) marks a long development from the complex to the simpler. Scholars have used patterns in the vocabularies and grammars of the Indo-European family to construct a hypothetical forerunner (called Proto-Indo-European or, in older works, Aryan), which they hypothesize was spoken before 2000 BCE somewhere north of the Himalayas. This conjectural language had three genders of nouns, pronouns, and adjectives; six cases of nouns; and a large assortment of other complexities. The languages that evolved from this base typically strode along the same path. Since the main stress was almost never on a word’s last syllable, inflectional endings became blurred and thus hard to distinguish. Precise communication was being lost. The evolution from Anglo-Saxon through Middle English to modern speech exemplifies the process. In Anglo-Saxon, the word order of a sentence varied greatly, with the verb (as in modern German) often coming at the end of a clause. To read the old texts one must learn not only the vocabulary but also the different forms of verbs, nouns, pronouns, adjectives, and articles. It was only in the twelfth century, when the present norm of subject-verb-object was established, that position in a sentence largely replaced the earlier inflections. As custodians of good usage, Heritage assembled 173 “writers, critics, and scholars” into a Usage Panel, which was asked to give opinions on what is suitable in current American speech. A panel was first polled in 1964, then in 1969 and 1992 for the dictionary’s second and third editions. Innovative conventions at the earlier dates, if they survived, generally became more acceptable as time passed. Sometimes no standard evolved; the panel split 50-50, for instance, on which syllable of “harass” is to be stressed. On occasion the dictionary’s editors, while noting the panel’s decision, made a point of disagreeing with it. For example, though a “large majority” condemned “hopefully” in the sentence, “Hopefully the measure will be adopted,” the editors judged the usage to be “justified by analogy to the unexceptional uses of many other adverbs, as in ‘Mercifully, the play was brief’ or ‘Frankly, I have no use for your friend.’” One might anticipate many disagreements between the Heritage panel and Fowler, reflecting differences between a generation or two ago and now, between one scholar and a committee, between usages in Britain and the United States. In Fowler’s sizable essay on Americanisms, he shows no trace of the nationalist self-regard common on both sides of the Atlantic, preferring “lift” to “elevator,” for example, but “fall” to “autumn.”

There has been a trend over many decades to convert the suffix -our to -or. Fowler notes that the more rapid pace of this mutation in American speech “has probably retarded rather than quickened English progress in the same direction,” for many Britons try to maintain or even to broaden the differences of their speech from “American.” The main function of speech is to communicate, and anything that assists in this is good. The trend toward redefining “disinterested” as a synonym of “uninterested” should be condemned, for an important distinction would be lost. “Dilemma” is not a synonym of “problem,” which does not refer to a choice between alternatives. And so on. Whenever precision in speech can be sharpened, one should favor that shift, and one should never accept a change that substitutes sloppiness for rigor. What is called for in writing may often be dispensed with in speech, where stress on particular words helps in conveying the intended meaning. Thus, the tendency of “only” to wander about a spoken sentence one may accept, since the word the “only” modifies can be indicated by stress; but when writing one should place it close to that word. Virtually without exception, examples of “back formation,” or the practice of coining a short word from an actual longer one, have been soundly rejected and then, after a period, fully accepted. Fowler cites such examples as “diagnose” (from “diagnosis”), “drowse” (from “drowsy”), “resurrect” (from “resurrection”). The reasons the innovations attain legitimacy are that the new forms are generally useful, filling a vacuum in the vocabulary, and that the process seems to fit the way the language naturally grows. But for variable periods one is admonished to avoid “enthuse” (from “enthusiasm”), “burgle” (from “burglar”): Heritage notes that 76 percent of its panel once rejected the first of these, but the second is listed without comment.


Many of the words we use we have never learned. We know that in general the past of a verb is indicated by adding -ed to the present, that the plural of a noun is formed by adding -s to the singular, and we follow these rules without thinking about them. Until children learn about the few exceptions, they are likely to say “sleeped” or “foots.” Similarly, when the meaning of a noun with an irregular plural is extended, analogy may conquer idiom; thus, the animals are called “mice” but the same singular used to denote a control device for the computer seemingly has two plurals, “mice” and “mouses.”

Feminists have demanded, and often gotten, radically creative usages. The duly obsequious male may restructure the sentence in order to avoid having to make a choice between the once correct and the now proper versions. The once routine “Everyone should know his place” has given way to the clumsy “his or her” or to the formerly incorrect “their” place. The routine salutation in business letters, “Dear Sir,” became questionable. On the other hand, the specification of the person’s sex in words ending in -ess has been condemned, though for some reason not in “hostess.” “Man,” meaning a person, though seemingly allowable both by the history of the word and long usage, has become illegitimate. The suffix -man, used to designate roles once restricted to men who have now been joined by women, has been ousted in favor of such freaks as “spokesperson” or, worse, “chair.”

Directives to Pupils

Schoolteachers who spend their lives obsessively correcting this or that “error” seldom try to justify the “correct” forms with anything more compelling than decorum. Why is it that so much of the fuss about proper speech concerns not such values as comprehensibility or beauty or wit, but rather a few rules of dubious value that sometimes are of counterfeit coinage as well? The instruction implicit in the works of the best current writers has seldom penetrated public education, where whatever attention is given to the English language is typically concentrated on a few outdated rules.

The Split Infinitive. In Old English the infinitive “to speak” was sprecan (cognate with modern German sprechen). Subsequently the suffix -(i)an was gradually replaced by “to,” and from its first use writers routinely inserted a word or phrase between this indicator and the following verb. It was only in the nineteenth century that guardians of our language undertook the campaign against “the split infinitive”: as in Latin, so also in English, the infinitive must be regarded as a unit.

Double Negatives. As in current usage in some other languages, double or even triple negation was once used in English to denote emphasis. That two negatives make a positive is true in mathematics, but not in Middle English or early modern English: still in Shakespeare’s day such constructions as “I cannot go no further” were common. Though he upholds the present norm, of course, Fowler is also appreciative of the survival in vulgar speech of the older standard: “Everyone knew what Mr. Dombey’s butler meant when he said that he hoped that he might ‘never hear of no foreigner never boning nothing out of no traveling chariot.’” To get that much vehemence out of a sentence following the current regulations would be impossible. As a consequence of the campaign against a double No, those anxious to follow the rules have sometimes dispensed with both of the negatives: the comment “I couldn’t care less” has been transmogrified into the ridiculous “I could care less.”

Position of Prepositions. Another rule is, as I would prefer to put it, “A preposition should never be used to end a sentence up with.” Like other decrees, this one was introduced as late as the seventeenth century in another adaptation of English to Latin grammar.

Problems with Cases. How does one translate “C’est moi” into English? For a conventional person, it is impossible: “It’s me” is forbidden, and “It’s I” is asinine. Case is indicated in today’s English in only a few instances, principally in personal and relative pronouns. Two relative pronouns, “which” and “that,” live comfortably without a differentiation by case, but “who” still has a rather jaded “whom” tagging along. The trend in the language has been to dispense with the latter, and injunctions against a change in rules have seemingly failed to maintain “whom” except when it is ungrammatical. A phrase like “whom he believes is” is commoner than its correct alternative.

Closure

The sensible rules of English speech can be summed up in a few guidelines: Seek clarity in communication and simplicity in construction. Admit that all living languages undergo perpetual change, and choose between innovations by the same standards that govern acceptance or rejection in general. Know that good writing is not derived from pedantry, but from a careful search for lucidity adorned, when appropriate, with wit and freshness. Ignore much of what you learned in school.

12
Too Much of a Good Thing

Some time in the Middle Ages, somewhere in Europe, someone concocted a dreamland overflowing with food and drink, which even those who refuse to work could consume without limit. Accounts of this Eden are in German, French, Spanish, Italian, and Dutch as well as English. Since accounts of this wonderland of feasts spread by oral repetition or, if in writing, at a time when orthography was in flux, in all of these languages its name has sundry spellings. The usual spelling in English is either Cocagne (as in French) or Cockaigne. According to the American Heritage Dictionary, the word cockaigne derived via French from the Middle Dutch kokenje, little cookie. However, in a book specifically on this home of gourmands, Herman Pleij’s Dreaming of Cockaigne (2001), this etymology is dismissed as “rather far-fetched.” There we are told that “it seems likely that the origins of this name lie in the sounds and associations produced by French or Provençal words having to do with cooking and a special kind of honey cake (cocanha).”

We know, of course, very little about what peasants in the real medieval world ate. As depicted by urbanites (our almost exclusive source), the typical peasant ate turnips, garlic, onions, and bread—a nutritious diet that tended, however, to produce flatulence. Such other sources of information as songs and paintings suggest that there was an abundance also of sausages, meat, poultry, and various drinks. Cockaigne, thus, was either a contrast with relative penury or, on the contrary, an extrapolation of good times to satiation-plus.

America the Superabundant

With its profusion and variety of food, the United States has transformed this dreamland into reality, but the overflow of delectations exemplifies the traditional warning against having one’s dearest hopes realized. In December

2001, the Surgeon General issued a report that defined 61 percent of adult Americans as overweight. One adult American in three is obese,

and the prevalence has been rising. Among those aged 50 or over, it is estimated that obesity doubled from 1982 to 1999. According to an article in Pediatrics (1998), over the prior thirty years the number of children aged 6-17 classified as obese more than doubled, and pediatricians now treat obesity more frequently than any other disease. By the turn of the century some insurance companies had begun to pay for drugs used to counteract morbid obesity—in spite of the fact that those presently available are not reliably effective and generally have unpleasant or even dangerous side-effects. Since 1985 obesity has been recognized as a chronic disease and a major risk factor for cardiovascular disease, hypertension, diabetes mellitus, and some cancers. The branch of medicine that deals with the causes, prevention, and treatment of obesity is called bariatrics (from Greek baros, weight), and in 1950 specialists in this branch of medicine instituted the American Society of Bariatric Physicians. Its members undergo special training that enables them to prescribe diets, exercise programs, and changes in life style, supplemented when appropriate with anti-obesity medication or surgery. Paradoxically, the considerable undertaking by federal agencies to induce Americans to lose weight is countered by the programs of other federal agencies to feed the nonhungry. Several decades ago, a sector of the poor, particularly poor blacks, did not have enough to eat. Eventually the government responded with three new projects: Food Stamps, with which recipients could pay part of the cost of food; school lunches, furnished free to children who presumably got too little at home; and the food program of Women, Infants and Children (WIC). By the beginning of the new millennium, 20 million were given food stamps, school children were given 28 million hot lunches (as well as 8 million breakfasts), and pregnant or nursing women plus children up to age 4 were given enough rations to constitute, by themselves, an adequate diet. The cost of the three programs amounted to $31 billion a year. Such humanitarian programs, once instituted, are likely to be immortal. No member of Congress would dare propose that they be reduced, not to say abolished. This policy is continued in spite of the facts that the principal dietary problem is that Americans eat too much, and that the percentage of obese is 5 to 10 points higher among the poor than in the general population.


The Slimming Industry

Overweight persons typically try periodically to lose weight, sometimes succeed for a period, and then usually regain the pounds they had laboriously peeled off. Many men and especially women are on a regimen that they hope will melt away some of their poundage, and most of those who are not actually dieting are “watching” what they eat. With an immense and ever growing clientele, the slimming industry has grown as fat as its customers, with estimated annual sales of $30 billion. One indication of the industry’s mammoth dimensions is that, on line under “Diet Companies,” there are 744 items. One of these, Freeweightloss.com, offers access to many more through such programs as the Weight Directory, which lists “thousands of links” to products and services. The introductions to the benefits the companies sell offer “unique” and “customized” services, often with before-and-after photographs of satisfied clients. Many of the millions of clients of reducing firms are concerned by the threat to their health, but one can suppose that many others hope to realize the slim bodies that every medium, from clothing advertisements to movie stars, proclaims as the American standard of “good-looking.” To counter this fashionable norm, some fat women have tried to concoct a competing standard of feminine beauty. An International Size Acceptance Association

(ISAA) sponsors an international no-diet day, May 6th, which was started in London in 1992. Members are encouraged to smash their scales and to donate to charity all the clothing that no longer fits them. SizeNet provides “all the information you’ll ever need” about issues related to overweight. Members of another group on line, SeaFATtle (because it is located in the greater Seattle area), meet the fourth Saturday of every month at—where else?—Chang’s Mongolian Grill for lunch and fat-related undertakings. Other pro-fat groups emphasize not health but sexual attraction. Club Voluptuous is “Minnesota’s official big girl & guy headquarters.” More to Adore services the Detroit area, and Mid South Big and Beautiful is located in Memphis. A woman on line who identifies herself merely as “Betsy” offers videotapes of herself in the nude. She weighs over 500 pounds.

Become an Italian Peasant

The typical American diet provides not only too much food but nourishment of the wrong kind, not only much too much fat, for instance, but the kinds of fat that dietitians advise us to avoid. Some years ago one of the recurrent regimens du jour was called the Mediterranean diet, based essentially on the food that Italian peasants eat. If one consumes far less meat than the American norm and eats fish twice a week, we are told, the meals can be no less appetizing and much more healthful. If one bases one’s cuisine largely on plant foods, consumption will be of dishes rich in antioxidants, fiber, vitamins, minerals, and other components that help to ward off disease. The June 2002 issue of On Health, a publication of the Consumers Union, summarized several recent tests of the Mediterranean diet that all showed favorable results. In Italy, some 11,000 survivors of heart attacks were studied for three and a half years. Those who ate the most butter were almost three times as likely to die as those of the same age who substituted olive oil. In France, some 400 survivors of heart attacks who followed the diet for four years cut the risk of a second attack by 68 percent. During the last two years of the trial, two persons acquired cancer, compared with twelve cancer cases in the control group. In short, to lose weight permanently and safely, one must be reborn into a different pattern of subsistence. According to medical opinion, all of the quick-slimming programs are in one sense fraudulent, for a permanent loss of weight can be realized safely only through a change of life style. As in many other contexts, in order to remedy overweight there is no free lunch.

13
On the Cause of Death

Before discussing the reasons for variations in mortality, let us examine the full significance of both “cause” and “death,” for neither concept is as straightforward as it may seem. It is hard to resist interpreting the synchronous movements of two variables as a cause-effect relation. The very name of malaria, for instance, derives from mala aria, Italian for “foul air,” for the disease is endemic only where the air one breathes stinks. Moreover, when French officials had fetid swamps in Algeria drained, military deaths from malaria fell by 61 percent from 1846-48 to 1862-66, and presumably the unrecorded decline among civilians was of the same order. Such data seemingly verified the earlier false linkage between miasmas and the disease. In fact, the two are coupled by an intervening variable (in this case, mosquitoes, which breed in a bog and carry the malaria sporozoites). The pattern

Apparent cause > Intervening variable > Apparent effect

is very common, for the mammoth transformations that we have labeled economic development and modernization comprised millions of changes, any two of which are likely to be highly correlated.


One might say that a prime objective in an elementary course in biostatistics would be to warn students to be wary in their interpretations of high correlations. It is startling to find that the authors of one standard text instruct their readers that “the stronger the association between two categories of events,...the more likely it is that the association is causal.”1 It would be difficult to put the guideline less well. When I used to teach a course in elementary statistics, I devised several correlations to demonstrate to students why they must not interpret every such association as causal. For example, as the number of telegraph poles increased in Massachusetts, the state’s birth rate declined, for—in a Freudian interpretation hardly sillier than some that have been offered seriously—a symbolic phallus was being substituted for a real one.

Therapeutic Systems

Therapies with rather strange characteristics compete, especially in less developed countries, with modern Western medicine. As late as the 1950s two causes of death frequently cited in Ceylon were rather (literally, “redness”) and mandama (literally, “wasting”); Grahaniya, once the name of a female demon that killed children, was coded as a synonym of mandama.2 In most of Africa similar diagnoses have been incorporated, together with the associated therapies, into therapeutic systems that usually also include some elements of modern science. Take Chinese medicine as one prominent non-Western example. In a description that until recently was typical in the West, an American physician called it the “far-reaching and interlocking relationship of the practices of medicine with superstition and religious practices, magic, geomancy, physiognomy, necromancy, spiritism, demonology, fortune telling, etc.”3 In Communist China, health professionals practice a strange combination of these centuries-old treatments with what we would term scientific therapies, but it is not appropriate to automatically dismiss all of the non-Western elements in such a blend. Consider how one American physician judged acupuncture:

During the course of a series of professional travels to Chinese medical schools over a period of years, I became interested in acupuncture after seeing several demonstrations of its effectiveness as an anesthesia in major surgery....[One] team has produced a vast body of rather impressive experimental and clinical evidence.4

The mechanical manipulation of rotating needles brings about a considerable rise in the level of endorphins, the opiates that the body generates. Nor do industrial countries lack irrational or false traditional beliefs. The remarkable revival in the United States of astrology, witchcraft, and other manifestations of the occult has apparently flourished across lines of social class or education.5 The members of Christian Science are mainly of the middle class. Even though Wilhelm Reich’s variant of psychotherapy was so patently fraudulent that the Food and Drug Administration blocked the sale of his “orgone accumulator,” this obvious fakery was idolized by a sizable flock of well-to-do intellectuals. Moreover, when several of the possible causes of any death are concentrated in the same sector of society, they may be confused. Those obliged to live in squalor are generally malnourished and, by their way of life, are as likely to die from any of several death-dealing agents. Thus it was that not only the public but also the American Medical Association once regarded pellagra, caused by a lack of niacin in the diet, as an infectious disease, for it appeared most often among those lacking good sanitation.

Definitions of Death

Death is the end of life, but according to beliefs held in many cultures, some posthumous remnant of the human being survives and continues an existence in an afterlife. The heroes of ancient Greece went to the Elysian fields; the home of the gods in Norse mythology included a beautiful hall, Valhalla, the final abode of slain heroes; the heaven of Christians is the final dwelling of the virtuous, a reward either for a praiseworthy life or for God’s beneficence; the afterlife of virtuous Muslims is enlivened by such earthly delights as the purest wine and the flesh of fowls, as well as “dark-eyed houris, chaste as hidden pearls” (Sura 56:15-24 of the Koran). In many religions such heavens have a counterpart, called Hades by the ancient Greeks, Orcus or Dis by the Romans, Hell by Christians. The word hell meant originally simply the place of the dead; it is related to Hel, in Old Norse the name of the goddess of the dead. However attractively the afterlife is pictured, even those certain that they will go to heaven seldom welcome the first step, their own death. The suicide bombers in the Near East generally have to be intensively indoctrinated to bring them to their manic state, even though they know that after they die Saudi Arabia will compensate their families handsomely. Calvin taught that an auspicious life on earth is a sign that the person is favored by

God and is thus destined to end up in heaven, but even this guarantee has not induced upper-class members of Calvinist churches to embrace their decease. Belief in heaven and hell affects humans’ behavior, but in many instances only marginally. The traditional definition of human death given in Black's Law Dictionary, which has been frequently cited in American law cases, is “the cessation of life,...a total stoppage of the circulation of the blood and a cessation of the animal and vital functions consequent thereon, such as respiration, pulsation, etc.” With the many advances in medicine, that simple definition has become too simple, for it has become possible to keep patients “alive” but without some of the routine characteristics of a living being. In 1968, a committee of Harvard Medical School faculty proposed a new criterion of death—namely,

irreversible coma as inferred from a flat

electroencephalogram.® How the state designates “death” is not a trivial matter; it can partly determine such issues as, for instance, whether a death is a homicide or who shall inherit property. The revised definition was adopted in some jurisdictions in the United States but not others, so that whether a person was alive or dead depended in part on the location of the occurrence. In 1969-70, when Kansas passed a statutory definition based on the traditional whole-body criterion and then a second one on the new brain-oriented one, people there acquired two ways of dying. The proposal to define death differently set off an extensive debate in medical and legal journals. The many attempts to resolve the dilemma can be well exemplified by two papers at a symposium on “Problems on the Meaning of Death.” Robert S. Morison, a pro-

124

= Against the Stream

fessor at Cornell University, argued that both “life” and “death” are not events but rather processes. Using a specific criterion to define the series of changes in a person undergoing death, he held, is erroneous biologically, medically, legally, and morally. “We must shoulder the responsibility of deciding to act in such a way as to hasten the declining trajectories of some

lives,

while doing our best to slow down the decline of others.” Western values require that the decision be made for the benefit of the patient, but with an appropriate definition of death society will also gain from a more rational use of scarce medical resources.’ Leon R. Kass of the National Research Council insisted, on the contrary, that death be defined as an event

and not a process. For physicians and for society as a whole, what is important is the expiration of the person, not the decay of organs or cells. The criterion that defines death must be based not on social or moral grounds but on a physician’s judgment of medical symptoms, and the definition must be restricted to the condition of the dying person and not contaminated by the interests of relatives, potential transplant recipients, or society.® Variations on Old

Each species has its own normal longevity, and as an individual approaches its end its vital processes slow down. Persons approaching their death, however one specifies it, are generally viewed as having ambiguous attributes. Those advanced in years are often accorded deference or pseudo-deference, but they are also farther along the path to life's termination, which removes them from most contexts of collective life.

A Chinese folk tale, "Four and Twenty Paragons of Filial Piety," used to be recited to instruct children on how to pay honor to their elders. One Paragon slept uncovered, so that mosquitoes would concentrate on him and leave his parents undisturbed. A dutiful daughter clung to the jaws of a tiger, leaving her father free to escape unharmed. A man of 70 years dressed himself in baby's clothes and crawled about the floor, thus convincing his 90-year-old parents that they could not be very old.

Though hardly to this degree, traditional societies generally venerate their old people. One can suppose that the reason is that, from one generation to the next, the rules of life change but little, so that the gray-haired are best able to instruct others on the verities that also persist. Moreover, special homage to the old need not be especially burdensome, for in such societies those well along in years never comprise more than a tiny proportion of the population. Bernard Lewis remarks on how strange it would have sounded in the Middle East of the mid-1960s to use the word "young" in the name of a revolutionary organization:

The connotation of "young" was inexperienced and immature....All terms of respect mean old, senior....Both the Young Ottomans and their later successors, the Young Turks, avoided using the normal Turkish word for "young" in their nomenclature. The Young Ottomans called themselves Yeni, which literally means "new." The Young Turks called themselves Jöntürk, simply transliterating their French designation.⁹

In modern Western societies, with their rapid change in culture and technology and thus in appropriate behavior, the counsel of the old is typically out of date; and after they pass middle age most of them learn to hold their tongues. They are well aware that for their offspring the sum of their pensions constitutes an unwelcome responsibility; this is generally a major component of what is called the conflict of generations.

This dual view is reflected in the meanings given to the word "old." In addition to its primary definition, "having lived for a relatively long time," it can denote "former" (as in Old Style, of dates by the Julian calendar); "of long duration" (as in old-growth, of trees surviving from before the last cutting); "approbative" (as in the saying, "As old wood is best to burn, an old horse to ride, old books to read, and old wine to drink, so are old friends always most trusty to use"); "familiar" (as in old man for father); "deferential" (as in elder, one who bears an office in Christian churches); "condescending" (as in such variations on old as oldster and oldish, which are usually intended as a cosmetic); "adverse" (as in the many phrases that remind us of the infirmities associated with old age). Plato found old age "a dreary solitude," Terence "a disease," Seneca "an incurable disease," Augustine a "lengthened infirmity." A.K. is a euphemism for the Yiddish alte kacker, which in English is mellowed to Old Fart.

On Immunity and Placebos

If a bacterium invades a body and the person dies of the disease associated with the germ, such a sequence is the simplest example of the cause of death. But even in this instance, we must account for the complication that some who are infected die and others do not. That individuals vary dramatically in their reaction to infections is a relatively new datum. In 1952, a paper was

published concerning a young child who over the course of four years was admitted to a hospital nineteen times, suffering from one infection after another. Because of a defect in his immune system, he produced no antibodies, and when these were injected his torment ended. The fact thus established, that persons differ in their relative immunity to various diseases, makes it considerably more difficult to pin down the causes of death.

A more general complication is the apparent cause-effect interaction between mind and body. For example, several researchers have demonstrated that a patient who has been given what physicians call a dummy pill, but who believes that he has received a pain-relieving drug, has a physiological reaction: a rise in opiate concentration in the blood. A generation ago most discussions of this so-called placebo effect were to be found in medical journals, but by now a summary of a dozen or so such papers has appeared in the New York Times Magazine (January 9, 2000).

Today most physicians accept that the mind can affect bodily functions, for the evidence of mind-body interaction is by now well established. A principal factor that fosters the placebo effect seems to be the patient's faith in the therapeutic procedure or substance, and this is greatly affected by the authority of the medical personnel. Any advice or medicament, it has been shown in several studies, is more effective when administered by a physician than by a nurse. At the 104th annual meeting of the American Psychological Association, held in Toronto in 1996, two psychologists from the University of Connecticut reported on an analysis of thirty-nine studies of a total of 3,252 depressed patients. They concluded that a full half of the effect of antidepressant medication was based on the placebo effect.

If the placebo effect is a factor in most or all therapy, it is more difficult to distinguish between quackery and scientific medicine, between excellent and poor health institutions, between witchcraft or so-called "alternative" treatments and conventional medicine, between impersonal and manifestly caring physicians; for in all these pairs the truth is to some degree indeterminate. Before the nineteenth century effective drugs numbered only four: quinine for malaria, vaccination for smallpox, iron for anemia, and mercury for syphilis. Everything else was, by the standards of the present-day West, physically ineffective. Such a routine process as bleeding the patient at best did no harm; it certainly did no direct good. That mortality in Western countries gradually fell from 1760 to 1840, which is the consensus of historians who have written on the subject, was apparently the consequence mainly of improved nutrition and the feeble beginnings of public health, with the possible addendum of the placebo effect.

The Recording of Deaths

The ambiguities that encompass mortality obviously affect the way mortality is recorded. Sherwin Nuland, a physician who has written deservedly popular works on medical matters, remarked that there is in fact an "endless multitude" of reasons why life departs from each individual. Similarly, according to a discussion by demographers of the cause of death:

There is no satisfactory procedure for reducing the spurious specificity of diagnosis....Exact diagnosis...is very difficult for an increasing number of deaths in industrialized countries...because of the changing cause and age structure of death.¹⁰

Nuland notes, however, that one of the main causes of death is never mentioned:

In its obsessive tidiness, the Report [issued annually by the National Center for Health Statistics] assigns the specific clinical category of some fatal pathology to every octo- and nonagenarian in its neat columns....Everybody is required to die of a named entity....Everywhere in the world, it is illegal to die of old age.¹¹

Advanced countries require that every death be certified by a physician or, for deaths by violence, by a coroner. Even the standard form now used throughout the United States makes it apparent how tricky it is for a medical expert to specify why a person died. The pertinent section of the death certificate calls in Part I for (a), the "immediate cause [or] conditions, if any, which gave rise to the immediate cause"; then (b), "due to, or a consequence of"; followed by (c), "due to, or a consequence of." Then in Part II the physician is to record "other significant conditions [that is, those] contributing to death but not related to cause given in Part I (a)."

In recent years the ambiguous specification of the cause of death has encouraged trial lawyers to sue corporations or other wealthy institutions for damages. In the most notorious instance, successful suits have been brought against tobacco companies, allegedly responsible for smokers' deaths from lung cancer. When the death certificate of a Los Angeles resident labeled that city's smog a significant factor in the man's heart failure, the coroner refused to accept it, remarking, "Los Angeles is not a disease." Everyone lives in ways not recommended by physicians, nutritionists, and other presumed

experts, and if there is money to be had, one can almost always blame the consequences of this less than optimum behavior on one institution or another.

The data from death certificates throughout the fifty states are assembled at the National Center for Health Statistics. In 1968, the agency converted from the prior manual coding to what it terms the Automated Classification of Medical Entities (ACME), by which the underlying cause was entered by computerized coding and all other causes continued to be entered manually. Then, in 1990, automatic coding was introduced for all causes, and in 1993 an enhanced system that allows for the total entry of the multiple cause-of-death text as reported by the certifier. The accuracy of these entries, when measured by the proportion of deaths assigned to ill-defined symptoms, signs, and conditions, has since 1990 remained stable at slightly more than 1 percent.

As we have noted, death is caused not by one cause but by a number of factors. Typically, statistics on mortality from "immediate" causes differ considerably from rates based on "multiple" causes.¹² Even if we ignore other circumstances, compilations of "immediate" causes as entered by the attending doctor would seem to be a rather dubious base for a country's mortality data. The function of a physician is to cure, and if one of his patients dies he has in that case failed, possibly because he did not identify the illness correctly. Indeed, when diagnoses were checked by postmortem autopsies, this supposition was often confirmed. Half the cases of terminal pulmonary embolism had been wrongly diagnosed, as well as 53 out of 85 cases of abscess of the liver, a third of the cases of gastrointestinal hemorrhage, and 83 percent of the cases of syphilitic aortitis. The conclusion from a compilation of such data was that an "antemortem diagnosis is frequently refuted, clarified, modified, or elaborated by postmortem examination."¹³

Dr. John Prutting, who was president of the Foundation for the Advancement of Medical Knowledge, argued in a number of publications that all deaths should be followed by autopsies. However, many persons and some religions oppose what they regard as desecration, and it would require a very expensive increase in the number of pathologists to carry out such a program. Moreover, with the increasing life span, more and more very old people simply wear out, so that which of their organs first buckles is almost a matter of chance. In any case, some professionals concerned with mortality allege that Prutting's counsel is out of date. Diagnoses have become more accurate, and some physicians see autopsies as a redundant exercise in pathology.

The causes of death entered on a death certificate are supposed to follow the guidelines set in the International Statistical Classification of Diseases and Related Health Problems, published in three volumes by the World Health Organization in 1993-94. This was the tenth revision of a schedule that, with a frequently revised title, has evolved gradually over the past century and a half. The first version came out in 1853, when the First Statistical Congress commissioned two of its members to work up an improvement on the alphabetical list of causes of death then in general use. Volume 1 of the current edition, which runs to over 1,000 pages, gives the full report of the conference that recommended the revisions to be made, the regulations for nomenclature, definitions of the terms used, and a list of categories of disease and morbid conditions classified

by numbers of three or four digits. This is followed by an instruction manual in Volume 2 and a detailed index in Volume 3.

The International Classification is a valuable guide to the state of the world's health, but it also, like earlier editions, reflects compromises necessitated by the current state of medical knowledge, international differences in health care and record keeping, and the greater or lesser relevance, with respect to any category, of medicine, anatomy, or law. For example, the sum of deaths attributable to alcohol does not include accidents, homicides, and other causes indirectly related to alcohol use, even though in many years the number of deaths caused by drunks who drive automobiles is probably larger than the sum of all the categories that are listed.

Notes

1. Brian MacMahon and Thomas F. Pugh, Epidemiology: Principles and Methods (Boston: Little, Brown, 1970), 21.
2. Richard Padley, "Cause-of-Death Statements in Ceylon," Bulletin of the World Health Organization 20 (1959), 677-95.
3. William R. Morse, Chinese Medicine (AMS Press, 1938).
4. Sherwin B. Nuland, How We Die: Reflections on Life's Final Chapter (New York: Vintage, 1995), 132.
5. Marcello Truzzi, "The Occult Revival as Popular Culture," Sociological Quarterly 13 (1972), 16-36.
6. Ad Hoc Committee, "A Definition of Irreversible Coma," Journal of the American Medical Association 205 (1968), 85-88.
7. Robert S. Morison, "Death: Process or Event," Science 173 (1971), 694-98.
8. Leon R. Kass, "Death as an Event: A Commentary on Robert Morison," Science 173 (1971), 698-702.
9. Bernard Lewis, What Went Wrong? Western Impact and Middle Eastern Response (New York: Oxford University Press, 2002), 58.
10. Samuel H. Preston, Nathan Keyfitz, and Robert Schoen, Causes of Death: Life Tables for National Populations (New York: Seminar Press, 1972), 30-34.
11. Nuland, How We Die, 43.
12. Kenneth G. Manton and Eric Stallard, "Trends in U.S. Multiple Cause of Death Mortality Data: 1968 to 1977," Demography 19 (1982), 527-47.
13. John Prutting, "Lack of Correlation between Antemortem and Postmortem Diagnoses," N.Y. State Journal of Medicine 67 (1967), 2081-84.


Index

acupuncture, 121
Afghanistan, 24
Africa, 25
afterlife, 122
American Civil Liberties Union, 70
American Medical Association, 122
American Psychological Association, 127-128
American Sociological Review, 31-32
Antwerp, 36-37
Aptheker, Bettina, 65
Aronowitz, Stanley, 72-73
Artress, Lauren, 92-93
Ash Wednesday, 99
Ashworth, William, 13
Augustine, 126
autopsies, 130-131
baby boom, 20-21
bariatrics, 114-115
Basilides, 97
Belgium, xii, Chapter 5; bilingualism, 40-45; economy, 42-43
Bellamy, Edward, 7-10, 11
Berkeley, xiii, Chapter 7
Boyle, Greg, 80
Britain, xi-xii
Brookes, Alan A., 49
California, University of, Berkeley, Chapter 7; administration, 68-70; faculty, 65-68; as a prototype, 70-75; Sociology Department, 72
Calvin, John, 122-123
Cambridge Encyclopedia of the English Language, 48
Campus Women for Peace, 71
Canada, Chapter 6; brain drain, 55; future of, 57; languages, 48, 57-58; population, 47, 50-53; railroads, 49-50, 51; relations with the United States, 47-48, 50-53, 53-55, 57-59; taxation, 54
Canadians for Language Fairness, 56
cause, 119-120
Central African Republic, 26
Ceylon, 120
Chad, 25-26
Charles, Enid, 19
Chinese medicine, 120-121
Christian Science, 121
Christianity, xiii, 122; holidays, Chapter 10
Christmas, 95-96; dating of, 97-98
city planning, Chapter 2, 14-17
Club of Rome, 19
Cockaigne, 113-114
college students, xii
Columbia Archives, 16
Columbia, Maryland, 15-16
Columbus, 3
commuting, 14
Conquest, Robert, 26
Constantine, 97
Consumers Union, 117
CORE, 71
Cruz, Cesar, 74
Cultural Council for Flanders, 41-42
da Gama, Vasco, 96
Davies, Robertson, 50-51
death: cause of, xiii-xiv, Chapter 13; certificate, 128-129; definitions of, 122-124; multiple causes of, 130; recording of, 128-132; responsibility for, 129-130
Debs, Eugene Victor, 10
delinquency, Chapter 8
demographic universalism, 21-22
Dewey, John, 10
diets, 39, 116-118
Dilmun, 2
double negative, 111
Dreyfus, Alfred, 33
Drusus, xi
DuBois Club, 71
Durham Report, 48-49
Durkheim, Emile, xiii, Chapter 4
Dutch language, 37-39
earth warming, ix-x
Easter, 99-100
edens, 2, 122
Emerick, Kenneth Fred, 52-53
Engels, Friedrich, 5
English language, xiii, Chapter 11; "errors" in, 108-112; foreign sources of, 103-104; history of, 107-110
environmentalism, ix-x, 21-22
Equatorial Guinea, 26
Ethiopia, 26
ethnicity, xi-xii
feminism, 110
fertility: control of, x; excess, ix
Flemings, Flemish, Chapter 5; culture, 43; language, 37-39, 40-43; movement, 36-40; relations with the Netherlands, 45-46
Fohr, xi
Fonda, Jane, 73
Food and Agricultural Organization, 22-24
food: production, 22-24; shortages of, 24-27; stamps, 115
Fowler, H. W., 104, 108-109
France, 35
Fraser Institute, 54
Free Speech Movement, Chapter 7
French language, 40-4
Frisian(s), xi-xii
gangs, 79-80
Garden Cities, 10-13
Geddes, Patrick, 12
Genelin, Michael, 80
Geyl, Pieter, 45
Goldberg, Art, 69
Good Friday, 98-99
Hague, Frank, xii
Haiti, 24
Halbwachs, Maurice, 34
Hanukkah, 98
Harvard Medical School, 123
Haugen, Einar, 44-45
Hayden, Tom, 73
Hayek, Franz, 3
hell, 122
Henry VIII, 1
Heritage Dictionary, 108-109
Hölderlin, Friedrich, 3
holidays, xii
Horowitz, David, 73
Housing, Task Force on, 16
Howard, Ebenezer, 7, 10-13
Howells, William Dean, 10
"I am a Canadian," 58
immunity, 126-127
Independent Socialist Club, 71
India, 24-25
International Classification of Diseases and Related Health Problems, 131-132
Islam, 122
Jersey City, xii
Judaism, 34, 99-100
juvenile delinquency, Chapter 8
Kass, Leon R., 124, 132
Kelly, Michael, 74-75
Kerr, Clark, 66, 68-69
Keyfitz, Nathan, 52, 128-129, 132
Keys, Ancel, 21
Kilburn, William, 54
Kluft, P., 40
Lamberty, Max, 41
Lamont, Lansing, 56
language: data on, 40-41
Lent, 99
Letchworth, 12-13, 14
Lewis, Bernard, 125, 132
Lomborg, Bjorn, x
Looking Backward, 8-10, 11
Lower, Arthur, 49
Lupi, Antonio, 97
Maccabeus, Judas, 98
MacMahon, Brian, 120, 132
Malthus, Thomas Robert, 27
Manton, Kenneth G., 130, 133
Marx, Karl, 5, 29
Matthews, John, 97
McCarthy, Joseph, 54
McKie, Craig, 52
Meech Lake Accord, 56-57
Mesa Verde, 22
metropolitans, 13
Military Medicine, 32
Mills, C. Wright, 105
Milton, John, 67
Molson brewery, 58
Mombert, Paul, 19
More, Thomas, 1
Morison, Robert S., 123-124, 132
Morse, William R., 121, 132
Mumford, Lewis, 3, 12, 13
Murther & Walking Spirits, 50-51
Napoleon, 35, 65
Natal, 96
National Review, 57
Nationalist Party, 10
Negley, Glenn, 10
Nigeria, 25
Noble Savage, 2-3
Nuland, Sherwin B., 121, 129, 132, 133
nutrition, 21
obesity, xiv, 21, Chapter 12; by age, 114; in the United States, 114-115
old, meanings of, 124-126
OPEC, 25
"optimism," x-xi
Osburn, Frederic, 12
Owen, Robert, 4
Padley, Richard, 120, 132
Pakistan, 23
Paragons of filial piety, 125
Parsons, Talcott, 105
Passover, 99-100
Patrick, J. Max, 10
Pauwels, J. L., 37-38
Pearlstein, Steven, 58
Peirce, Charles, 105
pellagra, 122
Petersen, Renee, 106
Petersen, William: professor at Berkeley, 61, 68-69, 106; ethnic background of, xi-xii; point of view of, ix-xiv; on usage in English, 104
Pipes, Richard, 2
placebos, 126-127
planning, 4, 7; city, Chapter 2, 14-17
Plato, 126
policy, 3-4, 13
population: conferences, 105-106; forecasts, 3, 19; growth, ix, xii, 19; incipient decline of, 20; and sustenance, Chapter 3
Preston, Samuel H., 128-129, 132
professional jargon, 105-106
Prutting, John, 131, 133
Pugh, Thomas F., 120, 133
Raet, Lodewijk de, 36
Reich, Wilhelm, 121
Rodman, Peter, 57
Rogier, Charles, 36-37
Ross, Edward A., 20
Rouse Company, 15-16
Rouse, James, 15-16
rural, 14
Savio, Mario, 69-70, 73-74
Schoen, Robert, 128-129, 132
school lunches, 115
Selznick, Philip, 66
Seneca, 126
Shaw, George Bernard, 12
Silberstein, Stephen, 74
Simon, Julian, x
Sint-Niklaas, 37
Slate, 61-62, 71
slimming, 116-117
SNCC, 71
"social currents," 29-30
socialism, 4-5, 10
Society for American City and Regional Planning History, 15
sociologese, 105
sociology, 34
solstices, 97
Soviet Union, 23, 26
Spengler, Oswald, 20
split infinitive, 111
Stallard, Eric, 130, 133
Stanley, George F. G., 49
Steel, Duncan, 99
Student Committee for Travel to Cuba, 71
student riots, Chapters 7-8
suburbs, 13-17
suicide: by age, 32-33; in Europe, 30; by religion, 30-31, 33-34; by sex, 32; statistics, 31-33; in the United States, 32
Suicide, Chapter 4
Tahiti, 2-3
Terence, 126
therapeutic systems, 120-122
Thomas, Norman, 10
Thoreau, Henry, 67
Thrasher, Frederick, 79
Tombalbaye, Ngarta, 25-26
Truzzi, Marcello, 121, 132
Uganda, 27
United States: Agricultural Research Service, 27; Bureau of the Census, 21; Food and Drug Administration, 121; National Center for Health Statistics, 130; State Department, 55; status in the world, 57-59
Unwin, Raymond, 12
urban, 14
utopia, -nism, xii, Chapters 1-2; doctrine of, 2
Utopia, 1
van der Vorst, F., 40
Vietnam, 52-53, 65
Waldman, Peter, 106
Walloons, Wallonia, 37
Weaver, R. Kent, 56
Weber, Max, 29
Welwyn, 13, 14
Willem I, 35
Women, Infants, and Children
Woodcock, George, 55
Young Socialist Alliance, 71
Yule, 96
zoning, 17


Demography

Sociology
History

AGAINST THE STREAM
Reflections of an Unconventional Demographer
William Petersen

With the insight and clarity that mark all of Petersen’s writings, Against the Stream brings together reflections of an unconventional demographer. Thirteen essays on various topics become a cohesive unit by virtue of the author’s unique point of view, and the understanding of contemporary events he has gathered in his long mastery of demography is evident in this volume. In a brief introduction the author points out that the viewpoints he expresses in the volume are unorthodox. He covers a variety of topics. Chapter 1 examines utopian thought, which Petersen notes usually gets good press that, in his view, is undeserved. Chapter 2 discusses planned communities and suburbanization, beginning with two famous utopias presented in books by Edward Bellamy and Ebenezer Howard, which had significant influence on American and British societies. Chapter 3 analyzes the perennial topic of how the balance between people and their sustenance will evolve. Chapter 4 critically explores Durkheim’s analysis of suicide. Chapters 5 and 6 analyze the culture, language, and geographical positions of the individual countries of Belgium and Canada, providing a fresh outlook on these routine topics. Chapters 7 and 8 evaluate rebellious Berkeley students and adolescent student rebels in general as the juvenile delinquents that they often are. Chapter 9 discusses the anti-urban bias of the mainline American Churches. Chapter 10 traces the historical roots of Christian holidays, pointing out their significant links with prior religions. Chapter 11 critically examines the history of the English language as a guide to current usage. Chapters 12 and 13 survey two widely misunderstood demographic topics—the cause of death and obesity—and provide some stimulating new ideas. 
This latest work by a distinguished demographer is a tightly knit, compact volume, a compendium of thought written in a nontechnical manner and about various subjects that will both interest the general reader and offer a different perspective on their disciplines to demographers and sociologists.

About the Author

William Petersen, Robert Lazarus Professor of Social Demography Emeritus at The Ohio State University, is known throughout the profession as a leading demographer.

ISBN: 0-7658-0222-8
Library of Congress: 2003066168
Printed in the U.S.A.
Cover design by Jeremy Rech