Argumentation: Across the Lines of Disciplines [Reprint 2011 ed.] 9783110867718, 9783110132731



English Pages 411 [412] Year 1987


Table of contents :
Introduction
LOGICAL AND DIALECTICAL PERSPECTIVES
1 Probative Logic: Review & Preview
2 Logic to Some Purpose: Theses against the Deductive-Nomological Paradigm in the Science of Logic
3 Logic Naturalized: Recovering a Tradition
4 Beyond Induction and Deduction
5 Meaning Postulates and Rules of Argumentation: Remarks Concerning the Pragmatic Tie between Meaning (of Terms) and Truth (of Propositions)
6 Arab Dialecticians on Rational Discussion
RHETORICAL AND EPISTEMOLOGICAL PERSPECTIVES
7 An Historical Approach to the Study of Argumentation
8 Argument and Usable Traditions
9 The Rhetorical Perspective on Argument
10 Rhetorical Communication as Argumentation
11 Argumentation without Proposition
12 Generational Argument
13 Valuing Dissensus
14 Rescher on Pascal’s Wager Argument
15 Argumentation and Narrativity
PRAGMATIC AND CONVERSATIONAL PERSPECTIVES
16 The Function of Argumentation: A Pragmatic Approach
17 Argumentation, Inquiry and Speech Act Theory
18 For Reason’s Sake: Maximal Argumentative Analysis of Discourse
19 Rational and Pragmatic Aspects of Argument
20 The Management of Disagreement in Conversation
21 Identity Management in Argumentative Discourse
22 Measuring Argumentative Competence
23 Some Figures of Speech
ARGUMENTATION ANALYSIS, EVALUATION AND FALLACIES
24 Towards a Typology of Argumentative Schemes
25 Enthymematic Arguments
26 Argument Evaluation
27 Naess’s Dichotomy of Tenability and Relevance
28 Rescher’s Plausibility Thesis and the Justification of Arguments: A Critical Appraisal
29 What is a Fallacy?
30 Some Fallacies about Fallacies
31 Ad Baculum, Self-Interest and Pascal’s Wager
APPLICATIONS OF ARGUMENTATION THEORY
32 Rational Argumentation and Political Deception
33 Evolution of Judicial Argument in Free Expression Cases
34 Argumentation in English and Finnish Editorials
35 Critical Thinking in the Strong Sense and the Role of Argumentation in Everyday Life
36 Informal Logic and the Deductive-Inductive Distinction
37 Arguments and Explanations
List of Contributors

Argumentation: Across the Lines of Discipline

Studies of Argumentation in Pragmatics and Discourse Analysis (PDA)

This series contains reports on original research in both pragmatics and discourse analysis. Contributions from linguists, philosophers, logicians, cognitive psychologists, and researchers in speech communication are brought together to promote interdisciplinary research into a variety of topics in the study of language use. In this series several kinds of studies are presented under headings such as 'Argumentation', 'Conversation' and 'Interpretation'.

Editors: Frans H. van Eemeren and Rob Grootendorst, University of Amsterdam, Department of Speech Communication

Argumentation: Across the Lines of Discipline

Proceedings of the Conference on Argumentation 1986

Frans H. van Eemeren, Rob Grootendorst, J. Anthony Blair, Charles A. Willard (eds.)

1987 FORIS PUBLICATIONS Dordrecht-Holland/Providence-U.S.A.

Published by: Foris Publications Holland, P.O. Box 509, 3300 AM Dordrecht, The Netherlands. Sole distributor for the U.S.A. and Canada: Foris Publications USA, Inc., P.O. Box 5904, Providence RI 02903, U.S.A. CIP-DATA

ISBN 90 6765 321 7 (complete set)
ISBN 90 6765 256 3 (this volume)
ISBN 90 6765 319 5 (volume 3A)
ISBN 90 6765 320 9 (volume 3B)

© 1986 by the authors No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission from the copyright owner. Printed in The Netherlands by ICG Printing, Dordrecht.

In memory of Wayne Brockriede

Contents

Introduction 1

LOGICAL AND DIALECTICAL PERSPECTIVES

1 Michael Scriven, Probative Logic: Review & Preview 7
2 E.M. Barth, Logic to Some Purpose: Theses against the Deductive-Nomological Paradigm in the Science of Logic 33
3 Ralph H. Johnson, Logic Naturalized: Recovering a Tradition 47
4 Trudy Govier, Beyond Induction and Deduction 57
5 Kuno Lorenz, Meaning Postulates and Rules of Argumentation: Remarks Concerning the Pragmatic Tie between Meaning (of Terms) and Truth (of Propositions) 65
6 Abderrahmane Taha, Arab Dialecticians on Rational Discussion 73

RHETORICAL AND EPISTEMOLOGICAL PERSPECTIVES

7 Maurice A. Finocchiaro, An Historical Approach to the Study of Argumentation 81
8 J. Robert Cox, Argument and Usable Traditions 93
9 Joseph W. Wenzel, The Rhetorical Perspective on Argument 101
10 Hellmut Geissner, Rhetorical Communication as Argumentation 111
11 Michel Meyer, Argumentation without Proposition 121
12 G. Thomas Goodnight, Generational Argument 129
13 Charles Arthur Willard, Valuing Dissensus 145
14 Timo Airaksinen, Rescher on Pascal's Wager Argument 159
15 Herman Parret, Argumentation and Narrativity 165

PRAGMATIC AND CONVERSATIONAL PERSPECTIVES

16 Josef Kopperschmidt, The Function of Argumentation: A Pragmatic Approach 179
17 J. Anthony Blair, Argumentation, Inquiry and Speech Act Theory 189
18 Frans H. van Eemeren, For Reason's Sake: Maximal Argumentative Analysis of Discourse 201
19 Sally Jackson, Rational and Pragmatic Aspects of Argument 217
20 Scott Jacobs, The Management of Disagreement in Conversation 229
21 Susan L. Kline, Identity Management in Argumentative Discourse 241
22 Robert Trapp, Julie M. Yingling and James Wanner, Measuring Argumentative Competence 253
23 Robert J. Fogelin, Some Figures of Speech 263

ARGUMENTATION ANALYSIS, EVALUATION AND FALLACIES

24 Manfred Kienpointner, Towards a Typology of Argumentative Schemes 275
25 David Hitchcock, Enthymematic Arguments 289
26 William L. Benoit, Argument Evaluation 299
27 Erik C.W. Krabbe, Naess's Dichotomy of Tenability and Relevance 307
28 Raymie E. McKerrow, Rescher's Plausibility Thesis and the Justification of Arguments: A Critical Appraisal 317
29 Douglas N. Walton, What is a Fallacy? 323
30 Rob Grootendorst, Some Fallacies about Fallacies 331
31 John Woods, Ad Baculum, Self-Interest and Pascal's Wager 343

APPLICATIONS OF ARGUMENTATION THEORY

32 Evert Vedung, Rational Argumentation and Political Deception 353
33 Richard D. Rieke, Evolution of Judicial Argument in Free Expression Cases 365
34 Sonja Tirkkonen-Condit, Argumentation in English and Finnish Editorials 373
35 Richard W. Paul, Critical Thinking in the Strong Sense and the Role of Argumentation in Everyday Life 379
36 Perry Weddle, Informal Logic and the Deductive-Inductive Distinction 383
37 John Hoaglund, Arguments and Explanations 389

List of Contributors 395

Introduction

Frans H. van Eemeren, Rob Grootendorst, J. Anthony Blair and Charles A. Willard

A great many students of argumentation assembled in June 1986 in the Netherlands to attend the first International Conference on Argumentation of the University of Amsterdam. The Conference was called to cultivate the interdisciplinary study of argumentation and its applications. Its aim was to bring together argumentation scholars from around the world to listen to each other, to talk together, and in general to increase the exchange of ideas about argumentation. The three volumes in the series Studies of Argumentation constitute the record of its formal presentations.
The papers contained in these three volumes are certainly a mixed offering. They represent differences in disciplines, divergencies among research traditions, and cultural differences. By no means do they make up a unified body of knowledge. The conference's aim was to stimulate the flow of discourse across the main boundaries, not in the hope that one or another particular tradition would eventually subordinate the others but in the hope that cross-boundary communication among these traditions would strengthen them all.
These Proceedings, like the Conference, are truly international in scope. Europe is represented by scholars from Austria, Belgium, Bulgaria, Denmark, Finland, France, Great Britain, Greece, Holland, Hungary, Italy, Poland, Spain, Sweden, Switzerland, West Germany and Yugoslavia; other continents, by contributors from Israel and Morocco, Canada, the United States and the West Indies, and Australia. In fact, some 60 percent of the more than 150 papers read at the Conference were flown in, and as far as quantity goes the English-speaking world clearly outweighed the plurilingual Continentals. The geographical diversity of the theorists represented in these volumes is exceeded by the range of their intellectual backgrounds.
There are philosophers and linguists, logicians and rhetoricians, speculative theorists and empirical researchers, generalists and specialists - and some who are all of these combined. Many work in Speech, Communication or Philosophy departments; others came from institutes for educational research and development, colleges of Arts and Sciences, Psychology laboratories, and schools of Language or Social Studies (or their local equivalents). These scholars cultivate a striking diversity of disciplines, and favour a wide variety of professional organizations and movements, ranging from the American Forensic Association (AFA) and the Association for Informal Logic and Critical Thinking (AILACT) on one side of the Atlantic, to, for example, the Centre Europeen pour l'Etude de l'Argumentation (CEEA) and the International Centre for the Study of Argumentation (SICSAT), on the other. As well, many theorists not committed to any organization or manifesto play a major role.
The Conference objective of drawing different scholars together was clearly successful. Its goal of stimulating the exchange of ideas and insights was sought by a programme which embodied as many aspects of argumentation theory as possible. Thus, the programme had several sections, each representing either a theoretical perspective on argumentation or a major topic of study by argumentation scholars. Those papers which were suitable for publication have been included in these Proceedings of the Conference. As a result, these volumes contain a smorgasbord: something for everyone. Although the merits of other arrangements (by country of origin, or by different traditions) are undoubted, their demerits are equally obvious. We have chosen to arrange the papers thematically, more or less according to the broad outlines of the Conference programme, in order to capture by their juxtaposition in the Proceedings the exchange of ideas that occurred at the Conference when scholars from different countries and traditions rubbed shoulders in the same section of the programme.
The Proceedings are divided into three volumes. The papers read by the invited speakers, which have a more general interest for argumentation theory, are assembled in the first: Argumentation: Across the Lines of Discipline. The main themes which are already to be found in this volume are elaborated on in the second and third volumes: Argumentation: Perspectives and Approaches stresses the theoretical aspects, and Argumentation: Analysis and Practices the more practical ones.
Argumentation: Across the Lines of Discipline opens the trilogy. Its first three sections draw different perspectives on the study of argumentation. In (I) Logical and Dialectical Perspectives are the papers by the philosophers and logicians Scriven, Barth, Johnson, Govier, Lorenz and Taha.
They are joined, in (II) Rhetorical and Epistemological Perspectives, by their rhetorically and epistemologically oriented colleagues - Finocchiaro, Cox, Wenzel, Geissner, Meyer, Goodnight, Willard, Airaksinen and Parret. With (III) Pragmatic and Conversational Perspectives are added the papers of the linguists and language philosophers - Kopperschmidt, Blair, van Eemeren, Jackson, Jacobs, Kline, Trapp & Yingling & Wanner, and Fogelin. The reward for this crisscrossing of disciplinary borders may well be the relocation of the boundary lines. In any case, the reader is given fair warning by this volume's title. Across the Lines of Discipline applies also to the last two sections of volume one, where the focus is on specific topics of argumentation studies. (IV) Argumentation Analysis, Evaluation and Fallacies groups together papers by Kienpointner, Hitchcock, William Benoit, Krabbe, McKerrow, Walton, Grootendorst and Woods, and (V) Applications of Argumentation Theory contains the work of Vedung, Rieke, Tirkkonen-Condit, Paul, Weddle, and Hoaglund. In these two sections the authors deal with similar problems in the study of argumentation, but approach them with various intellectual backgrounds and from diverse starting-points.
Argumentation: Perspectives and Approaches, the second volume, contains the papers which seem to relate most naturally to the three theoretical perspectives of volume one, but they have been subdivided further and ordered differently. Thus, corresponding to section III of volume one ('Pragmatic and Conversational Perspectives') are in volume two: (I) Pragmatic Approaches, with papers by Sbisà, Benjamin, Kryk, Losier, Primatarova-Miltscheva, Carroll & Simon-VandenBergen & Vandepitte, Lundquist, van Eemeren & Kruiger, and Komlosi & Knipf; (II) Conversational Approaches, containing papers by Allen & Burrell & Mineo, Gnamus, Pander Maat, Schwitalla, Bax, Verbiest, and Pamela Benoit; and (III) Cognitive and Empirical Approaches, which includes the papers of Brandon, van Ditmarsch, Caron & Caron-Pargue, Dascal & Dascal & Landau, Völzing, Willbrand, Meyers & Seibold, Benoit & Lindsey, and Hample & Dallinger. Relating to section II of volume one ('Rhetorical and Epistemological Perspectives'), volume two contains (IV) Rhetorical Perspectives, containing the papers of Brandes, Brinton, Prelli & Pace, Alexandrova, Tindale & Groarke, Rossetti, Varga and van der Zwaal; and (V) Epistemological Perspectives, with the contributions of Caton, Fuller & Willard, Gasser, Wohlrapp, Fusfield, Gross, Furlong, van den Hoven, Weinstein, Murnion, Astroh and Gutenberg. The closing section of the second volume, (VI) Formal Perspectives, with papers by Apotheloz, Nolt, Pena, Hirsch and Brown, corresponds most closely to the first section of volume one ('Logical and Dialectical Perspectives').
Argumentation: Analysis and Practices - volume three - collects the papers corresponding to the second part of volume one (IV 'Argumentation Analysis, Evaluation and Fallacies' and V 'Applications of Argumentation Theory'). Thus there is, first, (I) Argumentation Evaluation, with papers by Ulrich, Grennan, Fisher and Schellens, and (II) Fallacies, containing the papers of Rohatyn, Wreen, Biro and Maier. Second, there are four sections of papers on applying argumentation theory. (III) Legal Argumentation includes papers on argument and law by Hynes Jr., Dellapenna & Farrell, Soeteman, Waaldijk, Seibert, Henket, Buchanan, Riley & Hollihan & Freadhoff, and Camp.
(IV) Special Fields covers different kinds of argumentation practices, varying from moral and aesthetic argumentation to argument in international organization, with papers by Garver, Manning, Bailin, Hudson, Brownlee, Berube, Tarnay, Zappel, Hazen and Walker. (V) Case Studies contains analyses of such specific argumentations as those concerning the Sacco-Vanzetti trial, the 1982 Falklands/Malvinas war, and job interviews, with papers by Bruner, Simon-VandenBergen, Kaehlbrandt, Fiordo, Schuetz, Williams, Kakkuri-Knuuttila, Smit, Adelswärd and Futrell. Lastly, in (VI) Education in Argumentation, there are collected papers concerned with the teaching of argumentation by Nolen, Siegel, Langsdorf, Makau, Marshall, Norris & Ryan, and Collison.
Each volume, Across the Lines of Discipline, Perspectives and Approaches and Analysis and Practices, is a valuable collection in its own right. Of course, these books can be read independently of one another, but just one or another alone will not suffice to get a good picture of the state of the art in argumentation theory. For that purpose, one has to read all of them. Their interconnectedness then, no doubt, becomes more distinct. This would already be a good result for the study of argumentation, for although falling short of the cross-fertilization and even the fusing of disciplines which are some argumentation theorists' dreams of the future, such an understanding is an indispensable starting-point for co-operation in the further development of argumentation theory, which is exactly the main goal of the International Society for the Study of Argumentation (ISSA), founded at the end of the Amsterdam Conference. These three Argumentation volumes are the first step in realizing this goal.

Logical and Dialectical Perspectives

Chapter 1

Probative Logic

Michael Scriven

I REVIEW & PREVIEW

1. Probative Logic and Demonstrative Logic

The classical models of reasoning provide inadequate and in fact seriously misleading accounts of most practical and academic reasoning - the reasoning of the kitchen, surgery and workshop, the law courts, paddock, office and battlefield; and of the disciplines. Most reasoning is not the deductive reasoning of the mathematician or pure logician, nor is it best interpreted as an incomplete version of deductive reasoning. Neither is it the reasoning of the pollster using inferential statistics, or of the gambler using probability theory - although, like these, it often leads to probabilistic conclusions. Nor is it the reasoning modelled by any of the attempts at a formal inductive logic, attempts which are doomed by their definition. The kind of reasoning that is most common but poorly served by the logicians will be referred to here as probative inference, and we will say that it is 'governed by' probative logic. Probative logic is not a system of rules of probative inference, for probative inference essentially transcends systems of rules; probative logic is mainly an account of probative inference. Probative logic is an approach or point of view as much as it is a set of specific procedures or doctrines, but the approach is based on a number of doctrines and generates a number of procedures as well as the rejection of some others that are widely accepted. In this paper, probative logic will be roughly located in logical space, and some of the more interesting features of this territory - and of its connections to its logical neighbors and predecessors - will be sketched in. It is intended to provide more detailed maps and the results of further mining operations in later papers.
Common reasoning often has components that are or can usefully be represented as tidy demonstrations governed by the logics of deduction and probability; the problem is that they are only components, and a completely distorted picture of the nature of reasoning results from supposing that these neat pieces are what reasoning, inference, and hence logic, is all about. One might as well suppose that all description is or could be quantitative, or that all evaluation could be reduced to measurement. 'Probative logic' is the name used here for a large area of the study and practice of reasoning that does not fit the classical model, either because it involves new territory or because it deals with the old territory in new ways. Its concerns are intensely practical; its roots are in the reasoning that was used long before logic existed - reasoning which has benefitted little if at all from the application of logic - and in the practices of a hundred trades, a dozen professions, and the life of every child and citizen. Its everyday concerns range from (certain aspects of) assessing the words used to express an observation, through (certain aspects of) assessing a proposed explanation for a disaster, to finding the right point to attack or the right way to defend the presuppositions of a political position. While probative logic focusses on particular types of practical argument - most notably, on sets of reasons that cannot be sensibly supplemented to make up a classical demonstration - it shares the general point of view or role of logic. That point of view can perhaps best be defined by exclusion and example, although only imprecisely. Each of the three issues just mentioned has a factual or 'local' component which is (usually) not the business of logic, because too specific - the logical appraisal would survive some changes of subject matter. Logic is also contrasted with the assessment of grammar or spelling (because logical comments survive improvements in those respects), more general than assessing rhetoric in the narrowest sense of speech skills or elocution (because logical criticism transcends mere variations of delivery); and more general than the subtler analyses of connotations (since they survive translation to another language). Yet practitioners concerned with the more general aspects of those areas come so close to logic that all can meet in Amsterdam and on these pages with mutual benefit. Logic, as any discipline, is especially concerned with identifying pervasive patterns in reasoning (describing etc.) that can be studied independently of specific applications, with the intent of locating fallacies or formulating rules (of thumb) to follow in order to improve practice, or for their own sake.
This propensity for generalization, while natural and appropriate to a degree, has in fact been the principal cause of the failure of formal logic to discharge its primary obligation, the clarification of significant arguments and their structure. Additionally, and traditionally, like philosophy, logic is concerned with certain areas of thought that have not yet been allotted to other disciplines but which can be clarified significantly by study of the reasoning that goes on there; the history of metamathematics and some aspects of linguistics are cases in point, and the discussion of the logical aspects of currently controversial issues in the public domain is another. The term 'probative', although unusual, is not being significantly extended here, since it means 'establishing or contributing towards proof'. But it is being restricted for convenience in making a distinction, since it will normally be used to refer to only that kind of activity that contributes towards proof which has not been traditionally covered by the subject of logic. (The justification for this is that the term is rarely used and hence can usefully be employed for the hitherto unlabelled part of what it officially describes.) The term 'proof' could be said to be appropriate here on the grounds that it refers to something regarded, in practice and the professions, as doing the same kind of job as a formal proof, especially in that it is of great value in guiding rational belief and action. If one were to reserve the term 'proof' for the product of traditional logic, one might instead talk here of para-proof or proto-proof. In either case, the resulting inferences - and, sometimes, arguments - are of a kind that is neither deductive, nor quantitatively probabilistic, but, thoughtful people normally believe, properly thought of as strongly persuasive to the rational faculty. It may be argued that the term 'probative' is being extended if we use it to cover arguments about the appropriate choice of a description or explanation, for example; surely, it may be felt, these are not really proofs. This reaction is common amongst logicians, but not justified by the dictionary definitions (OED or the much more recent Shorter Oxford and Collins). It may help to get the pill down if it is realized that such cases are included only when the context is one of 'critical appraisal', that is, a context where the issue is whether the description (for example) can be justified or shown to be correct - or at least to be defensible (i.e. as correct as any other), or to be aesthetically meritorious (if and where that can be done with objectivity). The process of 'showing' is here treated, as it is in the courts or the columns of commentators or critics, as a kind of, or part of, proof. It is significant that it is a visual metaphor, since much probative inference involves pattern-recognition. The patterns which are important to probative inference are ones often dismissed or crudely misrepresented by formal logics; they are far from the exceptionless exactitude of the universally quantified statement. Indeed, they are often not expressible at all; but one of the clues to the presence of probative inference is the use of terms like 'typically', 'ideally', 'essentially', 'naturally', and in some uses, 'approximately', and 'generally' or its informal counterparts 'most' or 'mostly'.
The antiquity of this kind of reasoning is illustrated by the way in which we continue to use the peculiarly appropriate terms 'ceteris paribus' to qualify the generalizations and 'prima facie' to qualify the conclusions. Again, these are not terms which have received any specific recognition by formal logic, being dismissed as expressions of ignorance or probability. Thus probative logic is partly defined as dealing with much that formal logic rejects as makeshift or illicit. It may be that a phoenix can arise from those ashes. The term 'probative' has connotations which seem appropriate for the proposed appropriation - probing and probable; putative, prove, and even perhaps probationary. Note that the term is not here used to refer to 'purely psychological' forms of persuasion, such as playing on the emotions when this is not rationally justified, or using inappropriate pressure tactics, drugs or torture. The contrast term for 'probative logic' is demonstrative logic, which is concerned with proofs that can be set out as standalone demonstrations - as static structures, with impeccable credentials. Probative logic can establish conclusions, but not in that way. Its principal domain consists of those situations where only the tip of an iceberg of support for certain conclusions is visible - and the accessible part of the subsurface support does not suffice for a demonstrative proof. Of course, it is not always clear in advance what will be found under the surface, and sometimes the rest of a demonstrative proof is hidden there; sometimes nothing of value. In any case, getting to this undersea underpinning is often important, though it can be arduous; and probative logic is concerned with the description and improvement of this process. This search for more of the iceberg, because difficult and time-consuming, is usually done only when and where needed. It is usually a series of specific forays into particular regions, a process of dynamic investigation, of challenge and response. More than this is unnecessary and hence has little part in practical reasoning. That more is not required in a particular case is often a matter of skilled judgement, and much of probative inference is conducted in a context rich in such judgements. The thrust and parry of discussion and debate makes sense only because of what is not mentioned at all, and there is far more of this that is relevant, indeed crucial, to the significance of what is actually said than the transcript includes. Trying to formulate the 'missing premises' (they would often be premises of a meta-argument about what kind of argument is called for) is not getting on with the job of discussion and debate unless there is just cause for doing so. If just cause of this kind were common, conclusions would be rare; we would never have time to get to them.

2. Classical Alternatives to Demonstrative Logic

The reference to processes like discussion and debate reminds us of the concepts of dialectic and rhetoric (in honorable though not unique senses of those terms); but the efforts to develop theories of argumentation under those headings have not yet resulted in an alternative that is sufficiently tangible and attractive to wean many logicians - even informal logicians - away from traditional models. That may just be a sign of obstinacy or obduracy on their part, but we here explore the possibility that a different kind of approach is needed. In particular, we need one that will wean informal logicians away from the material charms of what might be called quasi-formal accounts of argumentation - ones that depend on reference to a shadowy formal structure - as it has become increasingly clear that these accounts are Procrustean efforts analogous to those associated with syllogism theory beginning in the Middle Ages. A similar approach has long held us up in dealing with causation, and the analogy is so close that it may be more persuasive than any other arguments in this paper to those familiar with the history of attempts to analyze the concept of cause. The formalists - in that context, the neo-positivist philosophers of science - thought that a particular causal claim could only be justified when an underlying structure of laws could be exhibited from which the claim could be derived. We now understand that this was a complete blunder. While determinism may require that such an underlying structure of laws exist, scientific or historical method does not require that such laws be known in order that we can justify particular causal claims.
Causation has a complete macro logic, independent of connections to the micro world, one which enables the coroner and the mechanic and the cook - and those who train them - to proceed with their causal analyses rationally, defensibly, and often conclusively, without the slightest need for any knowledge of 'deeper' laws. This is not only practically important, since it enables the practical domains to proceed systematically without waiting upon progress in the depths, but it is of some theoretical interest, since determinism is in fact false. In the language of argument analysis, the neo-positivists did a poor job of identifying the missing premises for the 'demonstrative argument' they thought was required; it did not need the exceptionless truth of determinism, but something much weaker. They got as far as recognizing that statistical determinism would be enough, since - they argued - it would support the derivation of a high probability for the consequent of the causal claim, but in fact much less than that is required. To be precise, all that is required is that if there are micro-laws governing the macro-phenomena referred to by causal claims, they have to be laws that are consistent with the causal claims. In short, the 'missing premises' only have to be permissive, not authoritative; they are vacuous in terms of specific implications. And so it proves with reasoning in general. Here it is probative logic that represents the rules of the game of practical inference and argument, a game whose value is usually entirely independent of the possibility of finding unstated propositions to set up complete demonstrations. In both cases, there are occasions where something from the depths is useful, others where it is crucial; still others where the success of the macro-level activity depends on the existence but not the production of the hidden truth. But the canons of experimental design and Mill's Methods do not require the hidden truths in order to work; and it is those canons, and some others at the same level, that the practical reasoner must learn and follow.

3. The Expert Reasoner

Apart from the increasingly obvious failure of formal logic to provide a useful account of most reasoning, it is important to develop an account that will deal with the emergence of new results in the area of cognitive science, in particular the work on expert systems and artificial intelligence. For here we have 'models of inference' - in some sense - that are becoming extremely important in practical life, and which are described in ways that challenge some standard accounts of 'logic in practice'. This work now appears likely to provide the most important challenge to rethink the nature of practical inference so far posed, and deserves a brief digression. Probative logic is designed to cover - and is to some extent explicated by - certain processes whereby experts and expert systems arrive at and defend their own conclusions. With experts, this often involves a dialectical process of coping with objections (actual or potential) rather than a process of constructing - with the help of guesses or insights - a proof that the expert 'had in mind' as a demonstration of his or her conclusions. Probative logic reflects this feature. But there is another aspect of the matter, the matter of coping with essential incompleteness, which is moving in from the computer horizon and is already present with human reasoners-in-context. Essential incompleteness is the property of good inferences/arguments that cannot be made demonstrative by adding premises from the stock of independently known truths. (That there are such arguments - in fact, several types of them - is one of the reasons for thinking we need probative logic.)

Michael Scriven

The computer programs known as expert systems (ES) aim for a reproducible and testable demonstration of their line of reasoning, enough to justify a demonstrative proof, and for this reason they can be seen as descendants of the efforts by formal and quasi-formal logicians to handle practical reasoning. But to achieve this they often use combinatorial procedures - and sometimes a set of inference rules or data - that are too complex to be consciously used by human experts. In fact, it seems implausible to argue that systems sometimes including thousands of rules, each with a quantitative estimate of likelihood, could be kept in mind at all, let alone rigorously combined in a particular case. That is, it seems implausible to suppose that they could even be used unconsciously. ES programs should thus not, in general, be regarded as computer simulations of human mental activity but as examples of artificial intelligence (which simulates achievements but not process). Hence they are not really reconstructions of the line of thought of an arguer, in any traditional sense. In some cases, however, it can be argued that they are - intentionally or not - modelling the brain processes of the expert, at some level of description of the latter in terms of information processing. It's no longer clear whether the term 'computer simulation', in psychology, would cover this case or whether it is restricted to the simulation of mental processes, i.e. those that are accessible to introspection, possibly with skilled external assistance from interrogators and drugs or special regimens. Identity theorists in the philosophy of mind would be likely to favor the extended view, i.e. the view that thought processes are brain processes and hence that ES programs that do model brain processes are really reconstructions of practical reasoning. 
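The scale problem can be made concrete with a toy version of such a rule base. The sketch below is purely illustrative - the domain, rules, and numbers are invented, and the combination function is the MYCIN-style certainty-factor formula, one historically common choice rather than anything the present paper specifies - but it shows the kind of mechanical, quantitative rule-combination that no human expert could plausibly carry out in mind, consciously or otherwise:

```python
# Toy expert-system rule base with MYCIN-style certainty factors.
# Everything here is invented for illustration (hypothetical medical
# rules and numbers); real systems hold thousands of such entries.

RULES = [
    # (antecedents, consequent, certainty factor attached to the rule)
    ({"fever", "stiff_neck"}, "meningitis", 0.7),
    ({"fever", "cough"}, "flu", 0.6),
    ({"flu"}, "fatigue", 0.8),
]

def combine(cf_old, cf_new):
    """MYCIN-style combination of two positive certainty factors."""
    return cf_old + cf_new * (1.0 - cf_old)

def run_rules(facts):
    """Fire each applicable rule once, propagating certainty factors."""
    cf = {f: 1.0 for f in facts}
    fired = set()
    changed = True
    while changed:
        changed = False
        for i, (antecedents, consequent, rule_cf) in enumerate(RULES):
            if i not in fired and all(a in cf for a in antecedents):
                # the strength of a conjunction is its weakest conjunct
                support = min(cf[a] for a in antecedents) * rule_cf
                cf[consequent] = combine(cf.get(consequent, 0.0), support)
                fired.add(i)
                changed = True
    return cf

cf = run_rules({"fever", "cough"})
print(round(cf["flu"], 2))      # 0.6
print(round(cf["fatigue"], 2))  # 0.48
```

Even at three rules the arithmetic is tedious; at several thousand rules the claim that the expert is 'really' doing this, even unconsciously, becomes the implausible one discussed above.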
But it is not going to be possible to know whether the ES is modeling our brain processes until the brain sciences have developed considerably - that is, we will not know for a very long time, if ever. We need to avoid any commitment on this point if our conclusions are to be independent of that future research. It is suggested here that one should regard any such future investigations as only relevant to the psychological study of how human experts operate, and not to logic in any useful sense. Such research corresponds to the work of the particle physicist in our analogy with causal claims; it's interesting and it's even remotely possible that it may have some payoffs at the practical level, but its outcomes are entirely irrelevant to the validity or the presentation of practical argumentation. There are, however, some issues relevant to logic in the expert's performance that are not normally included in logic, though commonsense has never dismissed them. One concerns the justification of perceptual or 'intuitive' judgement claims by digging up explications of them when challenged. Another, not wholly distinct, concerns cases of probative inference where the arguer runs out of arguments - and of facts that are independently testable and demonstrably relevant to the conclusion in question - but where we can still turn to evidence of the arguer's track record (and to other secondary indicators, such as efforts at empathic identification). These appeals to track record can be described as meta-arguments. (Some of the fallacies - ad hominem is an obvious case - are probably best treated as meta-arguments.) Even in mathematics, home of the demonstrative

Probative Logic: Review & Preview

proof, it has always been clear that perception/intuition has had more place than the formalists sometimes suggested; to show that a proof is demonstrative, or that an iteration is exhaustive, ultimately requires the recognition of certain patterns. We rely on the fact, which we would establish by meta-arguments if challenged, that there is a very high degree of reliability in this task. But we are now entering territory where that line of argument is not so easily sold. We are familiar, from the contretemps about the proof of the Four Color Theorem, with the possibility of computers doing reasoning whose validity has to be inferred rather than directly checked. Thus meta-arguments already apply to ES and not just to human experts, as was obvious the moment that computers acquired the speed to be able to check proofs with so many steps that humans could not check them in a lifetime. We must expect the same situation to become more common as computers move to surpass us in their capacity to recognize certain patterns (they are unlikely ever to surpass us in recognizing patterns which we evolved to recognize, e.g. faces and flora). This will lead to a second type of limitation on demonstrations in mathematics and elsewhere. The computers will eventually only be able to report features - highlights - of their line of reasoning in a human vocabulary, and this will sometimes not be enough to allow the reconstruction of a full demonstration. That is the typical situation in probative inference by humans. Combined with substantial evidence from prior track record, it is certainly enough to establish that the conclusion should be believed. Is it enough to establish that the inference is sound? Clearly not in the sense of exhibiting a demonstrative proof, ex hypothesi, but, according to the canons of probative logic, it can do so in the sense of providing good grounds for thinking that the reasoning process will lead to true conclusions from true premises.
In particular, it will often be the case that the probative arguer can: (i) exhibit enough premises to cope fully with the most common failures and fallacies that infest arguments of this kind or in this context; and/or (ii) establish enough general expertise - sometimes this is little more than intellectual maturity - to make it reasonable to suppose that the remaining gaps have been bridged by the recognition of patterns in the evidence that cannot be completely expressed, let alone verified by others. This kind of argument sometimes falls into the category described as 'establishing a prima facie case', or as 'providing plausibility considerations', and those are terms which flag the use of probative reasoning; but probative reasoning can sometimes establish conclusions beyond all reasonable doubt. The human expert reasoner has mastered the intuitive/probative/skilled judgement approach; the contemporary ES program is still rule-bound. This situation is encouraged by the current requirements that it be able to report back on its line of argument and justify its request for further information - its metacognitive capabilities - e.g. by displaying the network diagram of the reasoning path. One change in the ES of the future will be a reduced extent to which that report will enable the human monitor to learn exactly (reproducibly) what is going on in the computer. At the moment, computer design is constrained by this need to justify its reasoning to a human, in detail. A better understanding of how human experts transcend this necessity in practice should facilitate moving

to the next step in expert systems. And a better understanding of how computer expert systems could, in principle, do this, can help us formulate appropriate rather than formalist demands on 'incomplete arguments'. Note that an ex post facto understanding of what happens in a computer does not guarantee the capacity to reproduce the inferences; if one of the problem-solving strategies involves generating random numbers in order to sample the space of possibilities, we would not be likely to hit on the same random number. Since computers can generate and test several million possibilities a second, the whole 'guess & test' approach becomes a much more formidable weapon for induction in the hands of computers, and 'making a lucky guess' becomes much less surprising. Furthermore, the notion of 'monitoring' this process is obliged to become a second-order process, involving meta-arguments about what actually happened. The human expert uses sophisticated gestalt pattern recognition; ES programs of the current generation can't do that - the only patterns they can recognize are those that can be reduced to rules linking relatively simple features. It is the programs' consistency and their combinatorial capabilities that sometimes more than make up for this loss of sensory sophistication, just as simple statistical models of practical predictive inference, using parameters obtained from analysis of past cases, almost always outperform human experts (see Clinical versus Statistical Prediction by Paul Meehl, University of Minnesota Press, 1954). While Dreyfus & Dreyfus (Mind over Machine, Prentice Hall, 1986) see the gestalt point as an essential limitation that will always preserve a superiority of the human over the computer, it is as well to remember that it is also the cause of the human failure in these head-to-head comparisons. The 'blooming buzzing confusion' of perceptions overwhelms the need for consistency in the combinatorial process.
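The 'guess & test' approach can be sketched in a few lines. The particular problem below - writing a number as a sum of two squares - is an invented stand-in for any generate-and-test search; the point is the shape of the loop, not the problem:

```python
import random

# A minimal 'guess & test' loop: generate random candidates and keep
# the first one that passes a cheap test. A machine can run millions
# of such trials a second, so 'lucky guesses' stop being surprising.

def guess_and_test(n, trials=1_000_000, seed=0):
    rng = random.Random(seed)   # fixing the seed makes this run reproducible
    bound = int(n ** 0.5) + 1
    for _ in range(trials):
        x = rng.randrange(bound)
        y = rng.randrange(bound)
        if x * x + y * y == n:  # the cheap test
            return x, y
    return None                 # the lucky guess never came

pair = guess_and_test(25)       # finds some (x, y) with x**2 + y**2 == 25
```

Note that reproducibility here depends entirely on recording the seed; monitor a run without it and, as argued above, one is reduced to second-order meta-arguments about what must actually have happened.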
Thus, another way in which ES programs can easily become more sophisticated involves doing their own validation of empirical rules from the data instead of mining them from the experts. The other way, adumbrated above, is to go on to the realms beyond rules, the realms of probative inference and pattern recognition, and thence of new concept formation; they will then only be able to hint at the reasons they have for their conclusions (or for thinking that further information of a particular kind would be valuable). At that point they will be closer to simulation of human mental processes than they are now. They will be much more competent in more areas - but in one sense less accountable. When asked to explain their inferences, they will report on the reasoning network they have used, as now - but the network may be one that humans can't follow, linking patterns that we can't recognize with connections that we couldn't make. Or, it may be comprehensible but incomplete, in the same way that is already common with the reports from human experts. Bridging the gap is sometimes but not always possible and sometimes but not always essential. To make a reader enjoy the story and to take the criminal to court we have to be able to do better than quote the intuitions of a Hercule Poirot. Probative logic, like expert systems, covers the processes of arriving at conclusions in practical argumentation as well as the processes involved in justifying them, and hence does not recognize an unbridgeable chasm between the 'context of discovery' and the 'context of justification', nor between induction

and deduction. ES programs use either or both 'forward chaining' (which goes from data to hypotheses) and 'backward chaining' (which goes from goals, including hypothesis verification, to truth values) - and sometimes iterations that involve both - in order to reach conclusions. But each of these processes begins with propositions rather than observations, unlike much expert reasoning. Moreover, this is not the deduction vs. induction dichotomy, nor the discovery vs. justification dichotomy; logicians need to look carefully at it, since it is a new and pragmatically justified model of inference. Probative logic represents, in part, an attempt to develop a framework into which the operation of ES programs will fit better than they do into standard logical categories; but in part, probative logic goes beyond the conceptualization of inference used by ES designers today.
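The two chaining regimes just mentioned can be sketched over a toy rule base. The rules below are invented for illustration, and the sketch omits what real ES programs add - likelihood estimates, requests for missing data, iteration between the two directions:

```python
# Forward chaining runs from data to conclusions; backward chaining
# runs from a goal back to the data that would support it. Both start
# from propositions, as noted in the text, not from raw observations.

RULES = [
    ({"rain"}, "wet_ground"),
    ({"wet_ground"}, "slippery"),
    ({"frost", "wet_ground"}, "ice"),
]

def forward_chain(facts):
    """Fire every applicable rule until nothing new can be concluded."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= known and consequent not in known:
                known.add(consequent)
                changed = True
    return known

def backward_chain(goal, facts):
    """Is the goal given in the data, or derivable via some rule?

    Recursion terminates because this toy rule base is acyclic.
    """
    if goal in facts:
        return True
    return any(
        all(backward_chain(a, facts) for a in antecedents)
        for antecedents, consequent in RULES
        if consequent == goal
    )

derived = forward_chain({"rain"})   # {"rain", "wet_ground", "slippery"}
```

Forward chaining here generates everything the data supports; backward chaining tests one hypothesis at a time - which is exactly why the pair fits neither the induction/deduction nor the discovery/justification dichotomy cleanly.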

4. The Relation of Probative Logic to Informal Logic

Probative logic (PL) is intended to be in part a successor to and in part a supplement to common versions of 'informal logic' (IL), the name used here to refer to the important recent efforts at improving on the limited utility of formal logic for the analysis of common forms of argumentation. PL looks in detail at some concepts that are important in IL but which have not yet been given much attention there; but it also goes on to develop a broader perspective, an account which is supposed to apply to a wider range of cognitive processes. These range from informative perception, problem-solving and language translation, through knowledge representation (classification, stereotyping, idealization, approximation, summarization, caricature, graphics, data structures, and other data compression techniques), to interactive discussion of various kinds. About the latter it stresses the importance of the range illustrated by, and the distinction between: debate, brainstorming, auctioning, suspect interrogation, candidate interviews, expert (and other) witness cross-questioning, and oral testing of students, each with its own protocols and hence its own standards of plausibility reasoning. Now, IL is not just a mechanic's minimal toolkit, it is a wide-ranging technology of proven practical value, including limited 'theories' - they might perhaps more accurately be called general procedures supported by limited rationales - such as fallacy theory, assumption theory and what we might call argument graphics (e.g. modified Venn diagrams and tree diagrams). Nevertheless, PL extends the choice of tools in the toolbox considerably, to cover the topics just mentioned, and others; but it does something else as well. For IL often strikes potential users as a technology without a science, and, as such, seems incomplete.
Such observers are then particularly vulnerable to exactly the kind of academic confidence trick that has led many people to think of physics as the science from which technology flows, and hence to think that increasing expenditures on physics (and perhaps chemistry and biology) labs, and teaching, is the appropriate way to improve technology. In our area, the analogous fallacy is that of falling back onto formal logic as a foundation for teaching the practical

techniques of reasoning - or even for conceptualising them. An example would be the definition of 'assumption' (or 'missing premise') as 'a proposition whose conjunction with the stated premises - and possibly with other assumptions - would convert an argument into a deduction'; serious argument analysis requires a concept that is both more and less than this, and which varies enormously depending on the type of investigation. The most obvious point is that one has to avoid 'trivial' conversion of an argument into a deduction, e.g. by adding the conclusion; reflection on what counts as 'trivial' is the first step towards understanding the size of the gulf between IL or PL, on the one hand, and formal logic (FL) on the other. And perhaps one could put one difference between PL and IL by saying that PL treats the cases where the assumptions would convert the argument into a deduction as special cases, not to be expected; the normal case is where the assumptions are correctly called 'further reasons' - they must also meet other conditions, in order to be assumptions - and the set of premises including the further reasons does not allow the derivation of the conclusion, even with probability.
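The deductivist definition, and the triviality problem it runs into, can be made concrete with a brute-force entailment check. The example argument - 'if it rained, the ground is wet; so the ground is wet' - and the code are illustrative inventions, not drawn from the paper:

```python
from itertools import product

# Brute-force test of the deductivist definition of 'assumption':
# a proposition whose conjunction with the stated premises converts
# the argument into a deduction. Atoms: p = 'it rained', q = 'the
# ground is wet'. Propositions are functions over truth assignments.

def entails(premises, conclusion, atoms=("p", "q")):
    """premises |= conclusion, checked over all truth assignments."""
    for values in product([False, True], repeat=len(atoms)):
        env = dict(zip(atoms, values))
        if all(pr(env) for pr in premises) and not conclusion(env):
            return False
    return True

stated = [lambda e: (not e["p"]) or e["q"]]   # the stated premise: p -> q
conclusion = lambda e: e["q"]

# The argument as stated is not a deduction...
print(entails(stated, conclusion))                       # False
# ...adding the substantive assumption p closes the gap...
print(entails(stated + [lambda e: e["p"]], conclusion))  # True
# ...but so does adding the conclusion q itself - the 'trivial'
# conversion that any serious definition has to rule out.
print(entails(stated + [conclusion], conclusion))        # True
```

The check exposes the problem: by the letter of the definition, the conclusion itself always qualifies as a 'missing premise', which is why the definition needs exactly the non-formal supplement the text describes.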

5. Engineering vs. Physics as Sources of Technology

It's worth looking in a little more detail at the line of reasoning that seduced people into thinking that technology 'depends on' or derives from physics. Historically, of course, the facts are exactly the reverse; it was technology from which much of physics came. It is interesting that such a simple error could have been so widely accepted as the truth; it tells us something about the naivete of much of our thinking about the relation of theory to practice. Similarly, it was reflection on the rough-and-ready 'rules' of practical argumentation - for example, the tradecraft of the Sophists, and the 'principles' of the rhetoricians - that led to much of logic. And, more recently, it has been a re-examination of everyday and professional argumentation - and the principles that are said to govern it, such as the avoidance of fallacies - that has led us to reconsider and eventually reject formal logic (whether syllogism theory or modern symbolic logic), and classical fallacy theory, as adequate accounts of argument. The new IL did not come from the 'science' of formal logic, it came from reconsidering the practice of reasoning. Recent work on the nature and modern history of technology has made it clear that much of it still comes from what has been called 'inspired tinkering' in the rewriting of the history of the Industrial Revolution. Even when that is not the case, it is engineering rather than physics which fathers and fosters most of the remaining technological breakthroughs on the physical side (a similar argument applies to biotechnology). Physics, like logic, became increasingly concerned with its own (legitimate and fascinating) questions and drifted away from technology, which (along with philosophy) gave birth to it. It soon lost its utility as a theory or progenitor of technology, just as formal logic soon lost its value as a theory of argumentation.
But the engineering that built the pyramids, the Roman viaducts and the Carthaginian siege engines was there

to fill in the gap. Today we need a kind of logical engineering, not the logical physics of formal logic, to provide us with a pedagogically and intellectually useful backup to the practice of reasoning. We need to remember, too, that the place of inspired tinkering has not gone and will never go; that is, many of the new technologies of argument analysis will spring out of specific analyses of practice, not from theory at all. (An example of this is the 'fallacy of statistical surrogation' which will be described in the Fallacies issue of Argumentation.) In a way, what we need is the face of the coin whose reverse is the fallacies approach. The fallacies approach is specific and can be very useful (when the fallacies are redefined in the way that Charles Hamblin pioneered) - but it is essentially negative. It needs a better half, a more positive half. Probative logic is intended to provide that better half, and indeed to display the fallacies side of the coin as an application of its 'engineering' principles. While largely programmatic, this paper is intended to include enough details to prove the existence and show the size and riches of a logical continent that still needs detailed mapping; a sketch map follows.

6. The Probative Logic Game Plan

It is clear from what has already been said that probative logic - when working in the argument analysis mode - has a somewhat different approach or point of view from much of IL, and a fortiori FL. Probative logic focuses on the argument-as-given rather than on the attempt to reconstruct it, and allows some ways to present and support arguments that even IL excludes. This makes it harder to demonstrate that a fallacy has been committed, easier to determine that some truth has been established or made plausible. But what are the 'engineering principles'? They involve a reconceptualization of the key notions of practical logic - notions such as 'definition', 'explanation', 'grounds', 'evidence', 'fallacy', and 'proof' (in toto, about twenty concepts). This change is designed to eliminate the formalist elements still found in most IL textbook definitions of such concepts, and replace them with a radically 'subjective' alternative account, according to which each is understood in terms of - speaking loosely, defined in terms of - two basic concepts: (i) the 'subjective' notion of information that is developed from the evaluative use of 'informative', meaning 'useful new knowledge'; (ii) the 'subjective' notion of information structures, meaning 'ways of organizing one's knowledge so as to meet retrieval needs and demands'. (This second notion is the key to analysis of explanations and understanding, by contrast with stating and knowing.) Prototype analyses of this kind have already been published as part of the philosophy of science literature (e.g. Scriven, 'The Concept of Comprehension: from Semantics to Software', in Language Comprehension and the Acquisition of Knowledge, eds. R.O. Freedle and J.B. Carroll, V.H. Winston & Sons, Washington D.C., 1972). The key concept in engineering, the concept that most clearly distinguishes it from the sciences, is usually identified by engineers as 'design'.
Now there can be no such thing as a good design without a design problem, which normally comes from a client but may be proposed by a teacher

or an engineer in an exploratory mode; it must propose a goal which translates as specifications to be achieved. For probative logic, the key concept is 'informative' and nothing can be informative unless there is someone who is to be informed. Without specifications from and of that person, group, or hypothesized audience, logical analyses cannot be judged and indeed cannot be sensibly performed. Thus the logical notions are relativised to context and audience (though not in the more linguistic way that characterizes the speech act analysts). This approach provides a 'logical engineering', a positive conception of argument, inference, and reasoning in general, in terms of which we can fit the fallacies in as the reverse of sound argument, as malfunctions of the main process.

7. The Logic Stakes

These issues about the kind of subject needed to 'support' - that is, to explain and extend - sound logical practice are obviously not just academic issues. We are now pouring resources into teaching informal logic of one kind or another, even legislating to require that it be part of secondary or tertiary education, as in California. It is instructive to look at what is happening in those states which are thus trying to boost their 'technological' performance in these high-technology times. All too often we find that they have been seduced into thinking that teaching the equivalent of physics is the solution, and we find that the content of the courses, at least the conceptual content, is either formal logic or something very close to it. One can expect little benefit from that approach; if there is any transfer to practical reasoning from formal logic, it has yet to be demonstrated. Formal logic is, as far as we know, simply an intellectual chess game with no payoff for either practical argumentation or, more sadly, for the other pure and applied disciplines. Even computer programming, which in a way sprang from formal logic, has long transcended it for payoff and matched it for intellectual challenge; and has not been significantly helped by those who could so easily have made it a major specialty. We who are interested in teaching or understanding argumentation, everyday or specialized, must now go elsewhere. Informal logic, on the other hand, if kept close to the payoff territory of important arguments, is in no danger of shortchanging the customer, even if we accept McPeck's view that we have to be teaching some content material (or nothing useful). If no-one else is bothering with this content, and it's important, that's nothing for which we need apologize.
But of course we are doing more than this; we are teaching the proper use of the large and useful vocabulary of logical criticism that is pervasive in our language and life, and that means we are teaching people how to do and report on logical criticism. We are also teaching the metacognitive skills that are now thought to be crucial in most areas of problem-solving (and even in beginning reading), and are obviously crucial in logic. Even so, to attract more supporters and to satisfy the intellectual curiosity of adherents, IL needs to identify, advertise and protect a respectable genealogy, a decent engineering ancestry rather than a choice between bastardry from the bed of the formalists or a tinker's trade.

The trouble with giving technology the wrong parents is not just inefficiency. It is in some respects seriously counter-productive. In particular, it fosters the insidious idea that those students who are not good at mastering highly abstract theories and formalisms are not likely to be major contributors to technology, whether it be the technology of reasoning or medicine or war. With respect to hardware technology, physicists tend to foster this kind of belief for the obvious reasons - it is ego-supportive, it contributes to their budgets and power, they can think of some good examples of physics generating technology (e.g. nuclear power) - and they know next to nothing about the recent conceptual revolution in the disciplines of the history and logic of technology which have exposed this misrepresentation. In our field, the formalists suffer from most of the same rationalizations in claiming parenthood or ownership of undergraduate logic. The cost of this mistake must be counted not only in terms of failure to get improved practice, but also in terms of unnecessarily turned-off students and undiscovered talent. We should be involved in the recognition, appreciation, and multiplication of the 'inspired tinkerer' in reasoning - the individual who instinctively locates the jugular of an argument, who has a fine sense for the most powerful reconstruction in face of threat, a neatness in formulating the definitive counterexample or argument diagram. These people are not necessarily - and possibly not commonly - good formalists, any more than the best programmers are commonly good mathematicians or the best architects good physicists. They are nearer to being good writers or perhaps good historians or journalists or interviewers; or barristers or critics or managers. 
At the moment there is a tendency to assign people to teach IL who did well at formal logic, and they fall back on teaching the territory they know best and identifying as the best students those who do well on that turf. This vicious circle needs to be broken and it can only be broken if we can provide a complete intellectual package for the teacher, a package that contains a general account as well as specific techniques. So we need to develop an alternative genealogy, if the obvious charms of formal logic - precision, abstraction, tradition, prestige - are to be offset. Probative logic attempts to provide this via some engineering to underpin the IL technology, first by showing that the engineering is missing. In this way it tries to provide the self-concept that IL needs in order to crystallize a separate identity from formal logic, the physics of argumentation. This might be called the ideological need to which probative logic is a response. Later, we'll look at the technical need.

8. Other Alternatives

Powerful antidotes to formalism and supplements to IL are offered by other approaches, such as Richard Paul's 'critical thinking in the strong sense', or Ennis and Siegel's 'critical attitude' or 'critical spirit'. Even they would be strengthened by alternative structural models or paradigms of the larger enterprise to which they contribute. It is easy to demonstrate that skill at deep criticism does not

follow from any (known or likely) set of rules; and just as easy to show that students can learn to improve their first-level skills in identifying (in others) and avoiding (in themselves) the more seductive and interesting fallacies without getting into the realm of deep criticism. So, good critical thinking involves more than mere techniques, but teaching the techniques is valuable even if you can't teach - or the students can't learn - the other skills. And deep criticism is also improvable, by the good teacher, even if not by means of learning formulas. Probative logic can't provide rules to generate insights, but it can go beyond the traditional reactive role of logic and help to stimulate certain kinds of (relatively low-level) insights in the same way as the best of the new 'idea processors' (e.g. MaxThink for the IBM PC). An example of this could be developed using 'eliminative inference' and the 'exhaustive list' discussed later. There is no sharp line between the teaching of systematic and disciplined approaches to argument analysis, creation and display, and the teaching of insight, just as there is no sharp line between algorithms and heuristics in the programming of computer expert systems or the instruction of mathematics students. Nevertheless, it is essential to emphasize that of all the debates about the nature of logic occurring around the emergence of IL, the one with the greatest significance for society, and for the development of thought itself, concerns the importance of the 'critical spirit'. Even small contributions from the technology or engineering of PL to successful pedagogy in this area acquire significance because of the importance of the cause.
There remains the ugly fact that most current evaluation of social institutions and political positions is superficial to the point of triviality or invalidity in a way that cannot be cured by teaching the techniques of practical logic; here the challenge to us all is to teach, with the help of those techniques, the critical spirit itself. Allegedly serious discussions of current hot topics - drug laws, workfare, prison conditions, theism, the level and types of 'defense' expenditure, homosexuality, farm subsidies, immigration control, offshore military activities, pornography, infanticide, or suicide - frequently fail to identify the most crucial and questionable of the assumptions on which a position rests; or the critical alternatives to which it should be compared; or the agents who are ultimately responsible for the weaknesses or strengths of an approach; or the opportunity and other costs involved in its adoption/rejection; or the most significant of the possibilities or implications or potentialities or generalizations of the approach; or, for that matter, the known relevant and significant evidence. Consequently, they often fail to see that there is a significant reasonable alternative. There is no rule which avoids this mistake - in the Watergate affair, The New York Times and almost everyone else made the mistake and only The Washington Post and a very few commentators avoided it. Only a critical spirit, great imagination, considerable courage, and a large repertoire of experience - personal or vicarious or communal - makes it likely that you will be the first to see through a major new deception like that one - and you still need to be lucky. Despite all that can be done to make critical assessment more scientific, or at least to ensure it is better engineered, originality in that great endeavor, as in engineering and science themselves, will in the end depend on great insight. Thus highly original critical thinking often requires

great creativity, in order to see alternatives that others have missed. But systematic thinking about these issues - everyday critical thinking - can be taught, habituation to the idea of criticizing the establishment view can be taught, recognition of the still-common fallacies can be taught, and thus the frequency of success can be brought up from almost zero to a respectable level. You really don't have to provide general theories of logic or argument to improve either technique or deep criticism, but the idea that providing some kind of general theory is an obligation for the educator dies hard, so formal logic gets dragged in. A better step is to tie the techniques to an alternative conception of argumentation from that provided by formal logic; given that, the teacher and the student can avoid seeing the alternatives as something 'scientific' vs. something lower-class because lacking in explanatory capability - something 'from the mechanic's toolbox'. Thus engineering is important because there is a kind of intellectual imperative which demands some kind of theory. The sophists must have had a useful slice of IL technology in those early days, but they lacked the theory that Aristotle provided (and perhaps the ethics that Plato demanded). Though we have no reason to think that Aristotle's logical theory made him any better at practical argumentation than the sophists, it was hardly a close race for a place in intellectual history. The theoreticians may be right or they may just be snobs, but in either case it helps to offer them something they can identify as much more than a toolkit - and not just more in the sense of going deeper into presuppositions and political implications. They need at least a paradigm and preferably a full-fledged account, even if it is 'only' at the engineering level. It seems clear that going to anything more abstract is less useful.
In recent years we have seen some notable efforts to provide alternatives that might be called revisionist, as well as some attempts to reach out further (besides those mentioned). In the revisionist group one might include many-valued and inductive logics, fuzzy set theory and relevance theory as particularly interesting examples, each of them offering at least some possibility of breaking one or two shackles of previously existing versions of formalism. In the more radical group, Stephen Toulmin's The Uses of Argument (Cambridge University Press, 1958) was an outstanding example of an attempt to redraw the nature of reasoning-in-context. Perelman and Olbrechts-Tyteca's The New Rhetoric: A Treatise on Argumentation (English translation, University of Notre Dame Press, 1969), though its specific recommendations are less clearly formulated, provides another. A third comes from the speech-act approach of Austin, then Searle, and now the Dutch group including our hosts van Eemeren and Grootendorst; and Michael Polanyi's work on 'tacit knowledge', with its emphasis on the perceptual or intuitive transcendence of formalism, provides a fourth. Something important can be learnt from each of them. Nevertheless, it seems that more is needed. A 'full-fledged' theory of probative logic, like one of fallacy theory or explanation theory, surely has to draw heavily on information theory and cognitive/perceptual psychology; there is some hint of this in earlier pages.

Michael Scriven

But a simple paradigm may be worth having immediately, since it can help guide us through the detailed examples that must antedate any general theory - help us to avoid the haunting thought that all of them could 'really' be reconstructed in formal logic. We need to see from the start that they can be better portrayed in terms of another approach. This paper did not begin with a detailed example which cannot be handled by the use of existing logics, because it is first necessary to soften up the resolve to force everything into those moulds. The appearance of positrons in cloud chambers, like epicycles in planetary motion, was happily absorbed into the current Zeitgeist until an alternative overall perspective was made plausible by arguments that did not depend solely on the exhibition of counter-examples.

II SKETCHES AND SPECIFICATIONS

9. A Metaphor for Probative Logic

Paradigms usually center around some key metaphor, though it is often refined or radically revised in later phases of the same development. Sometimes the point of a paradigm is expressed in terms of a pair of contrasting metaphors - the geocentric vs. the heliocentric models of the solar system, the particle vs. the wave models of the electron, the evolutionary vs. the teleological theory of speciation. In this case we will use as an illustration of alternative structural models the contrast between the iceberg and the oil rig, as two ways to support something above the surface of the sea. We can think of the above-surface part of either as corresponding to the spoken or written premises and the (possibly implied) conclusions of an 'incomplete' argument. The approach corresponding to the oil rig - in this case, an offshore drilling platform attached to the sea floor - is demonstrative logic, and it assumes that the visible argument is secure only if there is an underlying structure connecting it to the security of the ocean floor - a structure, moreover, that conforms to the same design principles as above-ground structures. The iceberg paradigm suggests that probative logic favors a completely different method of support for its conclusions, albeit without breaking any laws of nature. The iceberg has no connection to the bottom of the sea at all, and its survival is not threatened by sudden increase of depth (longer chains of inference from the axioms). The iceberg lacks even the right kind of shape for supporting its superstructure in another medium; once removed from the sea, it can easily roll straight over. What the iceberg does is to rely on its special context, the medium it inhabits, to provide it with the needed support; it does not look to the ultimate contact with earth.
For the oil rig, that ambient medium is the enemy, the cause of disasters, something to be overcome; for the iceberg, it is the means of survival - for a reasonable length of time. The probative inference is a creature of its context, a polymorphic natural entity, not a highly structured intrusion; something with a useful life but not aimed at indefinite longevity. As to investigating the security which a particular iceberg provides for its visible portion, this cannot be done by looking to see if there is a simple regular structure beneath it, constructed using the laws of statics and the known strength
of materials. In evaluating the stability of icebergs, it is hydrostatics and hydrodynamics rather than statics that must be obeyed, temperature not pressure that is important to survival, and the short run rather than the long run that defines the range of confidence. An iceberg is not an oil rig in disguise, and teaching students of icebergs how to put oil rigs under them is not equipping them for survival. An iceberg can be checked for certain serious possibilities of danger that the expert eye detects - not for all such possibilities - by making specific investigations. One looks at the fall lines, the contours, the crevasses, as indicators of subsea structure and of morbidity; but even those checks are tentative because of such differences as: the differential melt rates above and below the surface; environmental temperature and humidity gradients due to currents and winds; the anisotropy and variable specific gravity of ice itself; and the unpredictable and powerful gross weather effects. About the only way one can get comprehensive answers about iceberg stability, other than for very simple shapes, is by constructing a scale model - which we know in advance is a serious oversimplification, since it ignores all the factors just mentioned. Even to gather the data for the scale model would, typically, be extremely difficult; and it might well take so long to model a big berg accurately that melting would make the data obsolete by the time it had been assembled. An iceberg is an evanescent natural object, not a semi-permanent artefact. A probative inference is also a natural phenomenon, viable in one context but not another. You find out about its strength by specific search and skills, not by requiring that its hidden parts provide the kind of structure that you would put there if you had to build an oil rig to support the same upper surface. So much for a metaphor to hang one's hat on. Can it be filled out with some iceberg engineering?

10. The Technical Need for Probative Logic

Probative logic addresses three major difficulties that arise in applying conventional logics to a wide range of informal argumentation. These constitute the main reasons for the failure of formal logic to assist with practical argumentation. They can be described as the difficulties posed by incompleteness, imprecision and insecurity. (These terms are the ones that would be used by the formalist; as we will see, they refer to features of real arguments - descriptions, inferences, etc. - that are not only inescapable but not without their own virtue.) We have already referred to the first of these, and our metaphor suggests that the entire approach to so-called 'incomplete arguments' should be switched from trying to discover or construct underpinnings that look like an oil rig built to standard specifications, to seeing whether any of the specific sources of doubt apply that are occasioned by the specific shape of this argument, treated as an iceberg. A good argument, in these terms, only has to float; it doesn't have to stand up. Even if it turns out that it will stand up, we may still need to look to its security, depending on what threats impend.


In fact, it only has to float today, in this weather and sea, not in all future storms, in order that it support its superstructure. Thus, one of the signs that we are dealing with an iceberg argument is the use of terms like 'now' or 'here' or 'looks like that' in the discussion of the inference - either from the protagonist or the antagonist - terms which allow some immediate checking but none later. And many of these terms reference things that can only be recognized as relevant or decisive by someone with special skills, the Aleuts or Inuits of the iceworld; again, a feature that cannot be built into premises that will be acceptable in a formal reconstruction. But they refer to facts that are crucial to the inference, and are entirely testable now, if the testing is done by the right people. There is no loss of objectivity or validity from these requirements, only of the permanence of the testing conditions and the generality of the tester's skills. The formalist has sold us on the idea that in a real proof, once stated in full, it is obvious to anyone that each step follows from the previous one. That's certainly false; what we actually use is 'obvious to a fairly smart and experienced reader of this kind of symbolism'. Thus even there, and - common sense would suggest - in general, the operative requirement is only that the inference be 'obviously sound to someone whose skills are demonstrable', be it a human or a computer. Notice how close we come here to perception as the replacement for oil rig inference; this is a recurrent theme in probative logic, where evaluating and unpacking perceptions is as common as unpacking the subtleties of highly emotional language. Inference develops into perception with the growth of experience or expertise, and hence any approach which claims to be able to deal with one must be able to trace its encapsulation in the other. 
This is no sudden process; the demonstrative stepwise inference becomes a process of skipping steps, then of sliding, and then of seeing. Probative logic can often reconstruct evidence for the conclusions of the latter two processes just because it does not demand that the forced standards of quasi-formalism be met. When someone identifies a species of bird from a fleeting glance as it flashes across a forest glade, one is imposing rather than discovering structure if one insists that this was a conclusion inferred from features. Yet, in defending the claim, its author will resurrect those features - and other facts - from memory and argue for the identification as if it had been the conclusion of an inference. And this is entirely appropriate, even though she is in fact defending a (claimed) perception rather than an inference, since the aim is now to convince others. And, for what it's worth, the brain processes - perhaps still, and certainly during the learning process - followed these steps. (Another way of looking at the difference between the context of justification and the context of discovery.) One aspect of the 'imprecision difficulty' is an extension of the incompleteness problem. References to 'the way the sky looks now' are incomplete for purposes of demonstrative argument reconstruction - though in fact acceptable for probative argument - but a formalist might also describe them as imprecise. There is a larger family of imprecise terms, however, terms like 'most' and 'many' and 'more' and 'nearly all', converted by the syllogisters and most of the modern formal calculi into the absurdly inadequate dichotomy of 'all' or 'one or more'.


To accept such a conversion is to throw away most of their meaning and thus to invalidate many good arguments using them. Geach, in Reason and Argument (Oxford, 1976), reminds us how narrow-minded even twentieth-century logicians have been, with his example of the failure to follow Sir William Hamilton in recognizing the inference from 'Most As are Bs' and 'Most As are Cs' to 'Some Bs are Cs'. One might refer to the study of this area as 'the logic of weak generalization'. There are vast realms of the imprecise beyond this narrow but important corner; all the subtleties of the informal probabilities, for example, which relate to the distinctions between 'just possibly', 'possibly', 'quite likely', 'very likely', and 'certainly' - and many more, even along that one dimension; then there are the size terms, from 'tiny' to 'colossal', each completely non-substitutable for any other, in certain claims in certain contexts. And so on. This much one might expect from a chapter in a comprehensive future book on informal logic; thus in part, PL simply adds to the repertoire of IL. But probative logic includes a general element missing here - the emphasis on analysis of the context which so often anchors one or another element in these floating scales. Here the problem of incompleteness combines with that of imprecision. We have never really worked out the engineering to deal with context-dependence. We might be able to cope with simple examples like 'Very small elephants are much larger than very large mice', but not the harder cases like 'Good thieves are worse than bad thieves' or 'Good cars have nothing in common with good bridge hands'. The discussion of this kind of point with respect to the term 'certain' was highlighted 44 years ago by Norman Malcolm's telling 'Certainty and Empirical Statements' in Mind (1942, pp. 18-46) - Toulmin was doing an analogous analysis of 'possible' in lectures at Oxford in 1950 - but it has not really been extended or tied to an engineering-level account of language in context. Law courts sometimes get into this area when dealing with evidence of libel or conspiracy, but their level of analysis there is, unusually, still quite simplistic. Literary exegesis often overdoes the subtleties here, and is not a good paradigm for practical argumentation; propaganda analysis and communication studies have a spotty record on getting the details right, but are certainly hunting in the right place. Recent speech act theory offers a new chance to get a general engineering account going. But it is sometimes built on sand, for even the experts are often wrong in doing context analysis; what seemed an obvious assumption to Grice, analyzing a quotation out of context, would not have been an assumption at all in a different context, one that hadn't occurred to him. From the probative point of view, what is said is only a first indication of what is meant, and the first tools to teach the student are the tools of inquiry to pin down meaning, before we get to the tools of reconstructive surgery, or those used in classifying an argument as valid or invalid. Where the author is not available, we must have recourse to the construction of what we can call PLOTs (for Possible Lines of Thought), with little commitment to the claim that one of these was in the author's mind (i.e. that they are truly reconstructions), but with some interest in showing that one or none of them would salvage the conclusion or 'the' argument (actually, a related argument).
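The Hamilton inference about 'most' mentioned above is not merely plausible; in finite cases it can be checked mechanically. The following sketch (in Python, an anachronistic but convenient notation not found in the original text; the function names are my own) verifies by exhaustive search that two strict majorities of the same class must overlap, which is all the inference needs:

```python
from itertools import product

def most(xs, pred):
    """'Most Xs satisfy pred': strictly more than half do."""
    xs = list(xs)
    return 2 * sum(1 for x in xs if pred(x)) > len(xs)

def hamilton_holds(n):
    """Check every way of marking the members of an n-element class A as
    B or not-B and as C or not-C: whenever most of A are B and most of A
    are C, some member of A is both B and C (so some Bs are Cs)."""
    memberships = list(product([False, True], repeat=n))
    for b in memberships:
        for c in memberships:
            if most(range(n), lambda x: b[x]) and most(range(n), lambda x: c[x]):
                if not any(b[x] and c[x] for x in range(n)):
                    return False  # would be a counterexample
    return True

# Two strict majorities of a finite class must intersect (pigeonhole),
# so the pattern is valid - unlike its 'all'/'some' caricature suggests.
print(all(hamilton_holds(n) for n in range(1, 8)))  # True
```

The same search would equally confirm the limits of the pattern: 'Most As are Bs' and 'Most As are Cs' do not entail 'Most Bs are Cs', which is exactly the kind of discrimination the dichotomy of 'all' or 'one or more' throws away.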


In the absence of the author, the context can sometimes provide enough clues to pin down meaning; but the general emphasis in probative logic shifts from blaming to investigating. Even when the author is present, what happens when the challenge is laid down is that the author thinks up answers, and these are not always properly thought of as something the author had in mind when the argument was put forward in the first place. Thus a typical investigation is creative and not just analytic; and using the PLOTs terminology rather than talking of argument reconstruction and 'missing premises' reflects this shift to the truth-finding rather than the argument-grading mode. The insecurity of the formalists about imprecision and incompleteness was nowhere more pronounced than in the specification of meaning. The formalist model of definitions as perfect translations not only got them into the well-known difficulty arising from substitution in the definition itself; it also has almost no real cases outside the domains of algebra and recent neologisms. In the real world, we specify meaning by the use of examples and contrasts and analogies or explanations, filled out to the degree that our audience needs; a perfect example of the context-dependent and effective processes of probative logic. Where oil rigs are impossible, the iceberg floats safely on. The neo-positivists gave lip service to something like this with occasional references to 'implicit definition', but of course what they were talking about was meaning embedded in axioms, not meaning embedded in the non-linguistic context. For them, the extension from syntactics to semantics was a significant advance (though to most readers it looked like more of the same); but going on to pragmatics was 'giving up on logic'. From the point of view of probative logic, if you do not take that step you never reach the domain of useful logic; 'pragmatic' reverts to its original meaning and becomes a term of approval.
In the long debates about explanation theory, for example, not only Hempel and Feigl but Braithwaite and Popper thought that any reference to the level of understanding of those to whom an explanation was directed was 'mere pragmatics', completely out of place in a logical analysis; for Campbell, Dray and Scriven (amongst others), any account which left it out - indeed, any which treated it (or something equivalent to it) as less than the key notion - could not even qualify as a logical analysis. In later work on the other components of the grammar of science, it became increasingly obvious that virtually every concept that had looked quite suitable for formal analysis - not just 'proof' and 'law' and 'explanation', but 'prediction' and 'probability', 'observation' and 'information', 'theory' and 'refutation' - was in reality 'essentially subjective', the neo-positivist epithet for any analysis that involved essential reference to the state of knowledge or understanding of some actual or potential person or role-marker in a presentation or interaction. Thus the roots of the present debate about logic spread far into the domains of philosophy of science and of probability theory, and await development in the philosophy of quantum mechanics and relativity, where the issue of 'subjectivity' is crucial to the fundamental concepts of measurement, absolute space and simultaneity. What is happening in logic is just one aspect of what might be called either the 'objectifying of subjectivity' or the 'subjectifying of objectivity'.


The development of cognitive psychology, and the computerization of pattern-recognition and expert systems, have made the path easier than it was in the '50s, though the arguments have changed little. The fact is that we now have to include the information content of the environment - and the people in it - as parameters in our logical analysis that are often more important than the information content of the language we see as our first target. Indeed the latter concept makes little sense without bringing in the former. Of course, the language games approach of Wittgenstein made this obvious a long time ago - in a way. And others, like Morris with semiotic, and Whorf with linguistics, have stressed it in their way, as the speech act group have in theirs. The real question is whether one can convert the essential insight into new ways of doing analysis that work better by consumer standards. This only shows up in the detailed analysis of specific concepts, in logic and in other areas we subject to logical analysis. Enough of this has been done in analyzing definitions, explanations and laws, to make the claim of progress testable; but more needs to be done at the level of restructuring elementary reasoning textbooks. Perhaps the most important applications of probative logic to the domain of the imprecise involve the 'logic of possibility' and the 'logic of approximation'; important partly because of the amount of work that has been done (the use of ideal types, stereotypes, and Zadeh's 'fuzzy logic' are special cases of the logic of approximation), and partly because of the significance of the applications of this work. These new 'soft logics', including the 'logic of weak generalization', revolutionize the simplistic post-positivist account of the physical sciences as the domain of precise and exceptionless empirical generalizations.
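For readers unfamiliar with Zadeh's proposal, the flavor of such a 'soft logic' can be conveyed in a few lines. In the sketch below (Python; the graded predicate 'tall' and its thresholds are invented purely for illustration) truth comes in degrees between 0 and 1, with the usual min/max connectives:

```python
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

def tall(height_cm):
    """A graded predicate: degree of 'tall' rises linearly from 150cm to 190cm."""
    return min(1.0, max(0.0, (height_cm - 150) / 40))

print(tall(170))                                  # 0.5: borderline tall
print(fuzzy_and(tall(170), tall(185)))            # 0.5: as true as the weakest conjunct
print(fuzzy_or(tall(170), fuzzy_not(tall(170))))  # 0.5: excluded middle fails
```

The last line is the point of contact with the text: classical tautologies like p-or-not-p lose their guaranteed truth, which is precisely the sense in which such logics are approximations rather than broken versions of the classical calculus.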
In the first place, these extensions made it possible to correct a completely distorted picture of the logic of the physical sciences, by stressing that most physical/biological laws are: (i) only approximate; (ii) applicable only to most cases; (iii) quite often not precise; and (iv) quite often tautological or very close to it. The last charge was peculiarly painful to the neo-positivists, because on their view tautologies could not be informative about the world (they were only informative about linguistic rules, which were essentially arbitrary). This naturally entailed - given their view of mathematics as a system of tautologies - that mathematics could not be informative about the world. A more sensible view is that mathematics is, like physics, very largely a system of propositions referring to idealizations about the world, which are useful approximations to reality and hence informative about it to the extent that the approximations are close. These propositions are not pure tautologies, though often quite close; so the Parallel Postulate can turn out to be false. Even if they are tautologies, as some of them are, they can be informative about the world because they connect concepts which are related to the world, in ways that we had not previously recognized (note the essential reference to our state of knowledge in any sensible account of what is to count as 'informative'). So the Four Color Theorem really is informative - as anyone not corrupted by bad epistemology would say - about the number of colors you need to make maps, and Pythagoras' Theorem really is informative about surveyed triangles etc., just as Hooke's Law or Boyle's Law is informative about everyday elastic substances or gases.


The lessons for logic are many and imply a rejection of many sections in current IL texts. For example: (i) it is not appropriate to dismiss premises or conclusions that turn out to be implicitly definitional as 'lacking in empirical content'; (ii) it is completely wrong to think that the falsification process idolized by Popper is a definitive test of empirical content - since it would reject most named scientific laws (in fact, it is logically incapable of coping with informal approximations, i.e. most approximations that have any scientific currency and utility); (iii) if 'inductive arguments' are treated on the model of 'inference to the best explanation', the account of 'explanation' cannot be formalist or the whole analysis will lose touch with reality; (iv) the accounts of 'definition' need to avoid the suggestions that definitions in the sciences and the law - except of recently coined neologisms - are either arbitrary or tautological, and that they cannot entail empirically informative consequences; (v) one reason why tautologies and truisms can serve as valuable parts of explanations and proofs is their role in organizing knowledge rather than extending it. But there is little discussion in IL of the information-organizing function, despite its obvious importance in classification, which has long been on the logic turf. It is of course acknowledged in Speech, Communications and Rhetoric, not to mention case-preparation tradecraft amongst trial lawyers; it is the key to information storage technology and hence significant for expert system design; it is now receiving a great deal of attention as we look more carefully at computer graphics for information presentation; it is a most important dimension of advertising and propaganda, which we try to analyze in IL classes and papers; it is an important and little-discussed aspect of the doctrine that 'the medium is the message'.
Logic should pay more attention to it, not just for the 'applied' reasons just mentioned, but also because the role and value of axiomatic systems, the notion of elegance in proofs, and the use of Venn and tree diagrams would be likely to benefit. Finally, (vi) any claims about the propositions of logic itself have to be looked at critically. Logic has plenty of empirical content in the only appropriate sense (the De Morgan Theorems are informative to most people who come across them for the first time, for example; and the Incompleteness Theorems tell us something about efforts at complete formalization that is just as important as the implications of the laws of thermodynamics are for efforts at perpetual motion). And yet it is still correct to say that most of logic's simpler propositions, such as modus ponens or modus tollens, are true because of the meaning of the terms in them. We have to rethink the usual assumption that these views are incompatible. Thus McPeck is right, albeit in a way he did not have in mind; we are teaching a subject with content. The second payoff from creating these new 'soft' logics was to show that the knowledge uncovered in the social sciences - and in history, the law, teaching, crafts, management, literature - has a perfectly respectable format and status, rather than the 'soiled dove' status accorded it by the neo-positivists with their concept of proper scientific knowledge as a simplistically conceived physics. And thirdly, this new status for the libelously labeled 'imprecise' and 'incomplete' makes the analysis of everyday argumentation much more accurate; much
commonsense knowledge that was dismissed before can now be recognized as highly informative. The neo-positivists made room for one kind of 'less than perfect' claim, the individual probability claim - though it was more than the positivists cared to concede - as they eventually did for statistical laws. But only quantitative probability was granted any status. In fact, quantitative probability applies to a very minor though delightfully manipulable part of probabilistic knowledge, and without a substantial training in handling qualitative probability, the scientist or citizen can hardly cope with common arguments or advice at all. When pressed, formalists usually act as if informal probability language is simply a crude estimate of quantitative probability. But it is generally accepted that a probability of 1 is not the same as certainty; that the crucial distinctions between logical possibility and empirical possibility and practical possibility are not distinctions on the numerical scale; that 'plausible' and 'prima facie likely' are not on the scale at all, though they have something to do with probability; and that any axioms of probability would have probabilities of their own which are presumably not covered by the axioms. And we find that in the practical world of expert system design, many systems run very well without any allowance for probability at all; others abandon the usual laws governing it and use entirely different ones. This is not a picture of the domain of a unidimensional variable, nor of Carnap's 'two senses of probability'. It is a picture of a complex theoretical concept that is connected to degrees of belief and to the axioms of the probability calculus and to relative frequency, but escapes being reduced to one or both or a disjunction of the three for the same reason that validity can't be reduced to provability, or number to the Peano postulates - it has a meaning that transcends any single level of language.
It can be as easily applied to metareasons as to reasons, to axiom sets as to propositions, to theories as to events or ensembles, to logics as to logical values - and it is the same concept in all these cases. It is perhaps a slightly more general notion than truth, and certainly no less elusive. Its uses require all the skill of the practical logician to explicate, and the idea that there might be a single explicit definition of it (or set of axioms implicitly defining it) is as naive as the idea that the same could be done for 'truth' or 'value' or 'existence'. Probative logic is committed to the idea that every use of such terms can be clarified, and every argument involving them assessed, without there being the slightest possibility of general definitions - or disjunctive sets of them. This is not some peak performance by probative logic, called up to deal with the most basic notions of epistemology; it is the standard assumption in dealing with concepts of great interest, from politics ('democracy') or ethics ('rights') or biology ('species'). The richer they are, the harder they fall to definition. Concepts that are hard to pin down are more likely to create logical problems and more likely to take all the skills of the practical logician. Korzybski, Hayakawa, and the rest of the General Semantics group were right to realize the importance of meaning clarification, but they omitted half the task. G.G. Simpson used to talk of the two extremes in approaching palaeontological taxonomy; the 'splitters' were inclined to create a new species
whenever they saw a new fossil, while the 'lumpers' would assign it to an existing taxon, no matter what. The problem with the General Semantics group was that they were congenital splitters who never saw the equal importance of 'lumping'. Carnap's approach to 'probability' was a simple splitting approach. He never saw that the crucial question was why the same term was used for both concepts. There will indeed be contexts where splitting is a good way to handle a problem; but these will be matched by those where it misses the point entirely. In any case, the logic of possibility is far more important in probative inference than the calculus of probability. For the logic of possibility drives eliminative inference, perhaps the most useful single pattern of reasoning in the applied sciences after modus ponens. The major premise in eliminative inference is a list of all possible factors of a particular kind, e.g. causes of death, outcomes of increasing tariffs, explanations of gold price changes, motives for murder, suspects in a robbery; the minor premises establish, often indirectly (e.g. by appeal to modus operandi), the impossibility that one of these factors could have been involved on this occasion. (Examples of this kind of inference have been worked out in some detail under the labels 'selection explanations' and the 'modus operandi method' by Scriven, 'inference to the best explanation' by Harman, and 'strong inference' by Platt.) While the inferences within an elimination schema are sometimes more or less deductive, the 'exhaustive list' major premise is often only more or less exhaustive and more or less precisely descriptive of the factors, so that a good deal of slippage is possible. And of course it is only more or less probable, which puts constraints on the probability of the conclusion quite apart from the other limitations.
Discovering or amending or improving an exhaustive list often has more practical importance for everyday investigations than discovering a new law; but it is only as useful as one's understanding of its informal parameters, its concepts, and its contextual limitations. This is the territory of probative logic, not of the probability calculus with its misleadingly exact figures resting in their turn on intuitive estimates of qualitative probability that are essentially incalculable. We should not leave this topic without saying something about the virtues of 'imprecision'. The various devices referred to by this epithet - approximation, idealization, weak generalization, context-dependent predicates, etc. - are the pneumatic tires of the scientific explorer's vehicle. Without their insulation from the shocks of reality, movement across rough country would be virtually impossible. This kind of insulation from physical or thermal shock is common in nature. In a related way, the iceberg survives because it can rise and fall under the changes of water level and give way to the thrust of gale winds. Oil rigs run to 800,000 tons today; but even that kind of inertia is not as good a defence as the small motions of the iceberg. The marine architect's favorite cause for reflection is the fact that a Coke bottle will survive the tempest that destroys oil rigs. Calculated but incalculable imprecision is a crucial device for scientific and commonsensical expression, and exploring it deserves ten times the attention given to universal quantification.
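The elimination schema described above can be laid out explicitly. In this sketch (Python; the case, the candidate list, and the findings are all invented for illustration) the 'major premise' is the candidate list and each 'minor premise' strikes candidates from it:

```python
# 'Major premise': a candidate list for the cause of a fictitious warehouse
# fire, assumed exhaustive - the weakest point of any eliminative inference.
candidates = {"accident", "burglary gone wrong", "insurance fraud", "arson by a rival"}

# 'Minor premises': findings paired with the candidates they rule out,
# often indirectly (e.g. by modus operandi considerations).
findings = [
    ("accelerant found at three separate points", {"accident"}),
    ("nothing of value was removed",              {"burglary gone wrong"}),
    ("the policy had lapsed the month before",    {"insurance fraud"}),
]

remaining = set(candidates)
for finding, ruled_out in findings:
    remaining -= ruled_out
    print(f"{finding} -> rules out {sorted(ruled_out)}")

# The conclusion is probative, not demonstrative: it inherits the slippage in
# both the exhaustiveness of the list and the force of each elimination.
print("Left standing:", sorted(remaining))  # ['arson by a rival']
```

Even this toy case shows where the schema's strength lives: add one candidate that the findings do not touch, and the conclusion evaporates, which is why amending the list matters more than refining the arithmetic.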

Probative Logic: Review & Preview


11. The Insecurity Difficulty

The most profound source of insecurity in argumentation occurs just off the screen, in the process of getting the picture set up; it is the encoding step, the conversion of experience into language. (But the decoding step that takes us from language back to significance for life is not of trivial significance.) The formalists are really only equipped to handle operations within language, and yet the results of any such operations absolutely depend on the accuracy or the probability of the first step into language. Eventually this makes nonsense out of the idea that a calculus of probability can encapsulate its meaning. There were some gestures in the direction of developing semantics, the discipline whose job was to bridge the gap between language and the world, but their significance can best be judged by the fact that it was thought to be profound to say that 'Snow is white' is true if and only if snow is white. Of course, this instantly converts semantics into the syntactics of quotations, which guarantees it will not handle the encoding problem. The formalists could never quite swallow the fact that the encoding step could be right or wrong, sound or unsound, even though it could not, by its nature, be formalized. It was a fatal mistake, and it locked logic into word games. Its significance as an error went far beyond the encoding problem; for example, it doomed the treatment of induction. Popper really didn't believe there was life beyond deduction; there were just guesses which were then given asymmetrical tests using deduction. He completely missed the point of scientific inference, but not because there is a logic of induction in his sense of 'logic', i.e. one with rules that can be applied deductively; on that issue, he was right and the Carnap camp was wrong. The point he missed was the real point; the issue between him and Carnap & Co. is a complete irrelevancy.
The real point is whether there is a learnable, trainable, correctable, refinable procedure of scientific inference that is demonstrably superior to and completely different from guessing. We all know that there is such a procedure, typically a species of probative inference in that the scientist can point to reasons for believing his or her conclusions, reasons that are clearly relevant and supportive although it is impossible to make the whole sequence of considerations into a demonstrative proof that a certain probability should be assigned to the conclusions. That is of the nature of 'inductive inference' and, more generally, of probative inference. To understand what is wrong with formal logic, and with quasi-formal approaches to informal logic, one must understand just how far the realm of reasons reaches beyond the realm of demonstrations. One must understand that everyone who has ever reasoned, from the dawn of time, has usually given as reasons - and had accepted, properly - considerations that are never going to be part of proofs, no matter how far the boundaries of science are extended, considerations whose soundness as reasons does not in any way depend upon the existence of other propositions. This is true for the same reason that it is true that most causal claims, in science or everyday life, are not dependent for their truth on the existence of exact laws from which they could be deduced. It is true for the same reason that analogies are illuminating even when no-one can list the exact similarities between the entities compared.


Michael Scriven

We should not conclude without a reminder of another vast area to which probative reasoning brings respectability - evaluative inference, in practical evaluation as well as in ethics. Insecurity about the conclusions of a particular type of inference deeply affected the formalists here. They did not want to see this kind of inference legitimated - whereas they were not opposed in principle to the legitimation of induction. But the right move in that case would have let in evaluation, and perhaps that was part of the reason why they spent their days trying to formalize induction by making it into a deduction of probabilities. As one reasons from facts about the appearance and performance of a racehorse, or a restaurant, or a rosebush, to a conclusion about its merit, one is not working from a suppressed premise which defines good racehorses, or a set of rules that allows one to deduce from the presence of certain indicators a certain probability that a particular racehorse is a good one. That may be what current expert systems do in the course of matching the inference; but that is not what the reasoner's mind is doing. It is working with the same kind of linguistic and intellectual skills that enable one to correctly describe the way to the nearest bus stop; the skills of language use and, in that case, an understanding of some specific spatial relations. In evaluation, the mind is combining its understanding of the function of evaluative language with the facts of the case in order to generate conclusions. When challenged about those conclusions, it will recall some of the most influential elements in the reasoning process. It cannot recall rules which it never knew and which, if adequate for deduction, would be too complex for comprehension. It is enough - as we know from much experience - that the indicators point a certain way, and that there are none so strong pointing the other way.
This is how prima facie reasoning goes on, the first step in probative inference, and on that we build further, where possible and necessary, to reach firm conclusions. Reasoning is a practical matter; it cannot wait upon the progress of science any more than accounting can attend the elimination of problems in the foundations of mathematics or speaking can attend the formalization of linguistics. Evaluative reasoning is not entirely independent of inductive reasoning, since a key part of induction is the evaluation of explanations and theories. This process is the equivalent in science of the 'deep criticism' phase of everyday critical thinking, the part where originality and courage, and profound philosophical deliberation, are sometimes required. The idea that this process could be formalized was bizarre. Far from involving the 'naturalistic fallacy', then, inference from facts to values is simply another kind of inference in much the way that inductive inference is another kind of inference. In fact, induction has much more in common with evaluation than it has with deduction, as both have with legal reasoning, and they all exemplify what may be thought of as a candidate for a new logical paradigm, probative inference.
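What an expert system 'matching the inference' might do, as distinct from what the text argues the reasoner's mind is actually doing, can be sketched as an indicator-balancing rule: conclude only when the indicators point one way and none points strongly the other. The indicator names, weights and the 'strong' threshold below are all invented:

```python
def prima_facie_verdict(indicators, strong=0.8):
    """Crude indicator balancing of the kind an expert system might use
    to match (not reproduce) prima facie evaluative inference.

    indicators: dict mapping indicator name -> signed weight, positive
    favouring a 'good' verdict, negative favouring 'bad'.
    A verdict is returned only when the balance points one way and no
    single counter-indicator is strong; otherwise judgement is withheld.
    """
    balance = sum(indicators.values())
    if balance > 0 and min(indicators.values(), default=0.0) > -strong:
        return "good"
    if balance < 0 and max(indicators.values(), default=0.0) < strong:
        return "bad"
    return "undecided"   # prima facie case defeated or unsettled

# Hypothetical racehorse indicators (weights invented):
print(prima_facie_verdict({"gait": 0.5, "wins": 0.6, "age": -0.2}))
```

The 'undecided' branch marks where, as the text puts it, one must build further where possible and necessary before reaching a firm conclusion.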

Chapter 2

Logic to Some Purpose: Theses against the Deductive-Nomological Paradigm in the Science of Logic

E.M. Barth

To the memory of L. Susan Stebbing

1. Logic and the Ivory Tower

Let me start with a rhetorical question: Is it not sad that a hundred years after its justly acclaimed modernization, modern logic still has nothing to offer to human political life, and nothing to offer to students of political life? I take it to be a fact, and a recognized one, that logic, in its usual form, has still virtually nothing to offer in this direction and does not even try. I take it to be a recognized fact that the majority of people in this business - the business of academic logic - show no positive interest in changing this state of affairs and quite often even a strongly negative interest. This can be amply illustrated with facts from academic life, but I do not want to spend any time in doing that here. It will not be necessary either, since everyone here may be taken to be able to supply examples of this themselves. Instead I shall try to improve on the diagnosis of what is still wrong with modern logic from the point of view of its contribution to political life, and I shall try to point out and clarify another way of looking at logic. That other way of looking at it may be called a European outlook - for to my knowledge Americans and other non-Europeans have as yet not taken up this perspective. But first let us approach the task of giving a diagnosis of the obvious shortcomings of academic logic vis-à-vis political life (in the widest sense). Four different diagnoses are well-known; three of them are in my opinion mistaken, and none of them is really to the point. The first says that the official science of logic is a straight-jacket because it is two-valued and hence reflects an untenable dualistic ontology. That diagnosis has petered out, for obvious reasons (there are many non-two-valued systems of logic). The second is that 'formal logic' is static and does not admit of time. This, too, has dropped out as a possible diagnosis, for similarly obvious reasons.
The third is that as soon as academic logic is applied outside of mathematics it becomes irrelevant because it is 'too formal', or 'merely formal'. One reason why this diagnosis is wrong is that it usually is based on a mix-up of formality and formalism, or formalism as an instrument and a point of view and formalism as a psychological drive. I shall not give strict definitions of either one. As to simple formality in the sense of attention to linguistic form rather than to empirical content, this is in itself not blameworthy at all - on the contrary; and no more so in the logic of political affairs than in the logic of other domains. The fourth diagnosis, that academic logicians are far too formalistic as persons to be of any serious use in seeing and dealing with the problems of political life is not wrong, but it does not touch at the foundations of the science of logic itself.

2. Components of the Contemporary Idea of Formal Logic. Diagnosis

If neither two-valuedness, nor timelessness, nor attention to linguistic form, nor even the unquestionably exaggerated and onesided formalistic drive among logicians is the locus of the uselessness of the science of logic for political purposes, then what is? My own diagnosis is this: it is partly the notion of deductivity and the deductivist program that has been associated with it. As long as logic is understood as a deductive science it can neither suggest new directions for its own further development nor interesting applications of the available theory. And partly, too, it is the received idea that with the exception of mathematics logic is the only science that has not and cannot have an empirical component. All other sciences have - there is empirical physics, biology, psychology and so on. But there is no empirical logic, no empirical study of human logics, in the plural. I shall not say anything more about empirical logic here than that it is - wrongly - supposed to be a contradiction in terms, probably as a consequence of the idea that logic is a deductive science (rather than, for instance, a hypothetico-deductive one). Often one speaks about the Nomological-Deductivist Paradigm or program, or the Deductivist-Nomological Paradigm or program - the DNP, for short. Where does the deductive-nomological philosophy come from, and in particular, where does the deductive-nomological program and style in logic come from? Answer: first, from theology, then from metaphysics, and then, but not independently of theology and metaphysics, from mathematics. When mathematicians took over the science of logic a hundred years ago it was a blessing, though one with a negative side to it, inasmuch as the deductive-nomological style was emphasized and carried to its extreme.
Mathematics has been characterized as resting on three basic ideas or cognitive atoms:1 the idea of algorithm, the idea of the infinite, and, last but not least, the idea of deductive method. Logic was to be the science that should serve mathematicians in the analysis of the strengths and weaknesses of deductive-nomological systems. Now logic can, in principle and in practice, serve to analyse the strengths and weaknesses of deductive-nomological theories without being itself a deductive-nomological theory (or set of theories). But it is still tacitly regarded as falling under the same methodological and stylistic claims as those deductive systems it is to help us to evaluate. Surely that requirement is a non sequitur, even if we were to agree that the evaluation of deductive theories is its only task (which we don't). More recently the whole conception of deductive-nomological science has been under fire from various quarters, including philosophers of physics as well as of language. From its name it is clear that the DNP combines at least two
ideas, both of which are conceived as norms. The first is that theories should display 'laws', and the second is that they should display the interrelationships among whatever laws they came up with in a deductive system. The first component of the paradigm, the nomological one, has received so much criticism even in logic that it has not loomed large in logic since the thirties. The second requirement, that whatever the theory consists of should be assembled into a 'deductive system', is justified by the false assumption that only in that way will the theory yield to logical scrutiny. When this is applied to logic itself the justification is seen to be circular and completely unconvincing. The only way in which one could justify the belief that logic is and should remain 'a deductive science' would be a clear statement of two things: first, that mathematics itself is absolutely basically and irrevocably a 'deductive' science, and, second, that logic is a part of mathematics. The latter statement is of course nothing but a dogma issued in a certain interest. There is therefore no contradiction implied in saying: For political purposes (in the widest sense), formal logic is essentially a set of sets of rules for the assignment of rights and duties in discussions that take place under conflicts of opinion, and in no way is it fundamentally a science of deduction, to say nothing of: 'a deductive science'. I think the American movement towards an 'informal' logic shows that what is usually taken to be the quintessential fault in the science of logic today when it is measured against political and other practical purposes, is its 'formality'. As I have already said, in my opinion this diagnosis is wrong. 
The primary obstacle is not its 'formality' in whichever sense of that word but the idea and practice of deductivity, the idea that logic itself is essentially tied to the deductivist conceptions of science and the DNP, even if other sciences are not, and that logic is therefore tied to the type of 'rationality' that is suggested by that paradigm. I also think that the field of argumentation theory should be given a firm footing in our culture, the firmest possible. Let me try to summarize what I have been trying to say so far as a number of recommendations, three in all:
- Do not take leave of established sciences entirely or too soon.
- If you do, be sure to do it for the right reasons.
- Beware of creating new isolated playing-grounds with their own esoteric languages (just to keep up with the Joneses); it is bad enough as it is.
Or, as William of Ockham might have said, Play-pens non sunt multiplicanda praeter necessitatem. Keep contact with the established sciences and, where that is necessary, transform them, if possible. I shall return to this in a moment. You will have surmised by now that I am recommending to erect the theory of human argumentation as immediately adjacent also - though not exclusively - to formal logic in its non-deductive, dialogue form. At the international conference on theories of argumentation in Groningen in 1978 a considerable part of the material had this orientation.2 Reasons for keeping this contact close on a theoretical level abound and have multiplied since 1978.3 So I would recommend that argumentation studies maintain the closest possible theoretical
contact with the investigation, whether empirical or normative, of the distribution of rights and duties in topic-oriented debate (by which I mean saklig debatt, Sachliche Diskussion); and that these rights and duties should be defined as functions of dialogical roles (or, dialectical roles). This set-up of formal logic is not called deductive logic and not inductive logic either but dialogue logic or dialogical logic (sometimes: dialectical logic). It might also be called: A formal logic that is capable of supporting, encouraging and of furnishing norms for political debate, in the widest sense; as well as of organizing the analysis of factual political debates without recourse to the deductivist style. I would like to commend this procedural-dialectical framework to you for its potential as a tool for:4
1. de-escalation in theatres of violence
2. empowerment and self-defense
3. analysis.
I believe its potential is unbeatable (so far) on all three of these scores.

3. The Deductive-Nomological Paradigm Again, and its Links with Epistemology

In order that it can become a tool in and for political life, logic must be freed from its theoretical ties with subjectivist and objectivist epistemology. The formality of modern quantificational logic is an important step away from the problems and perspectives of epistemology, and the formalist methodology in metalogic may seem to have completed the emancipation. I maintain that the emancipation is far from complete. Let me briefly sum up the components of that paradigm as it appears in logic itself. There it can be seen to have something like eight components. They are:
- nomological representation (less frequent than it used to be)
- formality (not harmful - on the contrary)
- formalization in the sense that requires a completely formalized language (this can be gotten around)
- the formalist drive (this is psychology)
- truth conservation
- deductivity
- the deductive-inductive distinction seen as exhaustive
- closure of the philosophy of logic to the set {objectivist i.e. truth-value oriented logic, subjectivist i.e. constructive logic}.
Of these assumptions all except the formality requirement are still tied to the notion of epistemology as the basis of everything - even the formalist desire or drive is. What one wants to reach are the Highest or Deepest Insights. But this has little or nothing to do with formality. So, again, as I see it, the counter-move of Informal Logic (even if it is seen not as a counter-move but merely as an addition to the logic that is called formal and deductive, and much as I rejoice in its coming into being) is not, I believe, optimally focussed. There is even a serious risk that informal arguments, whatever that may be, and informal analyses of arguments will retain in them and further encourage the idea that arguments basically are 'epistemic'
truth-oriented devices that are written, or could be, on the great objective epistemo-logical Screen on which, individually, we 'see' even social and political Nature as in a mirror.

4. The North-West European Scene

The American philosopher Richard Rorty has written a fabulously good and justly influential book, Philosophy and the Mirror of Nature.5 In that book the author rightly combats the common assumption that there can be any one basic or 'first' philosophy, of some kind or other, and in particular that epistemology can be such a first philosophy. At the end of the book he says:

Perhaps a new form of systematic philosophy will be found which has nothing whatever to do with epistemology but which nevertheless makes normal philosophical inquiry possible... The only point on which I (R.R.) would insist is, that philosophers' moral concern should be with continuing the conversation of the West.

I would like to add, on my own behalf, - and with improving on the conversation with the East, by suitable means.

And I would like to think that this, to keep up and to improve on the conversation world-wide, is the ideal that motivates the majority of those present here. One may therefore regret that Rorty closes the book - for which I have the highest regard possible - without having so much as mentioned the dialogical thinkers Martin Buber, M.M. Bakhtin and Emmanuel Levinas. However, in connection with argumentation in the sense of critical discussion I may be allowed to express particular regret that he did not discuss the work of certain thinkers who, seen from the North-West European point of view, immediately present themselves to the mind: that of Arne Naess and the Norwegian school, that of Chaim Perelman and the Belgian school, and that of Paul Lorenzen and the German school, which was not without connections with the logical work of the Dutchman Evert Willem Beth. What is more, both of them, Beth and Lorenzen, were inspired by Alfred Tarski's work in two-valued truth-falsity semantics. 6

5. Dialectical Fields

Suppose this much is agreed: for the purpose of political application in the widest sense, logic, formal or otherwise, should be understood as the theory of rights and duties in topic-oriented debate. When that has been agreed upon, then ipso facto we have acquired the freedom to contemplate the possibility of widening the definition of logic as given, so as to introduce into logic itself the investigation of the conditions for arranging and of entering upon debates of those types that logicians will study (according to our first, narrower definition
of logic). Obviously, not all relevant features of that problem should be subsumed under logic, but I think some of them should; in particular one should study influential opinions about rights and duties as functions of logical roles (or, dialectical roles). It is certainly not my intention to subsume under this widened definition of logic a lot of subject matter that was formerly classified under the heading of 'epistemology', or under psychology. On the contrary, it is my intention to suggest a couple of points that may help us replace the old notion of epistemology as the study of subject and object and their relations, which as such falls in its entirety under the traditional non-dialogical type of philosophy, the type of philosophy that we may call the social-solipsist type. I shall refer to the social-solipsist type of philosophy as Phase-I type philosophy. Phase-II type philosophy is then the philosophy of what Arne Naess calls 'the Arena'. Epistemology falls, in its entirety, under Phase-I philosophy. The distinction itself is not an epistemological distinction but a distinction in metaphilosophy, drawn by a non-believer in epistemology (myself).7 In particular I have the following in mind. Consider a certain company of users of language who are tied together as a socio-intellectual unit, a significant class, by a certain library in the sense of a set of books and papers, more generally: opinions, that more or less deeply determines their attitude toward critical discussion. These people may be thought of as placed within a certain dialectical field whose particular features are determined by that library as well as by new persons entering into the field. Theoretically we may speak of the Library (with a capital L) of the company involved. In one company the Library will be represented by the works of Engels, Lenin and Marx, in another by those of Thomas Aquinas, and so on.
A dialectical field is therefore a milieu of users of language who are subjected to norms that may influence in some way or other the frequency and the efficacy of discussion among them. This concept of a dialectical field is meant as an analogon of the concept of a physical field, hence a dialectical field is a field of force. It is generated by the language users in it and the texts that unite them. A language user who is introduced into the field at a certain 'point' (j) of the field8 is thereby exposed to a number of categorially different field forces (Äij). They may or may not have one resultant field force - this question need not detain us. The language user in question who enters the field carries with him a certain psychological inertia or 'mass', say m. The set of field forces encountered at the point where the person has been introduced is processed by that person either as an inviting or as an excluding force (F). This inviting or including force is clearly a function both of the psychical 'mass', m, and of the field forces. To describe that function in general terms is a task for empirical psychology, not for us. Our task is to describe the field, and the speech acts it is likely to induce in persons of an average description. If only one type of dialectical field were empirically possible, then this conception would probably be of little or no value. However, it is possible to distinguish quite objectively, by direct reference and even quotations from literary sources, between various types of dialectical field, basing the definition on clear pronouncements-in-context that have been made in writing by influential thinkers.


The field conception, which was suggested for group psychology by Kurt Lewin, now becomes an explanatory instrument, as well as a predictive instrument, in argumentational studies. I shall describe one such field, by listing its determining tenets. These are tenets that were made public by the Dutch mathematician Luitzen Brouwer, or Bertus Brouwer, as he was called (L.E.J. Brouwer), the central name (if any) of so-called intuitionistic mathematics (subjectivist mathematics). Brouwer was a professed enemy of logic, that is to say: of formal logic. His philosophical source of inspiration was German idealism. On the basis of a book9 published by Brouwer in 1905, some two years before his world famous doctoral dissertation, we can describe a deontic dialectical Brouwer field as follows:

DEF. 1. A (deontic) dialectical Brouwer field is a dialectical field in which
(1) the use of language in pursuit of agreement through argument/reasoning is discouraged (38, 71);
(2) intersubjective understanding is considered as requiring a type of enforced drilling ('dressuur') that is to be regarded as a negative phenomenon (42);
(3) the parties in a controversy are not in the possession of positive and agreed rights and obligations, linguistic or other, and this is regarded as a desirable state of affairs (97);
(4) opinions and products of individuals belonging to certain classes of the population are to be excluded from serious philosophical or scientific consideration (passim);
(5) logic is of no avail anywhere, a fortiori it is of no avail in conflict resolution (27);
(6) real truth is self-evident ('van zelf sprekend'; 16, 72).

The book from which this is taken is not found in his Collected Works except for a few innocent stanzas, and there is as yet no full translation. Since Brouwer - who took the most rigid anti-logical position of all mathematicians - has been excessively influential in determining what is to be accepted as logic, formal or otherwise, in Dutch universities, I may be allowed to dwell a moment on his thinking concerning discussive argumentation. Observe that on the strength of what I have told you about his opinions concerning discussive argumentation we can derive the following THEOREM: Verbally and otherwise, anything goes (by DEF. 1, clause (3) - there are and should be no rights or duties, verbal or other). The book by Brouwer in question - Life, Art, and Mysticism - contains a very great number of demonstrations of his own applications of this theorem, which he did not formulate explicitly, as I have done. You may want to know that
nothing in this mathematician's later publications (or private life, were that to count) contradicts or otherwise goes against the contents of these references and quotations. It is important to stress this point because his famous withdrawal from two-valuedness in mathematics has brought some mathematicians to try to see him as a pragmatical thinker and to recommend him as such, rather than as a solipsistic constructionist whose constructions were in, of, and for his own mind. Suppose we eliminate from the clauses of this deontic definition all reference to what should or should not be the case. Then we arrive at a definition of a weaker kind of dialectical field, which, because of its neglect of the argumentational importance of dialogical roles, duties and rights, still deserves to be called a Brouwer field, though a weakened one:

DEF. 2. A weakened dialectical Brouwer field is a dialectical field in which
(1) the use of language in the pursuit of agreement (conflict resolution) is not encouraged;
(2) the parties in a controversy are not in the possession of positive and agreed rights and obligations;
(3) the normative clauses of the definition of a deontic Brouwer field are not satisfied. That is to say, there are no tenets in function to the end that there should not be any positive and agreed rights and obligations in verbal interchange - the whole question is left open, or rather: bypassed.

I have added this second definition, of a seemingly less noxious type of dialectical field (which may be studied in many milieus and countries all over the world) in order to be able to point out to you that our theorem still holds - by the clause that there are no positive and agreed rights and obligations for conflict resolution: Verbally and otherwise, anything goes.

6. Transforming Dialectical Fields

Dialectical Brouwer fields - of either type - cannot be eliminated simply by recommendations or attempts to superimpose a more inviting and including type of field. A long-term method is, I think, available, one which ought to result in the elimination of these tenets and of the very real and tangible fields that they determine. This is the time-consuming and energy-consuming method of systematic and detailed pragmatization, that is to say: of carrying out methodological and scientific transformations, for which I should like to introduce the name Naess transformations, on all philosophical systems and on all scientific theories. Let me try to clarify by giving a definition again - a stipulative one, of course. DEF. 3. A dialectical Naess transformation is a reinterpretation followed by re-styling and reformulation in more pragmatical and discussive terms, with
explicit reference to observers' or debaters' positions or 'roles':
- of epistemological and methodological tenets and norms (that will no longer be of an 'epistemological' nature when the transformation is completed);
- of all ontological conceptions, assumptions and tenets;
- and of complete scientific theories.
This definition is formulated by me, in the heyday of neo-pragmatism, but it is inspired by an article that was written by Arne Naess at the age of 25, in 1937, when no one would listen to pragmaticist philosophies about the importance of relating the philosophy of science to the needs of the Arena. He added a second part to this paper in 1956. The paper10 has never been properly published - I think we shall have to issue it world-wide from Holland. It is written in German, and carries a title that translates as: How to Advance the Empirical Movement Today - i.e., 1937. It is no less valid today, half a century later. I wish that someone among you with the right linguistic background would volunteer to undertake the task of translating this fascinating paper from German into English. I shall now discuss one such 'Naess transformation', the transformation of deductive logic into dialogue logic.
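The deductive set-up being transformed can itself be put in code. In Beth's semantic-tableau method, to test whether a conclusion C follows from given premisses one enters the premisses under 'True' and C under 'False' and checks whether every branch closes on a contradiction. The sketch below is my own minimal propositional version (the tuple syntax for formulas is invented for illustration; Barth's dialogical rules are not implemented):

```python
# Signed-formula tableau: a branch closes when some formula occurs
# signed both 'T' (true) and 'F' (false); C follows from the premisses
# iff every branch of the tableau for {T: premisses, F: C} closes.

def closes(branch):
    """branch: frozenset of (sign, formula) pairs, sign in {'T', 'F'}.
    Formulas are tuples: ('atom', name), ('not', f), ('and', f, g),
    ('or', f, g), ('imp', f, g). Returns True iff the branch closes."""
    for sign, f in branch:
        flipped = ('F' if sign == 'T' else 'T', f)
        if flipped in branch:
            return True                      # explicit contradiction
    for sign, f in branch:
        if f[0] == 'atom':
            continue                         # atoms cannot be expanded
        rest = branch - {(sign, f)}
        if f[0] == 'not':
            return closes(rest | {('F' if sign == 'T' else 'T', f[1])})
        a, b = f[1], f[2]
        # Non-branching rules: both components go on the same branch.
        if (sign, f[0]) == ('T', 'and'):
            return closes(rest | {('T', a), ('T', b)})
        if (sign, f[0]) == ('F', 'or'):
            return closes(rest | {('F', a), ('F', b)})
        if (sign, f[0]) == ('F', 'imp'):
            return closes(rest | {('T', a), ('F', b)})
        # Branching rules: the tableau splits; both halves must close.
        if (sign, f[0]) == ('F', 'and'):
            return closes(rest | {('F', a)}) and closes(rest | {('F', b)})
        if (sign, f[0]) == ('T', 'or'):
            return closes(rest | {('T', a)}) and closes(rest | {('T', b)})
        # remaining case: (sign, f[0]) == ('T', 'imp')
        return closes(rest | {('F', a)}) and closes(rest | {('T', b)})
    return False                             # only atoms, no contradiction

def follows(premisses, conclusion):
    """Premisses registered under 'True', the thesis C under 'False'."""
    return closes(frozenset([('T', p) for p in premisses]
                            + [('F', conclusion)]))

p, q = ('atom', 'p'), ('atom', 'q')
print(follows([('imp', p, q), p], q))   # modus ponens -> True
print(follows([('imp', p, q), q], p))   # affirming the consequent -> False
```

The point of registering the conclusion as 'False' is exactly the one discussed next: the tableau refutes the supposition that the premisses could be true while the thesis is false.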

7. Fitting Formal Logic for Use in the Arena: First Naess Transformation

Compare these two set-ups, each of which features the splitting up of 'deductive' arguments into two columns:

I. Lorenzen:

    Opponent(s)   | Proponent
    (Concessions) | T (Thesis)

II. Beth's semantic tableaux:

    True | False
         | C (=T)
Going from left to right, many people at first find the distribution of 'True' and 'False' over Lorenzen's headings 'Opponent' and 'Proponent' puzzling. Why should the Proponent's thesis be registered as false? This is even more puzzling (vis-à-vis Lorenzen's characterization of the two columns) in the light of the columns of Beth's 'Method of Deductive Tableaus' (III):

III  Premisses      | Conclusion C (=T)

42

Ε. Μ. Barth

which is closer than either of the first two to the traditional point of view, and which shows that the 'conclusion' of a 'deductive' argument corresponds to the Proponent's thesis and is put under 'False'. Here is a real puzzle for all those among us who have been more or less strongly imbued with the assumption that logic corresponds to the point of view of someone who is positively attuned to the conclusion, in the sense that he operates from the assumption of its acceptability, as well as from that of the premisses. Only those who have been trained in the philosophy of a rather strict logical formalism can know that this assumption is of no importance in the analysis of logical argument - but formalism is often pushed back, so that this knowledge has not really been assimilated. However, it is easy to get an 'intuitive' grasp of the connection between the two set-ups I and II without recourse to formalism, namely if one realizes that what Beth's semantic-tableau method depicts - for the first time in the history of modern logic - is the perspective of the lonely - monologizing - thinker who is preparing for the Arena:

(Future opponent:) 'I am not sure that C is true, though other people seem convinced that it is. My own assumptions so-and-so are certainly true, i.e. they are Agreed in the set {Myself}, and it is likely that in a future debate about C the same assumptions will be Agreed in the set {Myself, the Proponent} of such a discussion as well, so that I can go on labelling them as True. My problem now is: Can this opinion of mine, that C may be false, be refuted? That is to say: is it possible that I shall ever have to admit (that) C, unless I give up one of my assumptions? Let me put this up in a chart:

II'  Agreed in {Myself},  | Not-agreed in {Myself},
     i.e. True            | i.e. possibly False
     (assumptions)        | (opinion)

Notice that it is not possible to present this in the future Proponent's terms directly; one has to go via the standpoint of one's imagined opposition. The Proponent need not agree to the premisses or concessions, and in any case is less likely than the future Opponent to label the conclusion - his own thesis - as Not-agreed. Note that II' conceptually mediates between II and I. What Beth did, then, was to present logic in such a garb that it becomes clear that formal logic is basically Opposition-oriented and not Adherent-oriented, or Proponent-oriented. The axiomatic method gave - falsely, as we now see - the impression that logic is mainly Proponent-oriented:


IV   logical axioms, Agreed by an adherent (cf. Proponent)
     premisses, Agreed or at least assumed by the same
     'deduction' (by the same person), i.e. application of 2 or 3 simple rules assumed to be evidently valid (to everyone, including the deducing person)
     Conclusion, reached by its Proponent (with no preceding shift of logical role)

This image of logic as 'deductive' is the image of the metaphysicist who operates alone and is in need of no critical opposition, not even opposition coming from his own mind. However, seen from Lorenzen's two-dimensional space of dialectical roles (as I call them), what Beth does is to project Lorenzen's tableaus onto the inner mental screen of the Opponent. That inner 'screen' is a veridical screen. Hence the step from dialogue logic to truth-value logic is a reduction of dialectical roles to the veridical screen of the solipsistically functioning future opponent who is preparing for the Arena. Chronology testifies to its character as preparation: semantic-tableau logic, as formal but no longer 'deductive' logic, preceded formal dialogue logic. Before Beth, Gentzen had introduced the logical style that may be called 'a double focus' - that was not invented by Beth. Gentzen recommended that logic be studied as a 'development' (cf. 'deduction') starting from an ordered pair Prem/Concl, where Prem is the class of all premisses in the argument in question and Concl is the desired conclusion (his terminology was different from ours). Such an ordered pair he called a 'sequent'. Gentzen's double-focussed approach - his new style for formal logic - was certainly pivotal in the process that led, via Beth's representation of formal logic (formerly 'deductive') in semantic-tableau form, to Lorenzen's representation of the same in dialogical-tableau form. However, Gentzen's new representation of formal logic did not clearly point to what we may now, with hindsight, recognize as the representational activities of the future opponent who is preparing for the Arena.
Summing up: Guided by the methods developed by Gentzen and by Jaskowski, and inspired by Tarski's truth-value semantics, Beth became the first logician to represent monological 'deductive' formal logic as the mental or pencil-and-paper representations of an Opponent of the potential conclusion, when this person is left


to his or her own devices. This fact is made clear by the structural similarity of Beth's method and Lorenzen's method of dialogical tableaus. You may now want to say that instead of the usual headings of Lorenzen's two columns, Proponent and Opponent, one might as well use these:

I'   Agreed in {O}  | Not-agreed in {O}

However, 'Agreed in {O}' and 'Not-agreed in {O}' refer to the socially-solipsistic phase (and only to O's activities). They would lead us, as theoreticians, away from the task of giving the richest possible description of the interaction between the speech acts made by P and O - in accordance with their rights and obligations - when they operate in the Arena. Therefore the two columns should be headed by the terms 'Opponent' and 'Proponent', as Lorenzen has always done. (Or by 'Olga' and 'the Pope', as I like to call them.)
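The tableau machinery discussed in this section can be made concrete in a few lines of code. The sketch below is my own illustration, not anything from Beth or Lorenzen: following Beth's set-up, the premisses are entered under 'True' and the conclusion C under 'False', and the argument is valid just in case every branch of the resulting tableau closes on a contradiction.

```python
def closes(entries):
    """entries: list of (formula, sign) pairs; sign True = 'True' column.
    Formulas: ('var', name), ('not', f), ('and', f, g), ('or', f, g), ('imp', f, g).
    Returns True iff every branch of the tableau closes."""
    lits, todo = {}, []
    for f, sign in entries:
        if f[0] == 'var':
            if lits.get(f[1], sign) != sign:
                return True          # same letter in both columns: branch closes
            lits[f[1]] = sign
        else:
            todo.append((f, sign))
    if not todo:
        return False                 # fully decomposed open branch: counterexample
    (f, sign), rest = todo[0], todo[1:]
    keep = rest + [(('var', v), s) for v, s in lits.items()]
    if f[0] == 'not':
        return closes(keep + [(f[1], not sign)])
    if f[0] == 'and':                # T(A and B): both on one branch; F: split
        if sign:
            return closes(keep + [(f[1], True), (f[2], True)])
        return closes(keep + [(f[1], False)]) and closes(keep + [(f[2], False)])
    if f[0] == 'or':                 # dual to 'and'
        if sign:
            return closes(keep + [(f[1], True)]) and closes(keep + [(f[2], True)])
        return closes(keep + [(f[1], False), (f[2], False)])
    if f[0] == 'imp':                # T(A -> B): F(A) | T(B); F(A -> B): T(A), F(B)
        if sign:
            return closes(keep + [(f[1], False)]) and closes(keep + [(f[2], True)])
        return closes(keep + [(f[1], True), (f[2], False)])
    raise ValueError(f[0])

def valid(premisses, conclusion):
    """Beth's set-up: premisses under 'True', conclusion C (=T) under 'False'."""
    return closes([(p, True) for p in premisses] + [(conclusion, False)])

p, q = ('var', 'p'), ('var', 'q')
print(valid([('imp', p, q), p], q))   # modus ponens: True
print(valid([('imp', p, q), q], p))   # affirming the consequent: False
```

Relabelling the two columns 'Opponent' and 'Proponent' instead of 'True' and 'False' gives the dialogical reading of the same structure, which is just the structural similarity between Beth's and Lorenzen's methods.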

8. The Meaning-in-use of 'True', and Dialectical Model Theory: Second Example of Naess Transformation

We now have a definition of 'true' ready - in fact I have already assumed it:

    I declare p to be true iff p is agreed in {Myself}

or, as said by someone who is not identical with me:

    S takes p to be true iff p is agreed in S

This can be generalized to any number of users of language:

    For any natural number n: S1, S2, ..., and Sn all take p to be true iff p is agreed in {S1, ..., Sn}

I want to emphasize that in the Arena no other meaning can sensibly be given to statements of truth than this. In other words, in the Arena and in analyses of what happens in the Arena, 'true' and 'false' are superfluous. This should be brought to bear on logical model theory. In fact the transformation of referential and truth-value oriented model theory into a dialectical model theory is well under way. This dialectical model theory takes its concepts and terms directly from the Arena - and not from the social-solipsist phase of philosophy. 'Truth' and 'falsity' have dropped out. Semantics itself is pragmatized - what was earlier relegated to 'pragmatics' is now given pride of place in semantics itself.11 To this the study of dialectical fields should be added. In the process of philosophical emancipation from epistemology and other pretenders to the throne of prima philosophia, the transformation of deductive logic, whether of the objectivist brand or of the subjectivist intuitionist brand, into (formal and informal) dialogue logic is an evolution with a great potential in argumentation studies - not as yielding a new prima philosophia, for there


is none, but one whose end result should surely be given a central role, in a global philosophy with the purpose of 'keeping the conversation going'.
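The definition of 'true' as 'agreed in S' can be put into a small sketch. What follows is my own toy construction, not Barth's formalism; the debaters' names are the nicknames 'Olga' (for the Opponent) and 'the Pope' (for the Proponent) used in Section 7, and the concession sets are invented for illustration.

```python
class Debater:
    """A language user, identified with the set of statements he or she concedes."""
    def __init__(self, name, concessions=()):
        self.name = name
        self.concessions = set(concessions)

def agreed_in(p, s):
    """'S takes p to be true iff p is agreed in S': every member of the
    company s concedes p. No separate truth predicate is needed."""
    return all(p in member.concessions for member in s)

olga = Debater('Olga', concessions={'A1', 'A2'})           # future Opponent
pope = Debater('the Pope', concessions={'A1', 'A2', 'C'})  # Proponent

print(agreed_in('A1', {olga}))        # solipsistic phase: agreed in {Myself}
print(agreed_in('A1', {olga, pope}))  # in the Arena: agreed in {O, P}
print(agreed_in('C', {olga, pope}))   # the thesis C is not (yet) agreed
```

Note that 'true' never occurs in the model: a statement's status is always relativized to a company of debaters, which is the point of the transformation.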

Notes

1. E.W. Beth, 'Konstanten van het Wiskundig Denken' (Constants of Mathematical Thought), Mededelingen der Koninklijke Nederlandse Akademie van Wetenschappen, Afd. Letterkunde, Nieuwe Reeks, Deel 26 no. 7; p. 8.

2. E.M. Barth and J.L. Martens (eds.), Argumentation: Approaches to Theory Formation. Containing the Contributions to the Groningen Conference on the Theory of Argumentation, October 1978. John Benjamins, Amsterdam, 1982.

3. Here I am thinking of computer science, for instance of expert systems and of interface problems. There are people working on the utilization of the dialogue approach to formal logic in the computer sciences, notably J. Hoepelman and his group (Fraunhofer Institut, Stuttgart).

4. Here I am happy to be able to refer to concepts and terminology employed by other members of this congress. 'Empowerment': cf. Helen B. Warren, Argumentation as Empowerment, paper read at this congress. 'Self-defense': having emphasized the importance of logic as 'the noble art of (nonviolent) self-defense' in tuition, I am delighted to learn that Ralph Johnson and Anthony Blair have published a book with the title Logical Self-Defense (Toronto, etc., 1983).

5. Richard Rorty, Philosophy and the Mirror of Nature, Blackwell, Oxford, 1980.

6. Beth (1908-1964), whose earliest philosophical impressions derived from the Neo-Kantian school, met Tarski (1901-1983) in 1948. The importance of their contacts for his own subsequent work in logic and semantics has been pointed out in writing by Beth himself. The 'semantic-tableaux method' for two-valued truth-oriented logic would never have come into being were it not for the entirely new impulses his thought received from his acquaintance with Tarski. Brouwer, a direct heir of German idealism and some twenty years older than Tarski, never experienced that influence. (G. Mannoury, father of the Significs movement, managed to mollify Brouwer's idealism to some extent.)
It was a stunning surprise for me to learn from Professor Lorenz just yesterday (5.6.1986) that Lorenzen's dialogical logic, too, has Tarski's philosophy as a sine qua non. Lorenzen had opted for a 'constructive' - an operationalist - reading of Heyting's formalization of the logic implicit in Brouwer's 'intuitionist' mathematics. Tarski pointed out that mere operationalism will not suffice, that scientific thought and uses of language must recognize some kind of outer or second instance, and that the science of logic has to reflect this. Lorenzen's achievement is to have answered this challenge, without going back on his rejection of the tertium non datur concerning statements, facts or beliefs, by means of his dialogical definitions of the logical constants.

7. This has been expounded at some length in my paper: Two-Phase Epistemology and Models for Dialogue Logic, Philosophica 35 No. 1 (special issue on dialogue logics, ed. Jean Paul van Bendegem), 69-88. The word 'epistemology' was chosen in the title instead of 'philosophy' for reasons of euphony.

8. A 'point' or 'position' of (or in) the field is determined by a number of pragmatic co-ordinates - any ones you want. Time seems one of the most important. Situational features of all kinds can be taken as co-ordinates of the field in question.

9. L.E.J. (Bertus) Brouwer, Leven, Kunst en Mystiek (Life, Art, and Mysticism), J. Waltman Jr., Delft, 1905.

10. Arne Naess, Wie fördert man heute die empirische Bewegung? Eine Auseinandersetzung mit dem Empirismus von Otto Neurath und Rudolph Carnap, Oslo (Universitetsforlaget), 1956 (Filosofiske problemer 19). (Written 1937-1939. Appendices translated from the Norwegian and added in 1956.)

11. For main ideas and references see the paper mentioned in Note 2. For a detailed exposition of the elements of this transformation see pp. 234-316 of From Axiom to Dialogue (E.M. Barth and E.C.W. Krabbe, 1982; Walter de Gruyter, Berlin).
For a systematic philosophical motivation and the historical background see the first two chapters of that book.

Chapter 3

Logic Naturalized: Recovering A Tradition Ralph H. Johnson

1. Introduction

My paper is about logic, inference and argument, matters of interest mainly to logicians. In particular, my remarks are addressed to those who belong (however tenuously) to the tradition of Formal, Deductive Logic - a tradition I shall have sufficient occasion to refer to as to warrant a name of its own. When I looked at the abbreviation 'FDL', the name 'Fiddle' came to mind, and I shall so refer to the formal deductive tradition in logic, intending by this reference no disrespect - though my own reservations about this tradition will become clear at the end of my paper.1

In this tradition, which I have just baptized FDL, the following specimens get counted as arguments:

Boston is a city and Boston is in the United States. Therefore, Boston is in the United States. (Lambert and Ulrich, 11)

The sky is blue. Grass is green. Therefore, tigers are carnivorous. (Lambert and Ulrich, 19)

The purpose of my paper is threefold. First, I wish to challenge that conception of logic and argument at work in FDL. Second, I wish to speculate as to how our conceptions of logic, inference and argumentation underwent such radical transformation. Third, I want to suggest that logic must be naturalized. I have in mind a couple of things: first, that logic should take argumentation in natural language as its focal point; and second, that logic should be natural, both in its approach - it should as much as possible use natural language rather than artificial language - and in its purpose: it should help increase the capacity to reason better, which is what people naturally expect.2

I shall begin by introducing a pair of theses, the first of which is that inference is not argument.3 Yet the identification of argument with inference has become commonplace in logic textbooks, especially (but not exclusively) those which belong to the tradition of formal, deductive logic. My second point is related to the first. It is that logic ≠ deductive logic. A fortiori, logic ≠ FDL.


2. Argument is not Inference

The conception of argument found in most FDL textbooks will resemble one of two prototypes. Either argument is defined as a set of reasons leading to a conclusion, or else argument will be identified with inference, for all practical purposes. Let us take the first prong, evident in the example already cited from Lambert and Ulrich: 'Boston is a city and Boston is in the United States. Therefore Boston is in the United States.' 'Those who know' will recognize this as a substitution instance of a tautology (p.q ⊃ q) and hence as a valid inference. And yet to term this an argument is something that would probably not occur to anyone who had not been raised on FDL.

What then is wrong with taking argument and inference to be identical? If the practice of argumentation presupposes, as I believe it does, a claim which is contentious,4 then in order for this item of discourse to qualify as an argument, the claim that Boston is in the United States would have to be contentious. Is it? I don't think so, at least not where I come from. And if it were to become contentious, it could scarcely occur to a reasonable individual to seek to support this assertion in this way. Put differently, anyone who really does not believe that Boston is in the United States is not likely to be persuaded rationally to accept that claim by the Lambert-Ulrich specimen. The example above is no argument.

Perhaps we shall call it an inference. From the fact that two propositions are together true, it follows with necessity that one may affirm either one of them to be true. Thus one might find it appropriate to reason this way: 'If the couple was on the bus, then clearly the husband was on the bus.' Now inferences are very important, and certainly logic has something to do with inferences; that is part of its tradition going back to Aristotle and the Prior Analytics.
But let us not forget that Aristotle wrote no work entitled Logic; that the entire corpus of methodologically oriented works is known as The Organon; and that this corpus includes, besides the Prior Analytics (the ancestor of FDL), much else, including the Topics, Sophistical Refutations, etc. In these works it is argumentation5 rather than inference which is the focus. Thus, if we look to tradition, we shall have abundant evidence to support the view that logic is about inference, and also that logic is about argument.

Let us then look at the other prong of the FDL conception, according to which an argument is a set of reasons leading to a conclusion. What is wrong with this? A lot, for it is not sufficient for an argument that there be reasons leading to a conclusion.6 That which is argued about must be controversial, contentious, really in doubt; and for this to occur, there must be contrary views. One purpose of argument is to persuade the rational individual of the truth of the conclusion using evidence and reasoning only.7,8 Thus an argument must not only lay out a route to the conclusion; it must also in some fashion or another come to grips dialectically with its competitors by showing that its path is superior. To ignore other arguments by merely tabling a few reasons that support the conclusion is to fail to take seriously the ethos and the climate which gives argumentation its raison d'être and in which alone it can flourish.


Hence there is some reason to think that the FDL conception is inadequate, whether argument is identified with inference or conceived of as a set of reasons leading to a conclusion. This claim leads now to the next point.

3. Logic is not Equivalent to Formal, Deductive Logic (FDL)

There are many who still think so. (That is how they were trained.) Alvin Goldman is one: 'I shall concentrate on deductive logic. I shall accept the contention that deductive logic - formal logic - is a body of necessary truths' (1985: 41). In other words, in Goldman's view deductive logic just is formal logic. Only a few lines earlier Goldman has dispatched inductive logic: 'It is very controversial whether there is such a thing as inductive logic', and so, unless there is some other logic, the conclusion follows that logic is deductive logic. What about informal logic? In a footnote, Goldman states:

'The term 'logic' is also used to refer to a subject called 'informal logic' often taught to introductory logic students. Unfortunately there are no established truths of informal logic; indeed, it is quite unclear what the content of informal logic is, or should be. By contrast, formal logic has a well-defined content and set of truths.' (1985: 65)

Notice how logic has here been narrowed to include nothing more than FDL. Goldman is not alone in holding this view; he is simply more forthright and explicit than most. But once again, if we look back into its history, we will find that other generations have had a much broader conception of logic than that of a body of necessary truths. For example, the task of definition of terms has traditionally been taken to be part of the business of logic, but where is that on Goldman's account? A little over 100 years ago, Peirce (no mean logician in anyone's sense of the term!) could write: 'The very first lesson that we have a right to demand that logic should teach us is how to make our ideas clear' (1878: 64). Dewey wrote at great length about the logic of inquiry. Yet one suspects that neither of these would be mentioned in Goldman's canon.

How did logic get reconceptualized so as to refer uniquely to FDL? What has happened here is that one particular paradigm has seized the imagination of logicians and blinded them. This is the sort of thing I believe Wittgenstein had in mind when he wrote: 'The only way for us to guard our assertions against distortion... is to have a clear view in our reflections of what the ideal is, namely an object of comparison - a yardstick, as it were - instead of making a prejudice out of it to which everything has to conform. For this is what produces the dogmatism into which philosophy so easily degenerates' (1980: 26e).

By equating logic with FDL and by identifying inference with argument (or failing to note the distinction between them), FDL was able to present the ideal of logic as a body of necessary truths, the hardest form of science. Yet in doing this, logic was cut off from important parts of its historical development. Let me then proceed in the hope that these two points are acceptable, so


that then it may be profitable for us to think about this question: Where and when did this re-orientation occur?

4. Where did we go Wrong: Lessons from the Past?

Just as it is sometimes said that all philosophy is a footnote to Plato, it may be thought that all modern logic is a footnote to Frege. At any rate, all trails inevitably lead there, and so our search might well begin there also. Frege was not in the least interested in argumentation. What interested him was rigorous proof. Shocked by the shoddiness of much of the mathematical reasoning of his time, he set about to reform it. Thus mathematical logic developed under the guidance of Frege, Russell and Whitehead, whose project called for demonstrations of syntactic transformations within logistic systems, ultimately with a view to showing that mathematics was reducible to logic. They constructed artificial languages for this purpose. Let me make it clear that I regard these developments as most interesting and worthy of study. What I wish to question is the right of these initiatives to usurp the title 'logic'.

But where does this usurpation occur? It does not occur in Russell and Whitehead's Principia, which was a technical working out of the logicist program in the foundations of mathematics. Nor does it occur in Frege's Begriffsschrift. Then where? Where does the trail lead next? Where can we pick up the scent? My thought is that we should look to the textbooks, where students learned their logic lessons. To check this conjecture, I looked at some North American undergraduate logic textbooks from the late 40s and 50s. I began with Quine. Do you know what I found? The term 'argument' is conspicuous by its absence in Quine's two logic texts!! Actually, the term 'argument' makes one appearance: 'In the usual terminology, z'x is the value of the function z for the argument x...' (1940: 222). In this tradition of logic, emanating from Frege, Russell and Whitehead, the entire concept of argument as rational persuasion has evaporated. True, Quine would not claim to be analyzing arguments.
He does, however, claim to be doing logic - and that is, from the point of view of my paper today, the amazing finesse: logic without argument! Think about that!

In a different vein, I looked at Max Black's Critical Thinking (1946). Here we find a much different approach to logic, one in which, you will be happy to hear, the idea of argument is included. Alas, Black makes the mistake, not uncommon, of equating argument with inference. Thus he writes:

'An inference that purports to be conclusive is said to be deductive, and such an inference is known as a deduction... Our first task is to discover the kinds of things out of which a deductive argument is constructed. Let us consider a concrete example:

Whales are mammals.
All mammals suckle their young.
Therefore: Whales suckle their young.

This very simple example illustrates certain important features common to all argument.' (17)


The lessons Black draws from this example are the typical ones. An argument is discourse with a certain structural complexity, and at the core of argument lies the proposition. Thus begins the inevitable (and quite useless) excursus on the difference between sentences and propositions! Now notice, if you will, the lessons which have not been culled from this example (because they could not be). First, no reference is found to argument as an essay in rational persuasion. No mention is made of the purpose. Apparently, final causes may play no role in logic as a science, any more than in any other scientific endeavour. But here, as elsewhere, structure often follows function/purpose. That is to say, argumentation has the structure that it does because of its purpose. An argument seeks to bring about rational assent, and because the issue to be dealt with is contentious, controversial, it follows that one must provide reasons! Mere assertion can't get the job done rationally. Hence we find that the purpose of argument as rational persuasion dictates the infrastructure of argument: premises and conclusion. Yet this is just the ground floor. Precisely because the proposition is contentious, because there are contending parties who have taken a different view of things, it follows that the argumentation cannot be complete until and unless the argument addresses itself not only to the issue as represented in the conclusion but also to the other positions! (This is the insight which many of our confreres in the discipline of rhetoric take for granted!) In short, an argument cannot be conceived merely as a set of reasons leading to a conclusion; this may be the infrastructure, but it cannot be the full story. The arguer must also address himself to the opposing points of view and show why his is superior. To fail to do this is to fail to discharge a fundamental obligation of the arguer in the dialectical situation. I continued my search in Beardsley's Practical Logic.
Here is Beardsley on argument: it is 'discourse that contains at least two statements, one of which is asserted to be a reason for the other' (1950: 9). This is marginally better. Argument is treated as a form of discourse, and Beardsley's examples are infinitely closer to the real thing. Yet there is some indecision and confusion, as we later read: 'We can be sure that an argument is good if we can justify its conclusion according to a rule of inference' (16). Witness once again the easy passage between argument and inference. How about Beardsley on logic? He doesn't declare his position explicitly, but we can perhaps glean it from a passage like the following: 'When an argument is valid, its premises are said to 'imply' (or 'entail') the conclusion. The relationship of implication is a fundamental idea in deductive logic' (1950: 213). Not a smoking gun, but close to the idea that logic is bound up with implication and entailment. The easy shift between inference and argument is evident again a few lines later: 'When we are presented with a deductive argument and we want to know whether it is valid, we have to recognize its form' (213). So Beardsley has not avoided the confusions that have plagued others. Of course no survey of undergraduate logic texts would have been complete without a look at the classic - Copi's Introduction to Logic. In the 2nd edition, Copi says: 'The study of logic is the study of the methods and principles used


in distinguishing correct from incorrect reasoning' (1961: 3). You might wonder how we are going to get from there to the dogma whose development we have been tracing. Copi makes the missing link explicit: 'Reasoning is a special kind of thinking, in which inference takes place or in which conclusions are drawn from premises' (1961: 5). So logic is about reasoning, and reasoning is either inference or argument. But what is the distinction between argument and inference? 'Although the process of inference is not of interest to logicians, corresponding to every possible inference is an argument and it is with these arguments that logic is chiefly concerned. An argument, in this sense, is any group of propositions of which one is claimed to follow from the others, which are regarded as providing evidence for the truth of that one' (1961: 7).

Thus the difference between inference and argument is slight; inference is the process and argument the result, the product. From this early tradition of logic texts, we can begin to pick up the trail and see where the shifts began. (I realize that there is an ellipsis in my account: I have shown where the shift takes place but I have not said why.) What of the newer generation of logical textbooks? Have they a better account to offer?

5. The Concept of Argument in New Wave Texts

In our 1978 paper, Blair and I dubbed those textbooks 'new wave' which took seriously argumentation in ordinary language. The question I look to is whether that generation of textbooks has provided a more adequate conception of argument. I won't have time to do much more than sample a few leading texts. Let us then look first at the textbook which helped to launch the informal logic movement in North America: Howard Kahane's Logic and Contemporary Rhetoric. In the first edition, Kahane writes: 'Let's call uses of language or pictures intended to persuade anyone of anything an argument' (1971: 1). A somewhat different account is provided in the third edition: 'Reasoning can be cast into the form of arguments. An argument is just one or more sentences, called the premise of an argument, offered in support of another sentence, called the conclusion' (1980: 3). Much the same view is found in Stephen Thomas's textbook: 'As I shall use the term, an argument is a sentence or sequence of sentences containing statements some of which are set forth as supporting, making probable or explaining others. That is, an 'argument' is a discourse in which certain claims or alleged facts are given as justification or explanation for others' (1973: 2). Without wishing to deny that there may be important differences between the two, nevertheless from our vantage point they are alike in that the kernel of the idea (reasons supporting a conclusion) is all that is offered. No mention of content, of purpose, of structure resulting from purpose, of the need to engage dialectically with alternative positions. Fogelin (1st edition) does not fare any better. Though he does not give us an extreme closeup on the idea of argument, yet it is clear from his discussion


that he takes argument in the sense of reasons offered in support of a conclusion (1978: 35). Johnson and Blair also adopt this view (1977: 3-4). Thus informal logic textbooks offer the reader an anemic conception of argument, one which does not differ markedly from that which appears (when it does appear) in other standard introductory logic textbooks, such as Copi's, nor indeed from those in the FDL tradition.

The one exception to this pattern in the first wave of the 'new wave' texts is Scriven's. Although Scriven's conception of argument has to be culled from a careful reading of several chapters, and although its main lines seem to resemble those we have looked at, one can sense just below the surface that Scriven has come closer than any of the other new wave texts to breaking free of the constraints of the FDL tradition. Thus Scriven locates his discussion of argument within a broader context of reasoning; he distinguishes between argument and inference, even though the distinction is not fully satisfying; and he makes implicit reference to the function/purpose of argument. Moreover, the sixth of his seven steps requires that one evaluate alternative arguments, and this comes very close to our idea that the very essence of an argument requires that one should do this (1976: 54 ff.). All in all, one would have to say that all of the elements of a fully developed conception of argument are here; they simply needed to be developed more. Scriven excepted, it seems that on the whole informal logicians have not yet been converted to a more robust conception of argument. For nurture, they might then find it profitable to look outside their own garden walls.

6. What We Can Learn from Others: The View from Afar. Those who have worked in this new tradition of logic may discover, as I did, some evidence of the force of such blinders by consulting a different tradition, for example Speech Acts in Argumentative Discussions by van Eemeren and Grootendorst. In their first chapter, they lay out their conception of argumentation under four headings. The first feature in their account is the externalization of argumentation, by which they mean 'the verbal communication of the subject to be investigated' (1984: 6). The argumentation theorist must therefore concern himself with the opinions and statements as expressed rather than with the thoughts, ideas, and motives which may underlie them. The second feature they mention is the functionalization of argumentation, 'the treatment of the subject of investigation as a purposive activity' (1984: 7). The third feature in their account is the socialization of argumentation, by which they mean treating the subject of investigation communicatively and interactionally (1984: 9). Argumentation is an attempt to persuade a rational judge of the rightness of a particular claim, and this involves dialogue and exchange. Thus it is that the idea of dialogue emerges as central in their account. Lastly, they stress what they call the dialectification of argumentation: whether attempting to defend an opinion or to criticize someone else, the language user


addresses another language user who is supposed to adopt the position of a rational judge, who reacts to argumentation critically, so that a critical discussion ensues. When I first read their monograph, I had two simultaneous reactions. The first reaction was: 'How perfectly right they are.' This was followed by a wave of despair, for it seemed to me that they had said about the concept of argumentation just about all that I wanted to say. Which left me with the uncomfortable question: What do you say after someone else has already said it? My dilemma was resolved and my despair eased when I reminded myself that van Eemeren and Grootendorst have had the good fortune of not having been educated in the narrow tradition of logic that has dominated tuition in North America. That tradition, stemming from Frege and then Whitehead and Russell, has many strengths; but the time has come to have a close look at the other side of the ledger, so as to see clearly not only what was gained but also what was lost.

7. Conclusion. I have dwelt on the conception of argument in both the older and the more recent logical traditions and found very little difference between them (when, that is, the older tradition had a conception of argument in sight at all). I have also attempted to delineate a more adequate conception of argument, one which is familiar to other disciplines such as rhetoric and speech communication. Now someone is sure to confront me with James's dictum: 'A difference which makes no difference is no difference.' Behind this dictum lies the following challenge, to which I owe some response: 'What difference does it make in actual practice whether one adopts the older desiccated notion of argument (premises and conclusions) or the more robust one you are defending?' I do not yet have a satisfactory answer to the question, but let me make three brief observations by way of conclusion. First, it seems to me important to get it right! I am unhappy with the accounts given by logicians and I have tried to make plain the basis of my objection. Second, I wonder about the wisdom of giving students an undernourished conception of argument in these times, when the practice of argumentation is very much threatened by powerful cultural forces, such as television. Third, in following the moves of Frege and company, logic allied itself with science. The alliance has been important but costly, here as elsewhere in philosophy. And just as the alliance is being challenged elsewhere (by MacIntyre in ethics, by Feyerabend and Kuhn in philosophy of science, etc.), so the challenge should be extended here. Logic took a bad turn when it allied itself with science (especially with mathematics) and divorced itself from the humanities. Argument construed merely as reasons to a conclusion without clear reference to the demands of dialectical interchange ceases to be the powerful instrument it might otherwise be. My contention is that the naturalization of logic is the next important task


confronting us. Central to this development will be the reconceptualization of argument so that its dialectical nature is fully appreciated. In this process, logicians have something to learn from other disciplines, among them rhetoric and speech communication. It has been my purpose here to make the need for that transition more evident and its importance intelligible.

Notes
* Since delivering this paper at the Conference, my thinking has benefitted from discussions and correspondence with the following people: Timo Airaksinen, E.M. Barth, Anthony Blair, Trudy Govier, Erik C.W. Krabbe, Richard Paul, Perry Weddle and John Woods. I am grateful to all of them for their comments and criticisms. I wish that this revised version could have shown more clearly than it does how much I have learned from their patient instruction.
1. I am grateful to Erik C.W. Krabbe for reminding me that the crucial terms here, 'formal', 'deductive' and 'logic', have all undergone significant shifts in meaning during their history.
2. My views on these matters may remind some here of those taken by Prof. Barth in her article 'A New Field: Empirical Logic/Bioprograms, Logemes and Logics as Institutions' in Synthese (63, 1985), 376-88. She writes: 'Dialogue logic embedded in a wider theory of argumentation ought, in due time, to bring about an improvement of the systems of logic that actually dominate the modes of reasoning and argumentation in human affairs. For this purpose we shall have to develop a techne that has more to offer than applications of mathematical logic to Mary and Bill.' (377) I must admit that my heart went boom when I read this. And even though I am not yet persuaded to follow Prof. Barth in her proposals regarding dialogical logic, I believe her work should be read carefully by all who are concerned for the future of logic.
3. I have been persuaded by discussions and correspondence with both Trudy Govier and John Woods that it is implication rather than inference which is the subject of formal, deductive logic. But I believe my point is easily enough recast, and I will ask the reader to bear this change in mind as the paper is read.
4. Trudy Govier points out, and rightly, that I have not argued for this claim and that some argumentation really should be provided (I suppose because my claim is controversial!).
For one possible way of dealing with this point, cf. 7 below.
5. I have not made a great effort to distinguish between argumentation - a social practice whose history can be traced back to the Greeks - and argument - an episode or specific event located in that practice. The two terms have thus been used interchangeably.
6. Richard Paul and Tony Blair have both urged me to weaken this claim: instead of attempting to redefine 'argument', they believe I should rather take the position that what I am speaking about is one very important and often overlooked sense of the term 'argument'. On reflection, I'm inclined to agree.
7. In the earlier version, I had written 'the' instead of 'one'. But I am now persuaded that this is too strong. Argumentation can also be used to inquire into the truth of something, as my colleague Anthony Blair has brought home to me. What I think I now want to claim is that when argument is used persuasively, the conclusion will be controversial; and for it to be used properly in that context, argument must include some reference to other views or positions.
8. Timo Airaksinen questions the use of 'only' here.


References
Barth, E.M. (1985). 'A New Field: Empirical Logic/Bioprograms, Logemes and Logics as Institutions.' Synthese, 63: 376-88.
Beardsley, Monroe. (1950). Practical logic. Englewood Cliffs, N.J.: Prentice-Hall.
Black, Max. (1946; 1952, 2nd ed.). Critical thinking. Englewood Cliffs, N.J.: Prentice-Hall.
Copi, Irving. (1961, 2nd ed.). Introduction to logic. New York: Macmillan.
Eemeren, Frans H. van and Grootendorst, Rob. (1984). Speech acts in argumentative discussions. Dordrecht: Foris Publications.
Fogelin, Robert J. (1978). Understanding arguments. New York: Harcourt Brace Jovanovich.
Goldman, Alvin. (1985). 'The Relation between Epistemology and Psychology.' Synthese, 64, No. 1, July.
Johnson, Ralph H. and Blair, J. Anthony. (1977; 1983, 2nd ed.). Logical self-defense. Toronto: McGraw-Hill Ryerson.
Johnson, Ralph H. and Blair, J. Anthony. (1980). 'The Recent Development of Informal Logic.' In: Blair, J. Anthony and Johnson, Ralph H., eds.: Informal Logic: The First International Symposium. Inverness, CA: Edgepress.
Kahane, Howard. (1971; 1980, 3rd ed.). Logic and contemporary rhetoric. Belmont, CA: Wadsworth.
Lambert, Karel and Ulrich, William. (1980). The nature of argument. New York: Macmillan.
Peirce, Charles S. (1878). 'How to make our ideas clear.' In: Paul Kurtz, ed., American Philosophy in the Twentieth Century. New York: Macmillan.
Quine, Willard Van Orman. (1940). Mathematical logic. New York: Harper Torchbook.
Quine, Willard Van Orman. (1950). Methods of logic. London: Routledge and Kegan Paul.
Scriven, Michael. (1976). Reasoning. New York: McGraw-Hill.
Thomas, Stephen N. (1973). Practical reasoning in natural language. Englewood Cliffs, N.J.: Prentice-Hall.
Wittgenstein, Ludwig. (1980). Culture and value. Oxford: Basil Blackwell.

Chapter 4

Beyond Induction and Deduction Trudy Govier

The issues a theory of argument must treat are not only logical, but epistemic and psycho-social as well. However, among logicians and philosophers the predominant topic studied has been that of inference. How is the inferential link between the premises and conclusion of an argument to be appraised? Various theories have been proposed. An especially prominent one holds that inferences fall into two basic types: deductive and inductive. Corresponding to this judgement, it is usual to divide arguments into two types as well. We can call such theories of argument positivist theories. The label is appropriate because of its relation to the positivist theory of knowledge, according to which there are two and only two sources of genuine knowledge - the formal sciences and the empirical sciences. Deductive arguments are to be found within mathematics and logic, and inductive ones in the empirical sciences. In recent years the positivist theory of knowledge has been discredited. Positivist theories of argument merit similar scepticism. It is well known that many naturally occurring arguments are not easily classified as inductive or deductive. In addition, there are many problems about just how the inductive-deductive distinction is to be defined. In this paper I make some brief preliminary comments on these matters and then go on to describe two types of argument which are neither deductive nor, in any significant sense, inductive.

1. Reflections on Positivist Theories of Argument. Though positivist theories of argument have a venerable history, going back through Hume to Aristotle, they are not satisfactory theories. The only way to get an exhaustive dichotomy between deductive and inductive arguments (or between inductive and deductive standards of inference, for that matter) is to define 'inductive' as 'nondeductive'. This makes the inductive category a kind of grab bag. There is no assurance that inductive arguments, or inductive standards, have anything in common. When this fact is conjoined with the difficulty of applying the distinction 'deductive-inductive' to naturalistic examples, we have a basis for questioning the dichotomy, insofar as it purports to be exhaustive and exclusive. The strong philosophical attachment to the positivist theory of argument is probably due to two beliefs, both prominent in western intellectual traditions. The first maintains that there are two and only two broad types of argument:


the deductive ones, which are rationally conclusive, and the inductive ones, not conclusive but essential in the practical conduct of life. The second tradition links nonconclusive arguments, the inductive ones, with empirical science and the confirmation of empirical hypotheses. When we rely on these traditions, either explicitly or implicitly, the result is that we are led to ignore many arguments which are nonconclusive and yet deal with nonempirical issues. This is the fundamental objection to positivist theories of argument. Some arguments should be inductive, since they are nondeductive - but they should not be inductive, since they do not fit our models for reasoning in empirical science. A positivist theory will rule them out, and this is its real danger. We set up a dichotomy which will be exhaustive by definition, but we then narrow 'inductive' by concentrating too much on empirical science. When we look at inductive logic, it deals with sampling, the generation of empirical hypotheses, causal inference and, sometimes, inferences to empirical explanations. Types of argument common in law, ethics, aesthetics, literary interpretation, and philosophy itself are not included. The positivist classification of arguments is naturalistically implausible and misleads us. This theme is similar to that emphasized by Chaim Perelman and L. Olbrechts-Tyteca who, in their massive book The New Rhetoric and many other works, offer many illustrations of such arguments and castigate logicians and philosophers for having been so mesmerized by traditional categories as to ignore them. Such omissions not only result in false simplicity in classificatory categories, but generate falsely founded problems of justification in moral philosophy and elsewhere. In this paper I shall discuss two important types of argument which are neither deductive nor, in any interesting sense, inductive.
As will be apparent, my work owes much to others, especially to the English philosopher John Wisdom and the American philosopher Carl Wellman.

2. Case-by-Case Reasoning. In a set of lectures entitled 'Explanation and Proof' (commonly referred to as the Virginia Lectures and as yet unpublished), John Wisdom argued that the most basic type of reasoning is neither inductive nor deductive. It is case-by-case reasoning. This reasoning is used, either explicitly or implicitly, in order to show that a word is properly applied to a particular case. It is of the general type 'x has features a,b,c, and is e; y is like x in having features a,b,c; therefore y too is e'. (Features are not always specified; we are sometimes just to 'see' how relevantly alike cases are.) Both deduction and induction presume the proper application of words to instances. If we are disputing as to whether the Jews constitute a nation, or whether a trailer fixed in one position should still be called a mobile home, we find an instance which is unproblematic and compare the disputed case to it. In this way, we can reason toward conclusions. Case-by-case reasoning is a species of argument by analogy. It is nondeductive - the inference is not conclusive in the deductive sense. It differs from standard cases of inductive reasoning in that the issue is conceptual, not empirical. We


reason to conclusions from cases set forth as similar to the case at hand. What is called for is a decision as to whether the two cases are relevantly similar, not a prediction which could be confirmed or disconfirmed by subsequent observations. The thrust that underlies these logical analogies is that of consistency. This is not the consistency at issue when we seek to avoid assenting to contradictory propositions, but rather that required for consistent behavior. Relevantly similar cases must be treated similarly. If one man is given three years for theft and another three months, there must be a relevant difference between them, or else the law is inconsistently and unjustly applied. Logical analogies focus on similarities between cases. Arguments based on such analogies urge us to make a decision on a case on the basis of our understanding of a closely similar one. We are pushed to do so by considerations of consistency. The negative use of logical analogy is found in the technique of refuting arguments by citing parallel flawed arguments. If two arguments are fundamentally similar as to structure and the first is flawed, that refutes the second. To regard one as sound and the other as unsound would be to make decisions inconsistently. This is a familiar use of case-by-case reasoning within logic itself. The appeal to consistency is also extremely important in law, moral reasoning, and administration. Wisdom emphasized the philosophical, moral, and legal contexts in which such reasoning is ubiquitous and necessary. So too did Chaim Perelman in his early writings on justice and argument. To Wisdom's ideas many would reply that analogy is not a distinct type. A common reason offered is that any analogy can easily be recast in the form of a deductively valid argument, if we are willing to add to it general premises of a sufficiently sweeping nature. To see how this is possible, consider any analogy, of the form:

1. Case x has features a,b,c.
2. Case y has features a,b,c.
3. Case x is of type e.
Therefore,
4. Case y is of type e.

We may regard such an argument as elliptical, insisting that cases (x) and (y) must both be subsumed under the appropriate generalization. Thus:

1. Case x has features a,b,c.
2. Case y has features a,b,c.
3. Case x is of type e.
Missing Premise: All things which have features a,b,c are of type e.
Therefore,
4. Case y is of type e.

There are a number of things wrong with such an approach, however. The added premise makes two of the stated premises redundant so far as the logic of the inference is concerned. If we know that all things with a,b,c are of type e and case y has a,b,c, we can apply the universal statement to case y directly. We have no need to consider case x. An even more basic problem is that the requisite generalization cited as the missing premise is typically not known and may not even be statable. This, indeed, is usually the primary reason


for appealing to an analogy in the first place. We can compare one case with another even when we don't know a generalization under which both are subsumed. Case-by-case reasoning is, as Wisdom noted, particularly important in law and in moral thought. It may also be used in logic and philosophy and in any context where the issue concerns the correct application of a term. Some attention has been paid to Wisdom's account of reasoning, but until quite recently it has gone largely ignored by theorists. Although Wisdom spoke more of reasoning than of argument, it is clear that he was willing to acknowledge a type of argument corresponding to case-by-case reasoning and that he would have resisted any attempt to reconstruct those arguments as either inductive or deductive. It is an open question whether we can set out a general 'logic' for appraising a priori analogies. Wisdom and others tend to write as though this could not be done, and their opinion has doubtless been shared by systematic logicians, who have usually found analogy rather uninteresting from a theoretical point of view. Whether a set of rules giving directions for the comparing and contrasting of different cases for different purposes can be devised remains to be seen. But even if it cannot, this would not entail that case-by-case reasoning fails to exist at all. What is recalcitrant to systematization may be real nonetheless. To the Euclidean theorist who may not be satisfied with such a response, we can reply by turning the tables. As Mill, Pap, Haack, Rawls, Goodman, and many others have pointed out, rules and generalizations do not themselves stand independently. They ultimately require justification too, and they will receive it, in the final analysis, from their intuitively plausible application to particular instances.

3. The Pros and Cons of Conductive Argument. There are many arguments in which quite distinct premises are cited as relevant support for a conclusion. These factors may be balanced, in some cases, by counterconsiderations - factors counting against the conclusion. Thus we have the familiar phenomenon of pros and cons. This does not fit naturally into a context of deduction, where there is no room for degrees. Many such pro and con arguments are used in contexts where conceptual, normative, philosophical, or interpretive issues are at stake. They do not fit models of inductive reasoning developed for empirical science. In a 1970 book called Challenge and Response, the American philosopher Carl Wellman called such arguments conductive and insisted that they constituted a distinct type, the existence of which was crucially important for the correct understanding of justification in ethics. I have been very influenced by Wellman's account but do not follow it closely here. As I use the term, 'conductive' applies to those arguments in which the connection between premises and conclusion is based solely on the relation of relevance. Premises are put forward as being relevant - either conceptually or normatively - to the conclusion and not as being sufficient for it - either


deductively or inductively. Characteristically, conductive arguments have several premises which support the conclusion quite separately. That is to say, the premises are not linked. If one were to turn out to be false, the others would still be relevant and would still give some support to the conclusion. Sometimes in a conductive argument only one reason is given; here the difference between conduction and deduction is that other factors, not mentioned, could naturally have been mentioned. The stated factor cannot entail the conclusion by itself. Factors counting against the conclusion may exist and may be recognized as such in the argument. When they are, the arguer is tacitly maintaining that the supporting factors outweigh these counterconsiderations. It is consistent to allow that these factors exist and do count against the conclusion while nevertheless insisting that the supporting factors really do support it. The interpretive argument 'Hume is not a sceptic, for although he argues that our basic beliefs are not rationally justified, he rails against classical sceptics and he maintains that we are as much determined to believe as we are to think and feel' is a conductive argument of this type. One factor is cited which would count towards Hume being a sceptic and two others which would count against that. The latter two are supposed to outweigh the first, so that the argument overall presents good reason to believe that Hume is not a sceptic. Much actual reasoning seems to involve this kind of consideration of various pros and cons, estimating or 'weighing up' their collective significance. Conductive arguments are based on this kind of reasoning. They typically involve the specification of several factors, and the statement of a conclusion on the basis of what the arguer takes to be their collective force. There is a tendency among beginning students in practical logic to approach arguments in a way which is quite inappropriate for standard deduction and induction.
They often believe that the more premises an argument has, the better it is, and find it almost impossible to believe that a perfectly sound argument can sometimes have only one premise. Furthermore, asked to assess the connection between premises and conclusion, they will often begin by looking at the premises one at a time, to try to determine, for each premise, whether it is relevant to the conclusion. Such ideas are very often wrong for deduction and induction, but they are appropriate for conductive arguments. It is likely that their prevalence among beginners is a kind of testimony to the presence, in ordinary argumentation, of many conductive arguments. There is a tendency in some philosophical circles to want to reduce conductive arguments to deductive ones. We can do this by insisting that they are enthymematic. Consider a simple case such as 'You should return the book because it is due today and someone else needs to use it'. This conductive argument could be turned into one or two deductively valid arguments. But the premises we would need to add are either false (as 'You should do everything you promise to do whenever someone else needs that you should do it'), or unverifiable (as 'You should do everything you promise to do whenever someone else needs that you should do it, other things being equal'), or impossible to formulate in advance - we cannot exhaustively list all the types of cases in which other things would not be equal. The premises, put forward as relevant, can be


supplemented to entail the conclusion, but the cost of watertight inference is uncertainty about the premises. Wellman objected to this on the grounds that an argument which has some justificatory power, as stated, will lose it, due to the fact that the added premises cannot themselves be shown to be true or acceptable. Like Wisdom, Wellman insisted that we can and do reason about particulars without relying on linking generalizations. Such a position, though a minority one within philosophy, has a venerable history. In fact, it was clearly stated by Descartes when he defended his view that 'I think therefore I am' was not a syllogism with the unstated background premise 'Everything that thinks exists'. The view that particular inferences and judgments can stand independently without background support from universals was also taken by Mill, Wittgenstein, and Moore.

4. Cases, Relevance, and Rules. Neither conductive reasoning nor case-by-case reasoning seems fully amenable to treatment by general rules. Perhaps it is for this reason that these kinds of reasoning have been so widely ignored by logical and philosophical theorists. There seems little scope for theory construction and research projects. So far, theorists have done little more than make some rather general comments about their appraisal. How much systematization is possible remains in doubt. Those who insist on general rules as a presumption of correct argumentation will find this disturbing, because so far the arguments quite certainly exist and the rules do not seem to. It is often alleged that any judgment of merit - whether in the domain of logic or elsewhere - is based on a rule. With no rule, there can be no sound or unsound, no valid or invalid, no better or worse. If this were true, we might have to admit that there is no critical discrimination possible for a priori analogies and conductive arguments. They either strike us as forceful, or they don't, and that is the end of it. To acknowledge their existence would be to make only a negative contribution to the logic and epistemology of argument. There is some discussion of this thesis in Wisdom's work and in Wellman's as well. Wisdom emphasizes the dependence, in the final analysis, of general rules upon particular cases. For example, if we are asked why modus tollens is a valid pattern of inference, we will give case after case where it is used successfully, relying on the clear intuitive force of such cases so that people can 'see' that the rule is a plausible one. In addition, the case-by-case method of arguing can deal with a wider range of cases than the original two. By considering further cases which compare and contrast with the original disputed case and the original analogue, we may gain an appreciation of just which features of cases are relevant and irrelevant to the problem at hand.
Similar procedures may be possible for conductive argumentation as well. Facing the unattractive prospect of insisting on the 'validity', or argumentative merits, of conductive arguments not backed up by generalizations, Wellman emphasized that judgments about the merits of conductive arguments are at


least open to revision after critical reflection. Things are not so bad that 'thinking makes it so'; we can check by 'thinking things through again'. Such a response is not likely to fully satisfy theorists. Is this all that can be said? Perhaps not. The issue will hinge on two further ones. First of all, there is the issue of how to adjudicate disputes about relevance. (Obviously, this applies to case-by-case reasoning very directly as well.) Secondly, there is the issue of procedures for 'weighing up' the significance of factors. This is very important for the many conductive arguments in which there are both pros and cons. I personally doubt that there are general procedures for doing this, and Wellman did as well. But the possibility remains open and will be explored, no doubt, by those who would seek greater systematization wherever it is possible.

5. Futuristic Speculations. It is sometimes suggested that an absence of covering generalizations and background theory to handle such tricky matters as relevance is a product of our presently inadequate knowledge. A recent paper on analogical inference, for example, maintained that analogy is a distinct kind of inference but that the judgments of relevance needed to discriminate between good and poor analogies will ultimately require defence by substantive general theories. In a Knowledge Utopia, theory will be complete, and people will no longer need to use these particularistic arguments which make a basic appeal to intuition. Such a view is incorrect, I think. It is based on false presumptions about what it is to defend and rely upon general theories. It ignores the dependence of theoretical principles on considerations about particular cases. It neglects the key role of analogical reasoning in language learning and use. It neglects its role too in the application of theory. We may know that all x's are y's, but in order to use this principle to get a specific result in a given case, we have to determine whether this case is an x. Is it like other cases which are x's? Application of generalizations presumes classification, which presumes, at some level, case-by-case reasoning. No completeness in empirical or normative theories is going to eliminate the need for consistency of application in logic, law, ethics, administration, and science itself. Case-by-case reasoning is crucial in establishing and applying general principles. That role will remain, however complete and 'finally true' such principles are. For somewhat different reasons, the same kind of point can be made about conductive arguments. We can see this by thinking about the defense of theories at the most general levels of discussion. All current accounts of theory acceptance specify several distinct conditions which are relevant to the merits of theories.
If empirical, they must be well confirmed; they must have explanatory value; they must be consistent with established theories; they must have predictive power; they must be simple; they should have research potential - be 'fruitful'. Different analysts give different lists, but all agree there is more than one item on that list. Competing theories will almost always measure up differently on different


criteria. An argument for the acceptance of a theory is bound to be a conductive argument - the distinct, separately relevant reasons we should accept the theory being its simplicity, power, degree of confirmation, etc. The same point can be made for normative theories. Suggested desiderata for moral theory, for instance, include coherence, ready teachability, ease of application, and sensitivity to some range of moral sentiments and considered particular judgments. We are not going to find ourselves in a situation where we have justified general theories applicable to everything under the sun and do not have conductive arguments. It will be by conductive arguments that we try to show we have justified general theories. I submit that deduction and induction are not all there is. Case-by-case arguments and conductive arguments are real and important and certain to remain so.

Notes

I am grateful to John Wisdom for giving me permission to study his Virginia Lectures, which are in limited private circulation. In Renford Bambrough's Wisdom: Twelve Essays (Oxford: Blackwell, 1974), D. Yalden-Thomas has an essay accurately describing their content. In addition, Bambrough's Moral Scepticism and Moral Knowledge (London: Routledge and Kegan Paul, 1979) and R. Newell's book The Concept of Philosophy (London: Routledge and Kegan Paul, 1967) develop and use Wisdom's ideas. A thorough discussion of Wisdom's views can be found in a doctoral dissertation by Jerome Bickenbach, The Nature and Scope of Reflective Reasoning, University of Alberta, 1977. Bickenbach makes some appeal to Wisdom in two published articles, 'Justifying Deduction' (Dialogue - Canada - Vol. XVIII, No. 4) and 'The Diversity of Proof' (Informal Logic Newsletter, Vol. iv, 2). In my own text, A Practical Study of Argument (Belmont, Calif.: Wadsworth, 1985), views quite close to Wisdom's are developed in Chapter Nine. The full title of Wellman's book is Challenge and Response: Justification in Ethics (Carbondale, Ill.: University of Illinois Press, 1970). I wrote a critical review of the book which appeared in the Informal Logic Newsletter for April 1980 (Vol. ii, No. 2); this piece was discussed by David Hitchcock in his essay 'Deduction, Induction, and Conduction', Informal Logic Newsletter iii, 2. Hitchcock also makes some use of the notion of conductive argument in his text Critical Thinking: A Guide to Evaluating Information (Toronto: Methuen, 1983). There is a brief discussion in my own text, Chapter Ten. The recent article on analogical inference referred to is by William H. Shaw and L. R. Ashley, 'Analogy and Inference', Dialogue - Canada - Vol. XXII, No. 3. Issues of the Informal Logic Newsletter for 1980 and 1981 contain a number of articles spelling out problems with the inductive-deductive distinction by myself, Perry Weddle, David Hitchcock, Samuel Fohr, and others.
The account given here is very simplified and will be developed in much more detail in my essay 'The Great Divide', intended as part of my forthcoming book, Threads of Conviction: Essays on Argument and Critical Thinking. In the present essay I follow Wisdom's ideas quite closely, but deviate to some extent from Wellman's. Wellman used a rather idiosyncratic definition of 'inductive', limited conductive arguments (by definition) to arguments about particular cases, and concentrated almost exclusively on ethical examples. The ideas about the role of conductive and case-by-case reasoning in the envisioned Knowledge Utopia are my own, as is the discussion of the theoretic perils of the deductive-inductive dichotomy.

Chapter 5

Meaning Postulates and Rules of Argumentation: Remarks concerning the pragmatic tie between meaning (of terms) and truth (of propositions) Kuno Lorenz

For years we have experienced a sometimes heated debate among linguists and philosophers of language which I would like to call the controversy between semanticists and pragmaticists concerning the proper demarcation of the areas of semantics and pragmatics. There are the semanticists, who try to extend the limits of meaning of linguistic expressions as far as possible into areas of prima facie applicational features, and the pragmaticists, who try to resolve even structural relations among linguistic expressions into properties of standard situations of use. The underlying question to be asked is: Can linguistic features be distinguished into those which pertain to language structure (universally or regionally, of various degrees) and others which depend irreducibly on non-linguistic circumstances of language use? An example recently investigated by Asa Kasher (1980) concerns the linguistic component in rules of politeness. Is it part of the meaning of certain phrases or constructions like 'I would like to tell you' instead of 'I tell you' that they are used in situations of polite address, i.e. do situations of politeness belong to the range of signification of such phrases or rather to the conditions of their application? Obviously, this is indeed a question of how to demarcate precisely the domain of semantics and the domain of pragmatics. The traditional answer that you switch from semantics to pragmatics as soon as speaker and/or listener appear explicitly in the focus is not sufficient, because this move is usually understood as being part of a particular research strategy which I want to call the strategy of semantization of pragmatics.
This is basically a move where, within your theory of semantic properties - conducted in an appropriate descriptive metalanguage - further variables get introduced, such as variables for time and place of uttering a certain expression, and you arrive at a comprehensive theory which includes treatment of both semantic and pragmatic properties of expressions. No conceptual change of the theory has occurred. You still rely on the Fregean analysis of sentence structure derived from the mathematical model of a function with its arguments and value. In fact, you start with two kinds of expressions: singular terms (nominators), which denote entities, and general terms (predicators), which serve to express propositional functions on domains of entities. The rest is logic. Hence, you presuppose as available a theory of reference, which has essentially to account for singular terms, and a theory of sense, which has essentially to account for general terms. The area of semantics has not been left; rather, you have succeeded in a semantic treatment of pragmatic features as well.
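The move can be pictured schematically; the double-bracket notation below is my own illustration (in the style of indexical semantics), not drawn from the text:

```latex
% A purely semantic theory assigns each expression e a denotation:
%       [[e]] : Expression -> Denotation
% The semantization of pragmatics keeps this functional shape and
% merely introduces context variables as further arguments:
\[
  \llbracket e \rrbracket \;\leadsto\; \llbracket e \rrbracket^{c},
  \qquad c = \langle \text{speaker},\ \text{listener},\ \text{time},\ \text{place} \rangle
\]
% For instance, an indexical such as 'I' is handled inside semantics by
%       [[I]]^c = speaker(c)
% The Fregean function-argument analysis of the sentence is untouched;
% pragmatic features have simply become extra arguments of the function.
```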

Of course, there is a dual strategy of pragmatization of semantics, initiated with Wittgenstein's concept of language games, though speech-act theory, for example, made use of it only very reluctantly and certainly never in its radical version, where every semantic notion should be based upon the concept of language game and its derivatives. The propositional content beneath illocutionary and perlocutionary forces of utterances is still analyzed in the Fregean way. Hence, instead of really arguing the respective merits and limits of both strategies with the aim of arriving at a precise explication of the two basic functions of linguistic expressions, their significative and their communicative function - the study of the significative function will then be semantics, and the study of the communicative function pragmatics - a kind of truce is observed: the study of sentence structure is reserved for the semanticists, where truth theory, with truth as some kind of correspondence, becomes the paradigm case; on the other hand, the study of all kinds of forces appearing with the utterance of sentences becomes the domain of the pragmaticists, and here argumentation theory may well be called the paradigm case of today. The theory of truth as consensus or shared acceptance is but a special case of argumentation theory, and it would be a good choice for taking up again the fight between the two antagonistic strategies. With respect to the concept of truth we should have to ask whether it is a semantic notion or a pragmatic one. But, unfortunately, precisely because that would mean breaching the truce, argumentation theory usually stops short of truth theory. This is especially deplorable, because it has the result of continuing to neglect scrutiny of an ambiguity already present in Frege: Frege used the term 'true' both in a semantic framework and in a pragmatic one.
I may recall that in the analysis of Frege's symbolism for assertions, ⊢A, the horizontal '—' represents the semantic component and the vertical '|' the pragmatic component of '⊢A' ('A' alone stands for the syntactic component), such that '—A' signifies intensionally a thought and extensionally a truth-value (Sinn and Bedeutung of an assertion), whereas '⊢A' does not signify at all but asserts (1891: note 7). More explicitly: the speech act of assertion uses the thought '—A' as subject of a judgement, called by Frege 'acknowledgement of the truth of —A'. But in case '—A' refers to the truth-value falsehood, it is imperative not to mix up the mere claim to truth with its fulfilment. At that time Frege did not draw conclusions from his own observation that thoughts (or sentence signs) do not behave like other names of objects, but that they are bound to undergo a judgement, and that only then does the question of truth come in. A famous passage in his posthumous paper Logik (1897) indicates such a radical move: to put 'truth' exclusively into the area of pragmatics and a fortiori to free considerations of meaning from the grip of truth conditions, which gained widespread currency following the concise formulation in Wittgenstein's Tractatus 4.024: 'To understand a proposition means to know what is the case if it is true.' None of the semanticists has followed him on this route since, because meanwhile the archangel Tarski, with his successful restitution of the semantic notion of truth, at least for formalized languages, seemed to have closed the entrance to a pragmatic treatment of truth. The relevant quotation from Frege's logic-paper runs thus (in translation): 'It would be futile to make clearer by a definition what is to be understood by "true". If one wanted to say, for instance, "An idea is true if it agrees with reality", nothing would be gained, for in order to apply this one would have to decide, in a given case, whether an idea agrees with reality, in other words: whether it is true that the idea agrees with reality. Thus what is defined would itself have to be presupposed. The same would hold of any explanation of the form "A is true if it has such-and-such properties, or stands in such-and-such a relation to such-and-such". In a given case it would always come down again to whether it is true that A has such-and-such properties, or stands in such-and-such a relation to such-and-such. Truth is obviously something so primitive and simple that a reduction to anything simpler is not possible. We are therefore thrown back on bringing out what is peculiar to our predicate by comparing it with other predicates. It differs from all others, first of all, in that it is always asserted along with whatever else is asserted. When I assert that the sum of 2 and 3 is 5, I thereby assert that it is true that 2 and 3 make 5 [...]. The form of the assertoric sentence is thus really that with which we assert truth, and for this we do not need the word "true". Indeed we may say: even where we use the locution "it is true that ...", it is really the form of the assertoric sentence that is essential' (1897: 139f.).
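Schematically (the layout is mine, not Frege's own), the three components of the assertion sign distinguished above can be displayed as:

```latex
\[
  \vdash A \;=\;
  \underbrace{\,\mid\,}_{\substack{\text{judgement stroke}\\ \text{(pragmatic: asserts)}}}
  \;+\;
  \underbrace{\,-\,}_{\substack{\text{content stroke}\\ \text{(semantic: thought / truth-value)}}}
  \;+\;
  \underbrace{\,A\,}_{\substack{\text{sentence sign}\\ \text{(syntactic)}}}
\]
% '-A' signifies (Sinn: a thought; Bedeutung: a truth-value);
% '|-A' does not signify but asserts.
```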

We are advised to take up again the study of the difference between the propositional kernel of an utterance and its modes, e.g. its assertive force. The apparently simplest case is an elementary affirmative sentence s ε P, e.g. 'Sam is smoking'. In view of what I said in the beginning about the prerequisites for having singular and general terms available, we may at once observe - and this is usually overlooked by both parties - that using sentences of this kind depends on two important presuppositions being fulfilled. First, the independence-presupposition: with each singular term a certain domain of objects is assumed to be available which includes the referent of that term - the domain acts as the range of a variable at the place of the singular term when sentence forms get introduced - and this domain is prior to, that is, definitionally independent of, the possible predicates to be applied. With respect to the example 'Sam is smoking' we may use the domain of human beings, and the predicates on that domain are - in an extensional account - nothing but set-theoretic partitions of the domain; hence, they are certainly without influence on the definition of the domain itself. Second, the individuation-presupposition: for the objects of a domain it has to make sense to use the predicates 'same' and 'different'; that is, when building a logic on top of our sample sentence together with others (either keeping to one domain of objects or to several), it is always possible to introduce the identity predicate without further ado. And, in fact, the very operation of substituting variables for singular terms and vice versa requires having the objects at one's disposal as recurring individuals rather than as mere singulars in the strict sense - as feature tokens, for example. Quine, of course, has noted this individuation-presupposition, though he does not treat it as a presupposition in need of further inquiry into how it can be ensured.
His claim, investigated by Gottlieb (1979), follows the slogan 'no entity without identity', provided you add that to be [an entity] means to be the value of a variable (Quine 1939). The obvious question to be asked is: What must happen and/or what has to be done in order to have both presuppositions fulfilled? In traditional philosophical terminology we are confronted with the problem of object-constitution, to be solved prior to everything connected with object-description, which takes place by using predicates on the domain of objects.

Plato already had devised his method of διαίρεσις for the purpose of separating questions of object-constitution from the posterior questions of truth connected with object-description. Though he had not yet really been able to solve the problem of how to provide for objects, his very simple move of introducing a two-step procedure is sufficient to account for the distinction between, e.g., 'Sam is a human being' - which raises no problem of truth: the sentence is true ex definitione or, better, the term 'true' does not yet apply - and our original sample sentence 'Sam is smoking'. The usual headline for this distinction, of course, is 'analytic-synthetic', which veils rather than exhibits the origin of this difference. It suggests, for example, and it has always suggested, that analytic sentences are 'true by stipulation'. And it leads, and has always led, to the awkward maxim that even in cases of such 'immediate evidence of truth' - besides 'Sam is a human being' you may also include, e.g., 'this is an instance of Smoking-by-Sam', where 'this' refers deictically to an actualization of the action Smoking-by-Sam, which implies that 'this' need not yet refer to a full-fledged individual and is not free, therefore, for substitution by a variable - you either should give reasons why they are true, that is, produce arguments (for whom? what for? ...), or develop a theory of elementary truth as evidence. The more reasonable move in this situation, I propose, is to inquire into the genesis of the question of truth, that is, into the conditions which obtain when it makes sense to use assertions.
Of course, logicians of both camps, model theorists and argumentation theorists - they are the semanticists and pragmaticists within logic, and their relation may be considered as a dialectical field in the sense of Professor Barth, instead of treating it as an evolutionary move from phase 1 to phase 2 as in her presentation - are accustomed to begin their work only when such a situation already obtains. This, too, is the reason, I think, why the two functions of linguistic expressions I referred to in the beginning, the significative function and the communicative one, which Plato already called the two main characteristics of human speech (διακρίνειν τὰ πράγματα and διδάσκειν τι ἀλλήλους) in his Cratylus (388b), get interpreted by logicians, of the semanticist camp at least, as the origin of the idea that it is necessary both to develop the means to guarantee truth - eventually settled by finding appropriate axioms - and to establish correct rules of discourse or argumentation, rules which will, again eventually, i.e. after the instrument of axiomatization of a theory has been invented, be settled by determining adequate rules of inference. All this happens on the level of sentences; the set-up of sentences out of terms is relegated to a mere auxiliary task - in the view of our logician, not in the view of Plato. Now, my suggestion is to take Plato verbatim and to start with the idea that each utterance indeed shows both features: it signifies and it communicates, that is, it plays the role of a term (or a word) and the role of a sentence, well known in the simplest case of one-word sentences. In the more complex cases there exist syntactic clues indicating which role is active and which is dormant.
The terminology I am going to use with respect to signification and communication is derived more or less from a Peircean account of semiotics and pragmatics, expanded by a consideration of the acquisition processes for the language competences of human beings, conceptualized in the frame of Wittgenstein's
language games. They, in fact, corroborate Plato's idea that one should start with the two basic functions of linguistic expressions when they are uttered. In the simplest case - we are now on a level below the sample sentence schema 's ε P' - an utterance, as a term, articulates an object-schema, e.g. the action-type Smoking or even its relativization Smoking-by-Sam, and it states that there is, in the situation of utterance, a token of that type. The articulation is the significative function, whereas the stating, being the communicative function, may properly be called predication. To say either that the utterance makes sense or that the utterance is true is superfluous; both formulations are pleonastic, you just use other words for the two functions. It takes a considerable number of steps to introduce into one-word sentences that kind of complexity which allows one to distinguish their significative and communicative functions by syntactic means, e.g. to make available both 'Smoking-by-Sam' (a term) and 'Sam is smoking' (a sentence). Before looking into this a bit further, I would like to call to your attention that it is not sufficient just to observe utterances and to distinguish their two functions. Such a descriptive attitude works only when amended by participation. Unless you take into account that you have to acquire the ability to use an expression with these functions - making utterances in a situation which are understood and reacted upon properly - and this amounts to being aware of the necessarily social character of the acquisition process, it is not possible to get a clear picture of the distinction between the introduction and the use of expressions. In situations of use, the relation to the acquisition process looks very different for the parties involved with respect to the significative function.
For example, the utterance 'Smoking' may occur in a situation where the speaker sees or smells someone (the listener or the speaker herself) smoking, or where he helps someone to light a cigarette, etc. The use of 'Smoking' can refer to Smoking only through a specific 'component' of the action-type Smoking, accessible to the speaker (and another one to the listener) in the situation of utterance. It is what Frege called the Gegebenheitsweise (way of being given) of an object, here of the object-schema, the type Smoking. You can articulate these Gegebenheitsweisen, too, and the associated predications of second order are what our philosophical tradition has called perceptions, though they have erroneously been treated as the basis for predications of first order instead of proceeding the other way round. Correspondingly, with respect to the communicative function of our sample utterance 'Smoking', you are bound to consider all kinds of use in a specific situation: warning the listener or encouraging him with that utterance, asking him by it whether he wants to smoke, etc. The predication 'Smoking', i.e. 'Smoking' in its sentential role, can occur only in a mode - there is no pure predication in situations of use - and these modes, when articulated independently, occur as the well-known performators of speech-act theory. Now, the meaning of a term is, of course, the invariant out of the open set of perceptual perspectives, whereas the validity of a sentence is a claim with respect to the mode of the sentence-utterance, a claim which roughly amounts to the ability on the part of the speaker and/or the listener to make the situation of sentence-utterance and the situation the type of which is articulated by the corresponding term coincident. Hence, validity claims can occur only when these two situations, the
situation of uttering and the situation of acting which is articulated with the utterance, do not coincide. The mere use of different ways of being given by two parties is insufficient for a problem of meaning to arise for them - do they 'really' speak of the same thing? - and, likewise, the mere use of a mode when speaking 'about something' is insufficient to raise a problem of validity for the two parties, e.g. whether my communicative intention towards you has been successful or not. Let us return to our example in order to clarify this situation of theoretical reconstruction a bit more. If the term 'Smoking-by-Sam' is available, what is its sentential value when uttered? Obviously, the one which we can represent by the sentential locution 'this is Smoking-by-Sam', assuming the assertive mode for the moment. And the crucial question: What is the difference between the sentence 'this is Smoking-by-Sam' and the sentence 'Sam is smoking'? The sentence 'this is Smoking-by-Sam' contains a singular term referring, in complete dependency upon the situation of utterance, to a singular token of Smoking-by-Sam. The sentence 'Sam is smoking', on the other hand, contains a singular term referring independently of the situation of utterance, in fact referring to an individual constructed as a whole out of an indefinite class of tokens taken from types like Smoking, Running, Nose, Beard, Seen-by-X, etc., which are coincident with certain tokens of the type Man. And it is this invariance of reference with respect to utterance situations, made possible by turning from singulars (tokens of an object-type) to individuals (wholes out of certain indefinite classes of tokens of an object-type), which creates the difference between uttering something meaningfully and uttering it truthfully.
You can assert 'Sam is smoking' and thereby claim that you are able to provide situations - in the non-trivial cases different from the situation in which 'Sam is smoking' is uttered - which instantiate Smoking-by-Sam, this action-type being defined as the meaning of the sentence 'Sam is smoking'. The operation leading from a sentence A, e.g. 'Sam is smoking', to the associated term A*, e.g. 'Smoking-by-Sam', such that A is true iff A* is instantiated, was first introduced by Reichenbach (1947, §48) with his star-operator to derive an event-language out of a thing-language. As a final step we apply these ideas to logically complex sentences, and thereby arrive at a perspective towards dialogic logic which was not at all clear when the game-theoretic approach to logic got developed by Lorenzen and myself.¹ For each logical composition, A ∧ B or A → B, you are asked to determine the meaning of these sentences, and that is, to ask for the kind of action-types which are articulated by (A ∧ B)* or (A → B)*. The game-theoretic approach makes sense only if you neither turn to the conjunction of the two action-types, e.g. Smoking-by-Sam and Climbing-by-Jim, as the meaning of A ∧ B - it would be nothing but truth-functional logic in disguise, and hence inadequate to handle the universal quantifier in a finite manner - nor turn to the conjunction of proof-procedures for A and for B as a proof-procedure, and in this sense the meaning, for A ∧ B - that would be Kolmogoroff's interpretation of logical composition, with its well-known difficulties in arriving at a treatment both adequate and rigorous. It is necessary to turn to an interaction-type as the meaning of A ∧ B, and that is to be specified as a dialogue game, with the plays of
that game being the interaction tokens in a specific Gegebenheitsweise for either player. The same holds in the case of the subjunction A → B. Again, neither an if-then connection of action-types as the meaning of A → B nor a conditional proof-procedure is appropriate. The first move is placed too 'low', on the object-level (technical 'if-then'); the second is placed too 'high', on the level of argumentation (practical 'if-then'). We have to get 'in between'
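For orientation, the dialogue games invoked here are standardly specified by the particle rules of dialogical logic (Lorenzen and Lorenz); the tabulation below is the usual textbook form, not a quotation from the present essay:

```latex
% Particle rules: how a compound assertion may be attacked and defended.
\begin{tabular}{lll}
assertion    & attack                     & defence \\
\hline
$A \wedge B$ & $?_{L}$ (resp.\ $?_{R}$)   & $A$ (resp.\ $B$) \\
$A \vee B$   & $?$                        & $A$ or $B$ (defender chooses) \\
$A \to B$    & $A$ (attacker asserts $A$) & $B$ \\
$\neg A$     & $A$                        & (no defence, only counterattack) \\
\end{tabular}
% A formula is dialogically valid iff the proponent has a winning
% strategy in the game opened by his assertion of that formula.
```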