Logic, Sets, and Functions
ISBN 0-7872-6355-9

This book is intended as an introduction to symbolic logic and elementary set theory, including mathematical induction.


English, 288 pages, 1999


Table of contents:
1 Basic Concepts of Logic
1.1 Arguments
1.2 Validity
1.3 Implication and Equivalence
1.4 Logical Properties of Sentences
1.5 Satisfiability
2 Sentences
2.1 The Language of Sentential Logic
2.2 Truth Functions
2.3 A Sentential Language
2.4 Translation
2.5 Validity
2.6 Truth Tables
2.7 Truth Tables for Formulas
2.8 Examples
2.9 Truth Tables for Argument Forms
2.10 Implication, Equivalence and Satisfiability
3 Natural Deduction
3.1 Natural Deduction Systems
3.2 Rules for Negation and Conjunction
3.3 Rules for the Conditional and Biconditional
3.4 Rules for Disjunction
3.5 Derivable Rules
4 Quantifiers
4.1 Constants and Quantifiers
4.2 Categorical Sentence Forms
4.3 Polyadic Predicates
4.4 The Language Q
4.5 Translation
4.5.1 Noun Phrases
4.5.2 Verb Phrases
4.5.3 Connectives
4.6 Interpretations
5 Quantified Natural Deduction
5.1 Deduction Rules for Quantifiers
5.2 Universal Proof
5.3 Derived Rules for Quantifiers
6 Identity and Function Symbols
6.1 Identity
6.2 Deduction Rules for Identity
6.3 Function Symbols
7 Sets
7.1 Extensionality
7.2 Abstraction
7.3 Pair Sets, Unit Sets and Enumeration
7.4 The Null Set
7.5 Binary Unions, Intersections and Complements
7.6 Unions and Intersections of Single Sets
7.7 Power Sets
7.8 Complex Abstraction
8 Relations
8.1 Sequences and Ordered Pairs
8.2 Cartesian Products
8.3 Relations
8.4 Properties of Relations
8.5 Ordering Relations
8.6 Relations between Relations
8.7 Restrictions and Images
8.8 Equivalence Relations and Partitions
9 Functions
9.1 Functions and Relations
9.2 Properties of Functions
10 Induction
10.1 The Natural Numbers and Definition by Recursion
10.2 Weak Induction on the Natural Numbers
10.3 Strong Induction on Natural Numbers
10.4 Induction on Sets other than the Natural Numbers
10.5 Graphs, Trees and Lists
10.6 Formal Languages
A Plato's Users' Guide
A.1 Introduction
A.2 Getting Started
A.3 Starting Plato on the Macintosh
A.3.1 The Menus: An Overview
A.3.2 The File Menu
A.3.3 The Edit Menu
A.3.4 The Annotations Menu
A.3.5 The Font and Size Menus
A.4 Homework Mode: The Windows Version
A.5 The Windows Menu
A.6 Languages of Logic
A.6.1 The Language of Sentential Logic
A.6.2 The Language of Quantificational Logic
A.7 Systems of Proof
A.7.1 Proof Formats
A.7.2 Derivation Rules
A.7.3 The Proofs Menu
A.7.4 Imported Rules
A.7.5 The Quantification Rule Set
A.8 Macintosh Keyboard Shortcuts
A.8.1 Macintosh Keyboard Equivalents for Logical Symbols
B Answers to Selected Problems
Index


Logic, Sets and Functions

Daniel Bonevac, Nicholas M. Asher, Robert C. Koons

University of Texas at Austin

KENDALL/HUNT PUBLISHING COMPANY
4050 Westmark Drive
Dubuque, Iowa 52002

To Beverly, Sheila and Debbie.

Copyright © 1999 by Kendall/Hunt Publishing Company
ISBN 0-7872-6355-9
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the copyright owner.
Printed in the United States of America
10 9 8 7 6 5 4 3 2 1


Preface

This book is intended as an introduction to symbolic logic and elementary set theory, including mathematical induction. It is designed for a semester-long course, with attention to the needs and interests of computer science majors and others requiring a foundation in the construction of proofs and in the use of discrete mathematics.

The text is accompanied by a software application, Plato, available in both Windows and Macintosh versions. Plato can be used by students in producing derivations that employ the proof styles and rules introduced in this text. The use of Plato provides the student with immediate feedback, since it will not allow any logical errors to be committed. Plato is quite flexible, capable of being used to generate a derivation of any valid argument in first-order logic. In addition, Plato greatly eases the burden of grading homework.

We would like to express our appreciation to the College of Liberal Arts at the University of Texas-Austin, and to the Multimedia Task Force of the UT System, for their financial support for the development of Plato.

Austin, Texas
May, 1999

Chapter 1

Basic Concepts of Logic

Clerk: Mr. McClory moves to postpone for ten days further consideration of whether sufficient grounds exist for the House of Representatives to exercise constitutional power of impeachment unless by 12 noon, eastern daylight time, on Saturday, July 27, 1974, the President fails to give his unequivocal assurance to produce forthwith all taped conversations subpoenaed by the committee which are to be made available to the district court pursuant to court order in United States v. Mitchell...

Mr. Latta: ... I just want to call [Mr. McClory's] attention before we vote to the wording of his motion. You move to postpone for ten days unless the President fails to give his assurance to produce the tapes. So, if he fails tomorrow, we get ten days. If he complies, we do not. The way you have it drafted I would suggest that you correct your motion to say that you get ten days providing the President gives his unequivocal assurance to produce the tapes by tomorrow noon.

Mr. McClory: I think the motion is correctly worded; it has been thoughtfully drafted.

Mr. Latta: I would suggest you rethink it ...

Mr. Mann: Mr. Chairman, I think it is important that the committee vote on a resolution that properly expresses the intent of the gentleman from Illinois [Mr. McClory] and if he will examine his motion he will find that the words 'fail to' need to be stricken ...

Mr. McClory: If the gentleman will yield, the motion is correctly worded. It provides for a postponement for ten days unless the President fails tomorrow to give his assurance, so there is no postponement for ten days if the President fails to give his assurance, just one day. I think it is correctly drafted. I have had it drafted by counsel and I was misled originally, too, but it is correctly drafted. There is a ten-day postponement unless the President fails to give assurance. If he fails to give it, there is only a twenty-four-hour or there is only a twenty-three-and-a-half-hour day.
House Judiciary Committee, July 26, 1974

Logic is the study of correct reasoning. Logic pertains to all subjects, since people can reason about anything they can think about. Politics, the arts, literature, business, the sciences, everyday problems are all subjects open to reasoning. Sometimes the reasoning is good; sometimes, not so good. People use logic to tell the difference. Using logic, we can evaluate bits of reasoning as proper or improper, good or bad. Logic is not the study of how people do reason, but how they should reason. Logic does not describe real reasoning, with its errors, omissions and oversights; it prescribes methods for justifying reasoning,


that is, for showing that a given bit of reasoning is proper. Logic thus describes an ideal that actual reasoning strives for but often fails to reach.

Logic begins with the study of language. To develop a system of logic, it is necessary to understand how people actually reason. To eliminate the errors that creep into people's performance, we need to examine people's considered judgments about the correctness or incorrectness of inferences. No matter what mental processes people go through to achieve the right result, they try to follow rules for putting sentences together to form proper bits of reasoning. Logic is not merely a description of reasoning; logicians examine people's evaluations of bits of reasoning to say what the rules of correct reasoning are. Logic describes not the process of reasoning but the rules for correct reasoning.

The logical systems discussed in this book are designed both to explicate the meanings of certain expressions of natural languages such as English, Chinese, Swahili, and German and also to analyze reasoning in formal systems such as those used in mathematics and computer science.

1.1 Arguments

Arguments represent reasoning in language. Frequently, we think of arguments as heated debates, disagreements or disputes. Sometimes, however, we speak of a politician arguing for the passage of a bill, a lawyer arguing a case, or a moviegoer arguing that North By Northwest is better than The 39 Steps. An argument in this sense starts with some assertions called premises and tries to justify a conclusion. Many arguments in natural language are complicated. A lawyer arguing for the innocence of a client, for instance, offers many more specific arguments in presenting the case. The lawyer may argue that a piece of evidence is inadmissible, that results from a lab test are ambiguous, that the client could not have reached the scene of the crime by the time it was committed, and so on. All these smaller arguments form part of the larger argument for the client's innocence. We can divide arguments, then, into two groups: extended arguments, which contain other arguments, and simple arguments, which do not. Extended arguments may have several conclusions. Such arguments may consist of several simple arguments in sequence. They may contain other extended arguments. And they may consist of a list of premises, followed by several conclusions stated at once. Mathematical proofs are extended arguments. A mathematician may begin a proof by stating some assumptions. The mathematician then draws out consequences of the assumptions, perhaps making other assumptions along the way. Finally, the proof ends with a conclusion, the theorem it proves. A mathematical proof is thus a series of simple arguments. A simple argument, like an extended argument, starts with premises justifying a conclusion. We will be so often concerned with simple arguments that we will drop the adjective simple and speak of arguments. (Later, when we examine proofs, we will just call them proofs.) 
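The picture of a simple argument as premises plus a conclusion can be mirrored in a small data structure. The sketch below is our own illustration, not part of the text; the class and method names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Argument:
    """A simple argument: a finite sequence of premises and one conclusion."""
    premises: tuple   # finite sequence of sentences; order is irrelevant here
    conclusion: str

    def standard_form(self) -> str:
        """List the premises first, then the conclusion prefaced by '∴'."""
        return "\n".join(list(self.premises) + ["∴ " + self.conclusion])

arg = Argument(
    premises=("Fred forgot to go to the interview",),
    conclusion="Fred won't get the job",
)
print(arg.standard_form())
```

This matches the convention adopted later in the chapter: premises in their given order, conclusion last, marked with the "therefore" symbol.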
An argument consists of a finite sequence of sentences, called premises, together with another sentence, the conclusion, which the premises are taken to support.

An argument in ordinary language or in mathematics is a string or sequence of sentences. The sentences making up the argument are in a particular order, whether the argument is spoken, written, or encoded in a computer language. For our purposes in this text, the order of the premises makes no difference. So we will not worry about order of presentation. But we will require that the string of premises be finite. No one has the patience to listen to an argument that runs on forever.

Arguments consist of sentences. In this text, we will be interested only in sentences that can be true or false. Many ordinary sentences, including almost all in this book, fall into this category. They say something about the way the world is, and might be correct or incorrect in so describing it. But commands, for example, are different. 'Shut the door' can be appropriate or inappropriate,


irritating or conciliatory, friendly or hostile, but it cannot be true or false. Questions, such as 'What is the capital of Zaire?', and interjections, such as 'Ouch!', are likewise neither true nor false. A sentence is true or false in a particular context: as used on a particular occasion by a particular speaker to a particular audience, in a given circumstance and as part of a discourse. Without contextual information, we cannot say whether a sentence such as 'I love you' is true or false. Sentences have truth values only relative to a context of use. Nevertheless, very little in the following pages will involve context directly. So, we will generally speak of sentences as having truth values, trusting ourselves to remember that these values are relative to context.

An argument, according to our definition, contains one sentence that is its conclusion. This is an idealization: In natural language, a conclusion may be a clause in a sentence; it may be spread across several sentences; or it may be left unstated. The same is true of premises.

The definition does not specify how to pick out the conclusion of an argument. In English, certain words or phrases typically signal the conclusion of an argument, while others signal premises:

Conclusion indicators: therefore, thus, hence, so, consequently, it follows that, in conclusion, as a result, then, must

Premise indicators: because, for, since

All these words and phrases have other uses; they are not always premise or conclusion indicators. But these words and phrases can, and often do, serve as indicators because they can attest to relations of support among the sentences of an argument. 'Since Fred forgot to go to the interview, he won't get the job' presents a simple argument within a single English sentence. The word 'since' indicates that we should take 'Fred forgot to go to the interview' as a premise, supporting the conclusion 'he won't get the job.'
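The indicator lists can be turned into a crude mechanical check. The sketch below is only a heuristic of our own devising (the text itself warns that these words have other uses, so a real classifier would need context):

```python
# Indicator words taken from the lists above; they are clues, not guarantees.
CONCLUSION_INDICATORS = {"therefore", "thus", "hence", "so", "consequently",
                         "it follows that", "in conclusion", "as a result",
                         "then", "must"}
PREMISE_INDICATORS = {"because", "for", "since"}

def classify(word: str) -> str:
    """Classify a word as a premise or conclusion indicator, if it is one."""
    w = word.lower()
    if w in CONCLUSION_INDICATORS:
        return "conclusion indicator"
    if w in PREMISE_INDICATORS:
        return "premise indicator"
    return "not an indicator"

print(classify("Since"))      # premise indicator
print(classify("therefore"))  # conclusion indicator
print(classify("Mercedes"))   # not an indicator
```

Applied to 'Since Fred forgot to go to the interview, he won't get the job', the opening 'since' flags the first clause as a premise, as the text explains.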
Similarly, 'Jane's business must be doing well; she drives a Mercedes' constitutes an argument. The auxiliary verb 'must' marks 'Jane's business is doing well' as the conclusion, supported by the evidence in 'she drives a Mercedes.'

Premise indicators often signal not only that one or more sentences are premises, but that a certain sentence is a conclusion. 'Since', for example, exhibits a relation of support between the sentences it links. The occurrence of 'since' in 'Since Fred forgot to go to the interview, he won't get the job' points out not only that the sentence immediately following it is a premise but also that the sentence 'he won't get the job' is a conclusion. Similarly, the occurrence of 'for' in 'Northern Indiana Public Service will not pay its usual dividend this quarter, for the court refused to allow expenditures on its now-cancelled nuclear project into the rate base' indicates both that 'the court refused to allow expenditures on its now-cancelled nuclear project into the rate base' is a premise and that 'Northern Indiana Public Service will not pay its usual dividend this quarter' is a conclusion.

Indicators provide important clues to the structure of arguments. Often, however, no explicit indicators appear. Sometimes the conclusion is not even stated. In such cases, we must consider the point of the argument. What is the author trying to establish? Consider some examples:

Suppose we argued that what was true was true for us, that two assertions met on no common ground, so that neither was "really true" or "really false". This position went further than skepticism and declared the belief in error itself to be erroneous. Royce called this view that of the total relativity of truth, and he had an argument against


it. If the statement "There is error" is true, there is error; if it is false, then there is, ipso facto, error. He could only conclude that error existed; to deny its existence was contradictory.

Bruce Kuklick, The Rise of American Philosophy

This is an extended argument. The conclusion of Royce's smaller argument is plainly 'error exists'; the words 'conclude that' make this obvious. Royce then uses this conclusion to argue that the view of the total relativity of truth is false. The sentence 'error exists' thus functions as the conclusion of one argument and as a premise of another, all within the same extended argument.

If it were permitted to reason consistently in religious matters, it is clear that we all ought to become Jews, because Jesus Christ was born a Jew, lived a Jew, and died a Jew, and because he said that he was accomplishing and fulfilling the Jewish religion.

Voltaire

Voltaire seems to be arguing for the conclusion 'we all ought to become Jews.' Here the key word is 'because', which indicates that the rest of the argument is a list of premises. Voltaire, a satirist, is really aiming not at this conclusion but at another. Everything he says is supposed to follow from the hypothetical 'if it were permitted to reason consistently in religious matters.' Like Royce, he is offering an argument within an extended argument. The conclusion of the larger argument is not stated. Nevertheless, it is easy to see that Voltaire is trying to establish that it is not permitted to reason consistently in religious matters. The conclusion of the smaller argument, 'we all ought to become Jews', is an observation that few Christians would be willing to accept, even though, according to Voltaire, their own doctrine commits them to it.

The final example is a mathematical proof: the traditional proof that the square root of 2 is irrational.

[Suppose] for the sake of argument that √2 is rational, i.e. that there are two integers, say m and n, which are mutually prime and which are such that m/n = √2, or m² = 2n². From this it follows that m² must be even and with it m, since a square number cannot have any prime factor which is not also a factor of the number of which it is the square. But if m is even, n must be odd according to our initial supposition that they are mutually prime. Assuming that m = 2k, we can infer that 2n² = 4k², or n² = 2k²; and from this it can be shown by a repetition of the reasoning used above that n must be even. Our hypothesis, therefore, entails incompatible consequences, and so it must be false.

W. Kneale and M. Kneale, The Development of Logic

Like almost any proof, this one is an extended argument; in fact, it is a series of simple arguments. The proof begins with the assumption that √2 is rational. The first simple argument concludes that m² must be even; very quickly follows another simple argument concluding that m must also be even. The third simple argument concludes that n is odd. The fourth concludes that 2n² = 4k²; the fifth, that n² = 2k²; the sixth, that n is even. Finally, the proof ends with a seventh simple argument that the hypothesis that the square root of 2 is rational is false.

When we write an argument "officially", in standard form, we will list the premises in the order in which they are given, and then list the conclusion. So, in our official representations, conclusions will always come last. This is not true in natural language, as Voltaire's argument shows; conclusions may appear at the beginning, in the middle, or at the end of arguments, if they are stated at all. In addition, we'll preface the conclusion with the symbol ∴, which means "therefore". To see how these representations work, let's write Royce's smaller argument in standard form:


If the statement "There is error" is true, there is error.
If the statement "There is error" is false, there is error.
∴ There is error.

Royce's larger argument, then, is:

Error exists.
The view of the total relativity of truth holds that the belief in error is erroneous.
∴ The view of the total relativity of truth is false.

We can similarly express Voltaire's two arguments in standard form:

Jesus Christ was born a Jew, lived a Jew, and died a Jew.
Jesus Christ said he was accomplishing and fulfilling the Jewish religion.
∴ If it were permitted to reason consistently in religious matters, it is clear that we all ought to become Jews.

If it were permitted to reason consistently in religious matters, it is clear that we all ought to become Jews.
(It is not clear to religious Christians that we all ought to become Jews.)
∴ It is not permitted to reason consistently in religious matters.

Finally, we can express the proof of the irrationality of √2 as a series of simple arguments:

√2 is rational, i.e., there are two integers, say m and n, that are mutually prime and that are such that m/n = √2, or m² = 2n².
∴ m² is even.

m² is even.
A square number cannot have any prime factor that is not also a factor of the number of which it is the square.
∴ m is even.

m is even.
m and n are mutually prime.
∴ n is odd.

m = 2k
∴ 2n² = 4k²

2n² = 4k²
∴ n² = 2k²

n² = 2k²
(A repetition of the reasoning used above.)
∴ n is even.

The hypothesis that √2 is rational entails incompatible consequences.
∴ The hypothesis that √2 is rational is false.
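The core arithmetical claim behind the proof, that m² = 2n² has no solution in mutually prime positive integers, can also be checked mechanically over a finite range. The brute-force search below is only an illustration of the claim, not a substitute for the proof (no finite search can establish it for all integers); the function name is our own.

```python
from math import gcd

def rational_root_candidates(limit: int) -> list:
    """Search for mutually prime positive integers m, n with m² = 2n²."""
    return [(m, n)
            for m in range(1, limit + 1)
            for n in range(1, limit + 1)
            if gcd(m, n) == 1 and m * m == 2 * n * n]

# The proof above shows this search must come up empty for any limit.
print(rational_root_candidates(300))  # []
```

The proof does what the search cannot: it rules out every pair (m, n) at once, by deriving a contradiction from the assumption that one exists.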

Problems

Write each of the following arguments in standard form. If there are several arguments in a passage, write each separately.


1. The Bears did well this year, so they'll probably do well again next year.

2. John must have left already; his books are gone.

3. Few contemporary novels deal explicitly with political themes. The study of contemporary literature is therefore largely independent of the study of political culture.

4. Mary dislikes Pat. Consequently, it's unlikely that they'll work on the project together.

5. Most criminals believe that their chances of being caught and punished are small; thus, the perceived costs of a life of crime are low.

6. The building will generate large tax write-offs. As a result, it will be a good investment even if it yields little direct profit.

7. No one has ever constructed a convincing case that Bacon or someone else wrote the plays we generally attribute to Shakespeare. Shakespeare, then, almost certainly wrote the plays we attribute to him.

8. There are no centaurs, for centaurs are mythical creatures, and no mythical creatures really exist.

9. Nobody will ever find an easy way to get rich, because people have been looking for centuries, and nobody's ever found one yet.

10. Swedish is an Indo-European language, but Finnish isn't. Hence Finnish is more difficult for English-speakers to learn than Swedish.

11. Many people are easily shocked by unusual or threatening events. No one who is thunderstruck can think clearly. It follows that the emotions can obstruct reason.

12. Since happiness consists in peace of mind, and since durable peace of mind depends on the confidence we have in the future, and since that confidence is based on the science we should have of the nature of God and the soul, it follows that science is necessary for true happiness. (Leibniz)

13. In Europe pupils devote time during each school day to calisthenics. American schools rarely offer a daily calisthenic program. Tests prove that our children are weaker, slower, and shorter-winded than European children.
We must conclude that our children can be made fit only if they participate in school calisthenics on a daily basis. (LSAT test, 1980)

14. * First, the personality and character (which are really synonymous) take their form during the first six or eight years of life. During this period of infancy and childhood, we select and develop the techniques which gain us satisfaction, defend us against threats, and become the tools in coping with the endless variety of problem situations that will be encountered later in life. It is during this time that we develop our methods of relating ourselves to other people and undergo the experiences which determine the strengths and weaknesses within our personalities. As adults we are not able to remember the details of these formative years. Therefore, we cannot understand our own behavior fully. (William Menninger)

15. * The earth receives radiant heat from the sun and loses heat to outer space by its own radiative emissions. The energy received undergoes many transformations. But in the long run no appreciable fraction of this energy is stored on the earth, and there is no persistent trend toward higher or lower temperatures. (LSAT test, 1980)


16. * To describe an equilateral triangle on a given finite straight line. Let AB be the given straight line: it is required to describe an equilateral triangle on AB. From the center A, at the distance AB, describe the circle BCD. From the center B, at the distance BA, describe the circle ACE. From the point C, at which the circles cut one another, draw the straight lines CA and CB to the points A and B. ABC shall be an equilateral triangle. Because the point A is the center of the circle BCD, AC is equal to AB. And because the point B is the center of the circle ACE, BC is equal to BA. But it has been shown that CA is equal to AB; therefore CA and CB are each of them equal to AB. But things which are equal to the same thing are equal to one another. Therefore CA is equal to CB. Therefore, CA, AB, BC are equal to one another. Wherefore the triangle ABC is equilateral, and it is described on a given straight line AB. (Euclid)

17. * One may well ask, "How can you advocate breaking some laws and obeying others?" The answer is found in the fact that there are two types of laws: There are just laws and there are unjust laws. I would be the first to advocate obeying just laws. One has not only a legal but a moral responsibility to obey just laws. Conversely, one has a moral responsibility to disobey unjust laws. I would agree with St. Augustine that "An unjust law is no law at all." Now what is the difference between the two? How does one determine whether a law is just or unjust? A just law is a man-made code that squares with the moral law or the law of God. An unjust law is a code that is out of harmony with the moral law. To put it in the terms of St. Thomas Aquinas, an unjust law is a human law that is not rooted in eternal and natural law. Any law that uplifts human personality is just. Any law that degrades human personality is unjust. All segregation statutes are unjust because segregation distorts the soul and damages the personality. ...
(Martin Luther King Jr.)

18. * Theorem (Mean value theorem). Let a, b ∈ R, a < b, and let f be a continuous real-valued function on [a, b] that is differentiable on (a, b). Then there exists a number c ∈ (a, b) such that f(b) − f(a) = (b − a)f′(c).

Proof: Define a new function F : [a, b] → R by

F(x) = f(x) − f(a) − ((f(b) − f(a))/(b − a)) · (x − a)

for all x ∈ [a, b]. (Geometrically F(x) is the vertical distance between the graph of f over [a, b] and the line segment through the end points of this graph.) Then F is continuous on [a, b], differentiable on (a, b), and F(a) = F(b) = 0. By Rolle's theorem, there exists a c ∈ (a, b) such that F′(c) = 0. Thus,

F′(c) = f′(c) − (f(b) − f(a))/(b − a) = 0

proving the result. (M. Rosenlicht)

19. * If the premise, that independence means complete absence of subconscious bias, were carried to its logical conclusion, no one could be found independent in any absolute sense. One is bound to be influenced by his environment, and everyone has some subconscious bias. On this premise it could be argued that the accountant who frequently lunches with a client, or plays golf or bridge with him, or serves with him on a board of trustees of a church or school, should not be considered independent in certifying the financial statements of that client. ... [This


would lead] to the conclusion that certified public accountants who act as independent auditors for a company must avoid all other relationships with that company, leaving the accounting work to be performed by other CPAs, and accepting accounting work only from clients who retain other independent auditors. This conclusion seems almost fantastically inconsistent with settled practice and the temper of the community generally. There would be serious economic loss in depriving the business man who needs accounting service of the knowledge which the independent auditor gains about the business, particularly its accounting aspects, in the course of his audit. (William Carey)

20. * I should like to suggest that neither composition nor literature is an intellectual field in its own right. Literary study obviously connects with a number of genuine intellectual fields like history and philosophy. Composition, too, has disciplinary connections with linguistics and psychology. But neither literary study nor composition is an intellectual discipline. Both are primarily cultural subjects with cultural missions of unparalleled importance. To the extent that we evade those missions under the banner of some neutral formalism or disciplinary pretense, we are neglecting our primary educational responsibilities and are also making an empirical mistake. This clearly implies that we should return to an integrated conception of "English" based on the pattern of the old literature-and-composition course originated by Blair and followed traditionally in the schools and colleges. (E. D. Hirsch, Jr.)

1.2 Validity

Some arguments are good; others are not. According to the definition of argument given above, any collection of sentences counts as an argument if it is possible to single out one sentence in the collection as the conclusion, purportedly supported by the others. What distinguishes good from bad arguments? What makes a good argument succeed? What makes a bad argument fail? People typically demand many things of an argument. Throughout this book, we will focus on one of them. A good argument should link its premises to its conclusion in the right way. There should be some special connection between the premises and the conclusion. To see what this special sort of connection is, consider an argument that has true premises and a true conclusion, but is nevertheless bad:

Harrisburg is the capital of Pennsylvania.
Richmond is the capital of Virginia.
∴ Austin is the capital of Texas.

What is wrong with this argument? The facts cited in the premises have nothing to do with the truth or falsehood of the conclusion. Texas could move its capital (to, say, Del Rio) while Harrisburg and Richmond remained the capitals of their respective states. That is, the conclusion of this argument could turn out to be false, even when the premises were true. The truth of the premises does nothing to guarantee the truth of the conclusion. This is the mark of a deductively invalid argument: its premises could all be true in a circumstance in which its conclusion is false. In a deductively valid argument, the truth of the premises guarantees the truth of the conclusion. If the premises are all true, then the conclusion has to be true. Consider, for example, this argument:

Paris is, and has always been, the capital of France.
Ed has never visited Paris.
∴ Ed has never visited the capital of France.

Basic Concepts of Logic

9

In any circumstance in which the premises of this argument are true, the conclusion must be true as well. It is impossible to conceive of a state of affairs in which Ed has visited the French capital without visiting Paris, if Paris is, and has always been, the capital of France. To put this differently, the only way to imagine Ed visiting the French capital without visiting Paris is to imagine a case where Paris is not the capital of France. In a deductively valid argument, the truth of the premises guarantees the truth of the conclusion. Or, to say the same thing, if the conclusion of a deductively valid argument is false, at least one premise must also be false.

Definition 1.1 An argument is deductively valid if and only if it is impossible for its premises all to be true while its conclusion is false.

It is possible, then, for a deductively valid argument to have true premises and a true conclusion; (at least some) false premises and a false conclusion; and false premises and a true conclusion. But no deductively valid argument has true premises and a false conclusion.

Some Deductively Valid Arguments

True Premises, True Conclusion:
Daniel is a dog.
All dogs sleep.
∴ Daniel sleeps.

False Premises, False Conclusion:
Daniel is a dog.
All dogs eat mice.
∴ Daniel eats mice.

False Premises, True Conclusion:
Daniel is human.
All humans are mortal.
∴ Daniel is mortal.

Each of these arguments is deductively valid: in each case, there is no possible circumstance in which the premises are all true but the conclusion is false. How could it be true that Daniel is a dog, and true that all dogs eat mice, but false that Daniel eats mice? Whether the premises and conclusion are actually true or false makes little difference to the validity of the argument. What matters is that if the premises are true the conclusion cannot be false. Thus, not every argument with true premises and a true conclusion is deductively valid, as the argument concerning state capitals shows. Similarly, many arguments with false premises and a true conclusion are deductively invalid. The same is true for arguments with false premises and a false conclusion. So, although valid arguments can have any of these three combinations of truth and falsity, not every argument with those combinations is valid. An argument is deductively invalid if it is possible for the premises to be true while the conclusion is false. Similarly, an argument is deductively valid just in case its conclusion has to be true if its premises are all true.¹ Some deductively invalid arguments nevertheless have some legitimate force in reasoning.
Although the truth of the premises of such an argument does not guarantee the truth of its conclusion, it does make the truth of the conclusion probable. Consider, for example, this argument:

Most cats like salmon better than beef.
Gwen is a cat.
∴ Gwen likes salmon better than beef.

It is possible for the premises to be true while the conclusion is false. Gwen may be atypical; she may prefer beef to salmon. So the argument is deductively invalid. Nevertheless, the premises lend some support to the conclusion. Given just the information in the argument, the conclusion is more likely to be true than false. Arguments such as this are called inductively reliable. They are important in both scientific and everyday reasoning. Evaluating them, however, requires developing theories of probability and statistics. In this text, therefore, we will restrict our attention to deductive validity and invalidity.

When we imagine a circumstance in which some sentences would be true, and others would be false, we normally imagine a situation that settles the matters that those sentences involve, but that leaves lots of other things unsettled. Above, for example, we imagined a case in which Texas moved its capital, but Pennsylvania and Virginia didn't. That was all we said, or, apparently, needed to say to convince ourselves that the argument was invalid. But that was not even close to a complete description of an entire world. We said nothing about what happened to Montana, or Alaska, or Afghanistan, or the pennant hopes of the Mets, or the price of pork bellies on the Chicago Board of Trade. The case we've described, therefore, isn't very determinate. There are many different ways the world might be that all agree in fitting our description. So, it might be more correct to say that we imagined, not a single case, but a kind of case in which the premises are all true and the conclusion is false. Many circumstances might fit the description we gave.

The logic we study in this book assumes that some circumstances are so comprehensive that they determine whether each declarative sentence of a language is true or false. Every sentence that can be true or false at all must, in such a complete situation, be either true or false. The logic studied throughout most of this book is a bivalent logic because it says that, given any sentence capable of truth or falsehood, the question "Is this sentence true, or false, or whatever?" always has only two possible answers: "True" and "False". In other words, classical logic allows only two truth values: truth and falsehood. The truth value of a sentence is truth, if the sentence is true, and falsehood or falsity, if it is false.

¹ These links between deductive validity and truth or falsehood were first recognized explicitly by the Greek philosopher Aristotle (384-322 B.C.), the father of logic.
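For arguments whose sentences are built from sentence letters and truth-functional connectives (a topic developed in Chapter 2), Definition 1.1 can be checked mechanically by surveying every complete assignment of the two truth values. The sketch below is our own illustration, not the book's apparatus: the function name `is_valid` and the encoding of sentences as Python functions of a truth-value assignment are assumptions made for the example. The test searches for a counterexample row, one that makes all the premises true and the conclusion false.

```python
from itertools import product

def is_valid(premises, conclusion, letters):
    """Deductive validity (Definition 1.1): no assignment of truth values
    makes every premise true while the conclusion is false."""
    for values in product([True, False], repeat=len(letters)):
        v = dict(zip(letters, values))  # one complete "circumstance"
        if all(p(v) for p in premises) and not conclusion(v):
            return False  # counterexample found: the argument is invalid
    return True

# 'p and q; therefore p' is valid; 'p or q; therefore p' is not.
print(is_valid([lambda v: v["p"] and v["q"]], lambda v: v["p"], ["p", "q"]))  # True
print(is_valid([lambda v: v["p"] or v["q"]], lambda v: v["p"], ["p", "q"]))   # False
```

On this encoding the state-capitals argument fails the test: its three sentences are logically independent letters, so the assignment making both premises true and the conclusion false is exactly the counterexample circumstance described above.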
So far in this section we've examined only the second criterion for success in arguments: we want arguments to be valid. Deductively valid arguments always preserve truth; if they begin with true premises, they carry us to true conclusions. It's nice, of course, to have an argument that not only preserves truth but has some truth to preserve. Thus, our first criterion was that a successful argument should have true premises and a true conclusion. A sound argument meets both criteria for success. It has true premises; it is valid. Furthermore, since, in any valid argument, the truth of the premises guarantees the truth of the conclusion, it also has a true conclusion.

Definition 1.2 An argument is sound if and only if (1) it is valid and (2) all its premises are true.

Sound arguments, then, are often paradigms of successful arguments. They derive the truth of their conclusions by arguing validly from true premises. Nevertheless, most of this book will focus, not on soundness, but on validity. Logicians have always concentrated on validity.² This focus is easy to understand. First, validity is obviously a crucial component of soundness. We can't evaluate whether an argument is sound without first determining whether it's valid. Second, evaluating soundness requires judging the actual truth or falsehood of premises. This, however, is the job, not of logical theory, but of those who know enough physics, history, business, or whatever facts are relevant to the argument at hand. Third, although we usually want to argue from true premises, many useful arguments start from false ones. Some arguments try to show that a certain sentence is false by using it as a premise to reach an outrageous or absurd conclusion. Others adopt a premise purely as a hypothesis, to see what would follow if it were true. Aristotle first realized how important such arguments are; he characterized them as having dialectical, rather than demonstrative, premises.
As we shall see later, these forms of argument are much more common and useful than most people would imagine.

² Indeed, although Aristotle and some earlier thinkers talked of valid arguments, a term for soundness was only introduced later, by logicians of the Stoic school, which thrived in Greece from the third century B.C. to the second century A.D.


A simple example occurs at the beginning of this chapter in the proof of the irrationality of the square root of two. The proof starts with the assumption that √2 is rational, and deduces from it a contradiction. The point of this argument is precisely to show that the premise that √2 is rational is false. The argument's success, therefore, depends solely on validity, not on soundness. Our study of reasoning will therefore focus on validity.

It's worth noting that the logical concept of soundness approximates, but is not the same as, our ordinary, intuitive concept of a successful argument. Successful arguments generally lead us from premises for which we have good evidence to a conclusion that follows from those premises and to which the premises are relevant. The logical concept of soundness, however, mentions neither evidence nor relevance. Our technical definition thus calls sound a variety of arguments that seem, from an intuitive point of view, peculiar. Some violate our ordinary notion of evidence. So, suppose that the earth will be invaded by little green men in 2025, but that we possess no evidence now to support this. Then the argument

The earth will be invaded by little green men in 2025.
∴ The earth will be invaded by little green men in 2025.

is sound; given our supposition, the premise is true, and the conclusion is surely true whenever the premise is, since they are the same sentence. But this argument won't convince anyone that we ought to be building defenses; it doesn't establish its conclusion in the usual, evidence-related sense of establish. Some arguments also count as sound even though they violate our usual notion of relevance. The argument

Coffee ice cream is more popular than chocolate in Rhode Island.
∴ Cats are cats.

is sound, since the premise is true, and the conclusion can never be false while the premise is true, simply because the conclusion can never be false. Yet this argument, too, seems bizarre.
The premise is irrelevant to the conclusion. Thus, just as validity is a part, but only a part, of soundness, so soundness is a part, but only a part, of our intuitive notion of success in argumentation.

Problems

Evaluate these arguments as valid or invalid. If the argument is invalid, describe a circumstance in which the premises would be true but the conclusion would be false.

1. John and Mary came to the party. Hence, Mary came to the party.
2. Larry got angry and stormed out of the room. Consequently, Larry stormed out of the room.
3. If Susan's lawyer objects, she will not sign the contract. Susan's lawyer will object. Therefore she will not sign the contract.
4. If Frank takes the job in Cleveland, he'll make a lot of money on the sale of his house. Frank won't take the job in Cleveland. It follows that Frank won't make a lot of money on the sale of his house.
5. If Strawberry hits 30 home runs, the Mets will be contenders. The Mets will be contenders. So Strawberry will hit 30 home runs.
6. If Lynn testifies against the mobsters, she'll endanger her life. So, she won't testify against them, since she won't put her own life in danger.
7. Max is mayor of either Abilene or Anarene. Max isn't mayor of Abilene. Hence Max must be mayor of Anarene.


8. Pamela played Shelley for the tournament trophy. Consequently, Pamela played either Shelley or Tracy for the trophy.
9. Henry doesn't know anyone. So Henry doesn't know Kim.
10. Rocky has beaten everyone he's faced. Thus, Rocky has beaten Mad Moe, if he's faced him.
11. Since all who have been accepted have scores over 1300, either Jim has been accepted, or his scores weren't over 1300.
12. Since all who have been accepted have scores over 1300, either Jim hasn't been accepted, or his scores were over 1300.
13. Everyone who has thought about the political tensions of the Middle East realizes that they're complicated. Deborah doesn't realize that these political tensions are complicated, so she mustn't have thought about them.
14. Everyone who admires Frost also admires Dickinson. Some people who normally hate poetry admire Frost. Therefore some people who normally hate poetry admire Dickinson.
15. Some politicians are demagogues, but no demagogues are good leaders. Hence, some politicians are not good leaders.
16. All scientists have a deep interest in the workings of nature. All who devote their lives to the study of the physical world have a deep interest in the workings of nature. Consequently, all scientists devote their lives to the study of the physical world.
17. Some modern art shows the strong influence of primitivism. No modern art is primarily representational. Thus, some art that exhibits the influence of primitivism is not primarily representational.
18. Most Americans like baseball. Anyone who likes baseball likes sports. So most Americans like sports.
19. Most medieval theories of motion were, in essence, Aristotelian. No theory of motion that uses a concept corresponding to inertia is essentially Aristotelian. It follows that most medieval theories of motion used no concept corresponding to inertia.
20. The patient will surely die unless we operate. We will operate. Therefore the patient will not die.
21. Jerry will take the job unless we match the salary offer. Since we won't match the offer, Jerry will take the job.
22. The launch will be delayed unless the weather clears. So, if the weather clears, the launch won't be delayed.
23. The meeting will take place only if both parties agree on the agenda. So, if the parties don't agree on the agenda, the meeting will not take place.
24. Marilyn will finish the brief on time only if she gets an extension on the Morley case. Therefore, if Marilyn gets an extension on the Morley case, she will finish the brief on time.
25. If Jack understands how important this sale is, he'll devote most of the next two weeks to securing it. It follows that Jack won't devote most of the next two weeks to securing this sale unless he understands how important it is.


26. The boss won't understand what you're trying to say unless you put it in the bluntest possible terms. Consequently, if you don't put what you're trying to say in the bluntest terms possible, the boss won't understand it.
27. Either the city will raise electric rates, or it will raise taxes. Thus, if the city does not raise electric rates, it will raise taxes.
28. Nancy will not marry Alex unless he signs a prenuptial agreement. So, if Alex signs a prenuptial agreement, Nancy will marry him.
29. This album will sell only if it contains at least one hit song. Hence, unless it contains a hit song, this album will not sell.
30. John is watching Mary run through the park. So Mary must be running through the park.
31. * Few students fully appreciate the value of an education while they are in school. Only those who fully appreciate the value of their education while they are in school devote themselves to their studies as much as they ought to. Therefore, most students don't devote themselves to their studies as much as they ought to.
32. * Corporate taxes result in higher prices for consumer goods, increases in interest rates, reduced employment at lower wages, and reduced levels of savings and investment, depending on whether corporations pass along the cost of taxation to the consumer, borrow to replace these funds, take steps to reduce labor costs, or reduce the return they offer to shareholders. Consequently, corporate taxes should be repealed.
33. * Most Americans who travel in Europe know no language other than English. All Americans who travel in Europe are affluent. Thus, most affluent Americans know no language other than English.
34. * The President didn't know that several of his subordinates had started "the company within the company", a small, highly secret group within the CIA. All of the President's subordinates belong to the President's political party. It follows that the President didn't know that several people of his own political party started a secret group within the CIA.
35. * Few mathematics students take courses in logic. All accounting majors take courses in logic. So few accounting majors are students of mathematics.
36. * Almost all Asian nations have socialist or statist or otherwise centralized economies. All our allies in Eastern Asia are, of course, Asian nations. So most of our Eastern Asian allies have centralized economies.
37. ** By 1988, the capital of Israel will be either Tel Aviv or Jerusalem. Thus, if in 1988 the Israeli capital is not Jerusalem, it will be Tel Aviv.
38. ** Dogs are animals. Dogs bark. John owns a dog. So John owns an animal that barks.
39. ** Terry's mother gave her permission to go to the movies or to the park. Thus, Terry's mother gave her permission to go to the park.
40. ** The collapse of the Austro-Hungarian Empire at the end of the First World War caused the fragmentation and political divisions that led, ultimately, to an easy Soviet takeover of most of Eastern Europe at the end of World War II. Thus, if the end of World War I had not witnessed the collapse of Austria-Hungary, the Soviets would have found it more difficult to take over most of Eastern Europe after the Second World War.


1.3 Implication and Equivalence

A concept closely related to validity is implication. We might express the idea that an argument is valid by saying that its conclusion follows from its premises. Equivalently, we might say that its premises imply or entail its conclusion. At least part of what we mean, in either case, is that the truth of the premises guarantees the conclusion's truth. If the premises are true, the conclusion has to be true too. Implication, then, is very similar to validity. But validity is a property of arguments; implication is a relation between sentences and sets of sentences. A set of sentences implies a given sentence just in case the truth of that sentence is guaranteed by the truth of all the members of the set.³

Definition 1.3 A set of sentences S implies a sentence A if and only if it's impossible for every member of S to be true while A is false.

It should be clear from this definition that, if an argument is valid, the set consisting of its premises implies its conclusion. We can also speak of a single sentence implying another sentence.

Definition 1.4 A sentence A implies a sentence B if and only if it's impossible for A to be true while B is false.

One sentence implies another, that is, just in case the truth of the former guarantees the truth of the latter. In every circumstance in which the first is true, the second must be true as well. Consider these two pairs of sentences.

1. (a) Mary likes Chinese food, but Bill hates it. (b) Mary likes Chinese food.

2. (a) Susan is going to spend her summer in either Palo Alto or Pittsburgh. (b) Susan is going to spend her summer in Pittsburgh.

Sentence (1)a. implies (1)b. It's impossible to conceive of a situation in which it's true that Mary likes Chinese food, but Bill hates it, and false that Mary likes Chinese food. In such a circumstance, Mary would have to like and not like Chinese food; the sentence 'Mary likes Chinese food' would have to be both true and false at the same time. There are no such circumstances. No sentence can be both true and false at the same time. So the truth of (1)a. guarantees the truth of (1)b. Does the truth of (2)a. similarly guarantee the truth of (2)b.? Obviously, the answer is no. Imagine a world in which Susan is going to spend her summer in Palo Alto, never setting foot outside California. In this situation, (2)a. is true, but (2)b. is false. So (2)a. does not imply (2)b.

A sentence A implies a sentence B just in case B is true in all those possible circumstances in which A is true. B implies A, of course, just in case A is true in all those cases in which B is true. If A implies B and B implies A, then A and B must be true in exactly the same circumstances. In such a case, we say that A and B are equivalent.

Definition 1.5 A sentence A is equivalent to a sentence B if and only if it's impossible for A and B to disagree in truth value.

If A and B are equivalent, then they must be true in the same circumstances, and false in the same circumstances. There could be no situation in which one would be true while the other would be false. Thus, equivalence amounts to implication in both directions. A is equivalent to B just in case A implies B and B implies A. To make this more concrete, consider four more pairs of sentences:

³ Throughout this book, 'just in case' will be used as a synonym for 'if and only if.'


3. (a) No apples are oranges. (b) No oranges are apples.

4. (a) All apples are fruits. (b) All fruits are apples.

5. (a) The Senator is neither worried nor angry about the investigation. (b) The Senator is not worried about the investigation; he is not angry about the investigation.

6. (a) Professor Pinsk saw that no one left. (b) Professor Pinsk saw no one leave.

The sentences in (3) are equivalent. Any circumstance in which no apples are oranges is one in which no oranges are apples, and vice versa. Both sentences say that nothing is both an orange and an apple. In (4), however, the sentences are obviously not equivalent. All apples are fruits, so (4)a. is true. But not all fruits are apples, so (4)b. is false. The real world is thus a case in which these sentences disagree in truth value. Similarly, the sentences in (5) are equivalent; they are true in exactly the same circumstances. If the Senator is neither worried nor angry, then he is not worried, and he is not angry. Conversely, if he is not worried, and is not angry, then he is neither worried nor angry. The sentences in (6), however, are not equivalent. Imagine a case where a student left without being observed by the professor. In such a case, it could well be true that Professor Pinsk saw no one leave; it would nonetheless be false that the professor saw that no one left, since, in fact, someone did leave.
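For truth-functional examples, Definitions 1.3 through 1.5 can be tested by the same kind of truth-table search used for validity. The encoding below is a hypothetical sketch of our own, not notation from the text: sentences are represented as Python functions of an assignment, `implies` looks for a case in which A is true while B is false, and `equivalent` is implication in both directions, mirroring the discussion above.

```python
from itertools import product

def implies(a, b, letters):
    """A implies B iff no assignment makes A true and B false (Definition 1.4)."""
    assignments = (dict(zip(letters, vs))
                   for vs in product([True, False], repeat=len(letters)))
    return not any(a(v) and not b(v) for v in assignments)

def equivalent(a, b, letters):
    """A and B are equivalent iff each implies the other (Definition 1.5)."""
    return implies(a, b, letters) and implies(b, a, letters)

# Pair (1): 'Mary likes Chinese food and Bill hates it' implies 'Mary likes Chinese food'.
print(implies(lambda v: v["m"] and v["b"], lambda v: v["m"], ["m", "b"]))  # True
# Pair (2): 'Palo Alto or Pittsburgh' does not imply 'Pittsburgh'.
print(implies(lambda v: v["a"] or v["p"], lambda v: v["p"], ["a", "p"]))   # False
# As in pair (5): 'neither p nor q' is equivalent to 'not p and not q'.
print(equivalent(lambda v: not (v["p"] or v["q"]),
                 lambda v: not v["p"] and not v["q"], ["p", "q"]))         # True
```

The last check is the truth-functional core of example (5); pairs like (6), which turn on what someone saw, resist this treatment because their logical relations are not a matter of the connectives alone.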

Problems

Consider the sentences in each pair: are they equivalent? If not, does either sentence imply the other?

1. (a) Both Alan and Bob took their vacations in California. (b) Alan took his vacation in California.
2. (a) Vivian and Beth both majored in English in college. (b) Beth majored in English in college, and so did Vivian.
3. (a) Pittsburgh will face Dallas or New York in the championship game. (b) Either Pittsburgh will face Dallas in the championship game, or Pittsburgh will face New York.
4. (a) Hance will run for the governorship or the Senate. (b) Hance will run for the Senate.
5. (a) Mali will continue to experience severe food shortages throughout most of the countryside unless more roads are built. (b) More roads will be built in Mali.
6. (a) Neon and Xenon are inert. (b) Xenon and Neon are inert.
7. (a) Pluto or Uranus is now directly aligned with Neptune. (b) Pluto and Uranus are now directly aligned with Neptune.
8. (a) Not both whales and dolphins are fish. (b) Whales are not fish, and dolphins aren't either.
9. (a) Either Sam or Peter failed to give the play an appropriate sense of place. (b) Peter and Sam did not both give the play an appropriate sense of place.
10. (a) Columbia and Universal cannot both be the year's most successful studio. (b) Neither Columbia nor Universal is the year's most successful studio.


11. (a) Aunt Alice will not come to the wedding, and neither will Uncle Harry. (b) Not both Uncle Harry and Aunt Alice will come to the wedding.
12. (a) Either the Babylonians or the Assyrians employed the lex talionis. (b) If the Assyrians employed the lex talionis, the Babylonians didn't.
13. (a) Either the physical world really exists, independently of our minds, or our senses systematically deceive us. (b) If our senses systematically deceive us, then the physical world doesn't really exist independently of our minds.
14. (a) If pay-per-view television catches on, cable companies will make huge profits. (b) If pay-per-view TV doesn't catch on, cable companies will not make huge profits.
15. (a) Universities will continue to grow only if they find new markets for their services. (b) If universities do not find new markets for their services, they will not continue to grow.
16. (a) If Elizabeth did not sign this letter, then her assistant did. (b) If Elizabeth had not signed this letter, her assistant would have.
17. (a) If Caesar had not crossed the Rubicon, he would never have become Consul. (b) If Caesar had become Consul, he would have crossed the Rubicon.
18. (a) No high-paying job is easy. (b) No easy job is high-paying.
19. (a) Some small law firms have specialists in municipal bonds. (b) Some law firms who have specialists in municipal bonds are small.
20. (a) All corporations primarily in the metals business are looking to diversify. (b) All corporations looking to diversify are primarily in the metals business.
21. (a) Most foods that are high in carbohydrates are high in calories. (b) Most foods that are high in calories are high in carbohydrates.
22. (a) At least three of my friends own cats. (b) At least three people who own cats are my friends.
23. (a) At most five doctors in the United States can perform that operation. (b) At most five doctors who can perform that operation are in the United States.
24. (a) The President is a Republican. (b) The Republican is a President.
25. (a) Several cities with populations over 700,000 have no baseball franchises. (b) Several cities without baseball franchises have populations over 700,000.
26. (a) A European country bordering on the Adriatic has close ties with China. (b) A European country having close ties with China borders on the Adriatic.
27. (a) Anybody who can speak effectively can find a job in sales. (b) Anybody who can find a job in sales can speak effectively.
28. (a) Either the Federal Reserve Board or foreign investments will increase the supply of capital. (b) If the Federal Reserve Board doesn't increase the supply of capital, foreign investments will.
29. (a) Fred knows that Jupiter is closer to the sun than Saturn. (b) Jupiter is closer to the sun than Saturn, and Fred knows it.


30. (a) Meg thinks that Oswald did not shoot Kennedy. (b) Meg realizes that Oswald did not shoot Kennedy.
31. (a) It wasn't necessary for things to turn out as they did. (b) Things could have turned out differently.
32. ** (a) Many films that make a lot of money are tailored to the teenage audience. (b) Many films tailored to the teenage audience make a lot of money.
33. ** (a) Our soil tests show that we will strike either oil or gas. (b) Our soil tests show that we will strike oil unless we strike gas.
34. ** (a) The tachyon is a particle that either travels backwards in time or travels faster than the speed of light. (b) The tachyon is a particle that travels backwards in time unless it travels faster than the speed of light.
35. ** (a) It's not true that Donna will come to the party but won't enjoy herself. (b) If Donna comes to the party, she'll enjoy herself.
36. ** (a) Few who read Hemingway write like Melville. (b) Few who write like Melville read Hemingway.
37. ** (a) Good wine isn't inexpensive. (b) Inexpensive wine isn't good.
38. ** (a) Even Ralph found your comments offensive. (b) Ralph found your comments offensive.
39. ** (a) We have some excellent redfish today, if you would like some. (b) We have some excellent redfish today.
40. ** (a) If historians revise their analysis of the impact of refugees from Weimar Germany on American intellectual history, they will revise their entire conception of that history. (b) Historians will revise their analysis of the impact of refugees from Weimar Germany on American intellectual history only if they revise their entire conception of that history.

Consider the statement: If a fetus is a person, it has a right to life. Which of the following sentences follow from this? Which imply it?

41. A fetus is a person.
42. If a fetus has a right to life, then it's a person.
43. A fetus has a right to life only if it's a person.
44. A fetus is a person only if it has a right to life.
45. If a fetus isn't a person, it doesn't have a right to life.
46. If a fetus doesn't have a right to life, it isn't a person.
47. A fetus has a right to life.
48. A fetus isn't a person only if it doesn't have a right to life.
49. A fetus doesn't have a right to life only if it isn't a person.
50. A fetus doesn't have a right to life unless it's a person.


51. A fetus isn't a person unless it has a right to life.
52. A fetus is a person unless it doesn't have a right to life.
53. A fetus has a right to life unless it isn't a person.

Consider the statement: The patient will die unless we operate immediately. What follows from this, together with the information listed?

54. The patient will die.
55. The patient will not die.
56. We will operate immediately.
57. We won't operate immediately.

In a quotation at the beginning of this chapter, Mr. McClory introduces a motion saying: There will be a ten day postponement unless the President fails to give his assurance to produce the White House tapes. Throughout the subsequent debate, congressmen offer paraphrases of this motion which they believe better express Mr. McClory's intentions, and Mr. McClory also expresses his motion in other terms which he takes to be equivalent to his original. Which of these paraphrases are in fact equivalent to the original motion?

58. If the President fails to give assurance, there is a ten day postponement. (Mr. Latta proposes this as equivalent.)
59. If the President gives his assurance, there is no postponement. (Mr. Latta proposes this as equivalent.)
60. There is a ten day postponement provided that the President gives his assurance. (Mr. Latta suggests this as a nonequivalent revision.)
61. There is a ten day postponement unless the President gives his assurance. (Mr. Mann suggests this as a nonequivalent revision.)
62. There is no postponement if the President fails to give his assurance. (Mr. McClory proposes this as equivalent.)

1.4 Logical Properties of Sentences

Logic deals primarily with the logical connections between sentences. Nevertheless, it also classifies individual sentences. The overwhelming majority of sentences we use could, depending on what the facts are, be either true or false. It's possible to conceive of cases in which they would be true, and other cases in which they would be false. For instance, each of the following sentences would be true in some circumstances and false in others:

7. The snow is falling all over Ireland.
8. The King recognized that some of the nobles would oppose him.

Basic Concepts of Logic


9. The earth is the third planet from the sun.
10. Francis Bacon, not Shakespeare, wrote The Merchant of Venice.

Such sentences are contingent:

Definition 1.6 A sentence is contingent if and only if it's possible for it to be true and possible for it to be false.

Contingent sentences could be true, given the right set of circumstances. Of course, they could also be false, depending on the facts of the situation. They are immensely useful precisely because they assert, in effect, that the real circumstance is among those in which they are true. Some sentences, in contrast, cannot help being true. It's simply impossible for them to be false. They are true in every possible circumstance. Such sentences are valid, or logically true:

Definition 1.7 A sentence is valid (or logically true) if and only if it's impossible for it to be false.

If you doubt that there are any sentences that cannot be false, no matter what the facts may be, then try to imagine circumstances in which these sentences are false.

11. Either Lima is in Ecuador or it's not.
12. A rose is a rose.
13. Wherever you go, there you are.
14. It ain't over 'til it's over.
15. When you're hot, you're hot.
16. Either some of my friends are crazy, or none of them are.

These sentences are true in every possible world. They also seem to say very little. But not all valid sentences are so straightforward and unsurprising. (17), for example, is logically true:

17. If everyone loves a lover, and Sam doesn't love Jeanne, then Jeanne doesn't love Greg.

But it doesn't seem as trivial as (12)-(16). Notice, furthermore, that even those sentences can be useful. Sometimes they set up the structure of an argument, as when a mathematician begins a proof by saying, "the number n is either prime or not prime. If it is prime. . . ." At other times, they serve a function in discourse by forcing the listener to interpret certain terms as ambiguous.
We normally assume that a speaker is making a good faith effort to communicate information. So, when Yogi Berra said "It ain't over 'til it's over," he presumably meant something like "it ain't over 'til it's really over," that is, "the outcome isn't fully determined until the game ends." So interpreted, the sentence isn't valid at all, but contingent. Some sentences, furthermore, could never be true. They are false, regardless of the facts. These sentences are contradictory (or contradictions).

Definition 1.8 A sentence is contradictory if and only if it's impossible for it to be true.

Here are some examples of contradictions:

18. Fred is both bald and not bald.


19. Sheila is irritated, and she's not.
20. This set belongs to itself if and only if it doesn't belong to itself.
21. Nobody's seen the trouble I've seen.

In no conceivable circumstance could any of these sentences be literally true. Try, for example, to imagine a situation in which Fred is both bald and not bald at the same time. Whatever the state of Fred's scalp, he's either bald, or not bald, but not both. No matter what Sheila's state of mind may be, she is either irritated or not. (Of course, (19) might be used to suggest that she is irritated at one thing but not at another. On that interpretation, (19) is contingent.) Similarly, the set in question must belong to itself or not. And, since I've seen the trouble I've seen, somebody (namely, me) has indeed seen the trouble I've seen. Like logical truths, contradictions tend to signal that we should interpret some terms generously, since we assume that our colleagues in communication are trying to say something that could be true. Hearing (21), then, we tend to read the Nobody as Nobody else, reading the sentence as a whole as if it were Nobody else has seen the trouble I've seen. Contradictions too may fulfill important functions in arguments. (20), for example, might be a crucial step in showing that the set under consideration can't exist. Nevertheless, contradictions are disruptive enough that it's worth having a term for sentences that, whether they are valid or contingent, at least are not contradictory. Such noncontradictory sentences are satisfiable.

Definition 1.9 A sentence is satisfiable if and only if it's not contradictory.

Obviously, a sentence is satisfiable just in case it's either contingent or valid. That is, it must be possible for the sentence to be true. Since every sentence is either valid, contingent or contradictory, the terms introduced in this section divide sentences into three groups, as shown in this diagram:

Sentences
    Satisfiable (true in some circumstances)
        Valid: true in every circumstance
        Contingent: true in some circumstances, false in others
    Contradictory: false in every circumstance

We've assumed, throughout this chapter, that all sentences are either true or false, but not both. We've thus been treating sentences as the bearers of truth value. This is plausible only if we think of a sentence as used on a particular occasion, by a particular speaker, to a particular audience, in a given context. Without all this contextual information, we can't begin to say whether a sentence such as 'I love you' is true or false. The truth value of the sentence clearly depends on who 'I' and 'you' refer to, when the sentence is uttered, etc. It would be better, therefore, to speak of utterances of sentences as true or false, rather than sentences themselves, or to speak of sentences as having truth values only relative to a certain context of use. Some sentences, such as 'I am here now', are true whenever they are uttered, but are nevertheless not valid. No matter who is speaking, or when or where the utterance takes place, 'I am here now' is true. But that does not make it a necessary truth. Suppose, for example, that I utter the sentence as I write this line. Supplying information from the context, that utterance has the force of saying that Daniel Bonevac is in Austin on May 7, 1999. But this is not necessarily true; it's easy to imagine possible circumstances in which I am in Istanbul or, less fancifully, in Dripping Springs. The role of context in language is critically important, and should not be forgotten. Nevertheless, very little in the following pages will involve context directly. We'll continue, then, to speak of sentences as having truth values, trusting ourselves to remember that these values are relative to context.
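The three-way classification of this section can be checked mechanically once a sentence is represented as a truth function of its atomic parts; the formal machinery for that arrives in Chapter 2, but here is a minimal preview sketch in Python. The function name `classify` and the encodings are my own illustration, not the text's:

```python
from itertools import product

def classify(formula, num_atoms):
    """Classify a formula, given as a Boolean function of its atomic
    sentences, by checking its value under every truth assignment."""
    values = [formula(*assignment)
              for assignment in product([True, False], repeat=num_atoms)]
    if all(values):
        return "valid"          # true in every circumstance
    if not any(values):
        return "contradictory"  # false in every circumstance
    return "contingent"         # true in some circumstances, false in others

# (11) 'Either Lima is in Ecuador or it's not': p or not p
print(classify(lambda p: p or not p, 1))    # valid
# (18) 'Fred is both bald and not bald': p and not p
print(classify(lambda p: p and not p, 1))   # contradictory
# (7) 'The snow is falling all over Ireland': a bare atomic sentence
print(classify(lambda p: p, 1))             # contingent
```

Note that this brute-force check only works for sentences whose logical structure is sentential; examples such as (17) involve quantifiers and need the machinery of later chapters.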


Problems

Classify these sentences as valid, contradictory or contingent.

1. I am who I am.
2. All dogs are dogs.
3. Some dogs are not dogs.
4. Some cars are red.
5. All red automobiles are automobiles.
6. All red automobiles are red.
7. Every German car is a car.
8. Every German car is a German.
9. I know what I know.
10. Some people are friendly, and some aren't.
11. Some people are friendly and not friendly.
12. Some people aren't friendly, but everybody is friendly.
13. There are many trees in Yosemite National Park.
14. Every student studies.
15. Nobody loves everybody.
16. Everyone who loves everyone loves every loser.
17. Everyone who loves every loser loves everyone.
18. Everyone who drives a Mercedes drives a car.
19. Not everyone who drives a car drives a Mercedes.
20. * If what you say is true, then it's false.
21. * Nobody can defeat everyone without being defeated at least once.
22. * Today is the first day of the rest of your life.
23. * No batter ever made a hit with the bat on his shoulder. (John McGraw)
24. * You are what you eat. (Ludwig Feuerbach)
25. * Everything is what it is, and not another thing. (Plato)
26. * The business of America is business. (Calvin Coolidge)
27. * There are two kinds of people in the world: those who divide the world into two kinds of people, and those who don't. (H. L. Mencken)
28. * I am never less alone than when I am alone, nor less at leisure than when I am at leisure. (Scipio Africanus)
29. * There comes a time to put principle aside and do what's right. (Michigan legislator)
30. * I don't know what the previous speaker said, but I agree with him. (Texas legislator)
31. ** this poem is the reader and the reader this poem (Ishmael Reed)
32. * I exist.
33. * Some dogs are dogs.
34. * No dogs are dogs.
35. * Most dogs are dogs.
36. * Many dogs are dogs.
37. * All former Congressmen are Congressmen.
38. * Some fake diamonds are diamonds.
39. * John likes baseball, but hates all sports.
40. * Alice taught Sarah some chemistry, but Sarah learned no chemistry from Alice.
41. * Say that a sentence A implies another sentence B. What can we conclude about B, if A is (a) valid? (b) contingent? (c) satisfiable? (d) contradictory?

42. * Say that a sentence A implies another sentence B. What can we conclude about A, if B is (a) valid? (b) contingent? (c) satisfiable? (d) contradictory?

The fourteenth century logician Pseudo-Scot (so-called because his writings, for many years, were attributed to John Duns Scotus) raised several objections to definitions of validity such as that of this section. The following two arguments, he thought, showed that there was a problem with saying that an argument is valid if and only if it's impossible for its premises to be true while its conclusion is false. Do these arguments really pose a problem for such definitions of validity? Explain.

43. ** Every sentence is affirmative. ∴ No sentence is negative.

(Note: assume that no sentence can be both affirmative and negative. The argument seems valid. But, Pseudo-Scot argued, even though the premise could be true, the conclusion can't be; it refutes itself, since it, itself, is negative.)

44. ** God exists. ∴ This argument is not valid.

(Note: Pseudo-Scot assumes that the premise is necessarily true. Any necessary truth would serve here in place of God exists. So, if the argument is valid, the conclusion must be true; but then the argument isn't valid, contradicting the hypothesis that it is valid. So suppose the argument is not valid. Then the conclusion is true; in fact, it must be necessarily true. In that case, we have an invalid argument in which it cannot happen that the premise is true while the conclusion is false, because the conclusion can never be false.)

1.5 Satisfiability

A sentence is satisfiable just in case it is not contradictory; that is, just in case it can be true. Any true sentence, obviously, is satisfiable. But false sentences can also be satisfiable, so long as they are true in some other possible circumstance. We can speak of sets of sentences, too, as satisfiable or contradictory. It's easy to think of sets of sentences that, in some sense, contain contradictions, even though each sentence in the set is itself satisfiable:

22. (a) Beer and sauerkraut are very good together. (b) Beer and sauerkraut aren't very good together.

23. (a) Many of my friends belong to the Flat Earth Society. (b) Nobody in the Flat Earth Society believes in modern science. (c) All my friends believe in modern science.

The sentences in (22), like those in (23), are not themselves contradictions. Taken individually, each could be true. Taken together, however, they describe an impossible situation. Though each could be true, the sentences in (22) or (23) couldn't be true together. In such cases, the set of sentences is contradictory, whether or not any individual sentence in the set is itself contradictory.

Definition 1.10 A set of sentences is contradictory if and only if it's impossible for all its members to be true. A set is satisfiable otherwise.

If a set is contradictory, we can also say that its members are mutually inconsistent, and that any member contradicts, or is inconsistent with, the set containing all the rest. If the set is satisfiable, then its members are mutually consistent, and each member is consistent or compatible with the set containing all the rest. Two sentences contradict each other just in case the set containing just the two of them is contradictory. If a set is satisfiable, then all its subsets are satisfiable: each member is consistent or compatible with each other member of the set. From a logical point of view, contradictory sets of sentences can be described in two ways. First, the sentences in the set can't all be true at the same time. Second, the set implies a contradiction. Although a contradictory set of sentences might not contain a contradiction, it must imply one. The sentences in (23), for example, together imply 'Although many of my friends don't believe in modern science, all my friends do believe in modern science.' This is an outright contradiction. To see that these two characterizations come to the same thing, recall that a set S of sentences implies a sentence A just in case it's impossible for every sentence in S to be true while A is false. Contradictions, of course, are always false. When A is a contradiction, then, this amounts to the following: S implies A if and only if it's impossible for every sentence in S to be true. Therefore, a set of sentences implies a contradiction just in case it is itself contradictory. Or, to put it another way, satisfiability is freedom from contradiction. Satisfiability is important: sets of sentences that are not satisfiable don't have a fighting chance at truth. They must contain at least one false sentence, no matter what the facts might be.
A satisfiable set may also contain false sentences, but at least there is a possibility that all the sentences it contains are true. This explains the significance of satisfiability in legal contexts. A lawyer may try to trap an opposing witness in a contradiction. The lawyer, in most cases, cannot alone provide any direct testimony relevant to the case. He or she may introduce witnesses of his or her own to dispute what the opposing witness says. If the opposing witness falls into a contradiction, however, then the witness must be saying something false, regardless of the facts of the case.
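For sets of sentences whose structure is purely sentential, Definition 1.10 can be tested by brute force: a set is satisfiable just in case some single assignment of truth values makes every member true at once. The Python sketch below is my own illustration; it encodes problem 1 of the exercises that follow, translating 'unless' as the chapters ahead suggest:

```python
from itertools import product

def satisfiable(sentences, num_atoms):
    """A set of sentences is satisfiable iff some one assignment
    makes every member of the set true simultaneously."""
    return any(all(s(*a) for s in sentences)
               for a in product([True, False], repeat=num_atoms))

# 'The yard isn't white unless it's snowing. It's not snowing.
#  But the yard is white.'  Atoms: w = yard is white, s = it's snowing.
group = [lambda w, s: (not w) or s,   # the yard isn't white unless it's snowing
         lambda w, s: not s,          # it's not snowing
         lambda w, s: w]              # the yard is white

print(satisfiable(group, 2))                        # False: contradictory
print(all(satisfiable([s], 2) for s in group))      # True: each member alone is satisfiable
```

This makes vivid the point of examples (22) and (23): every member of the set is individually satisfiable, yet the set as a whole is contradictory.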


Even more fundamentally, people often use arguments to disprove someone else's contention. To refute an assertion, we have to recognize when we have shown something that contradicts that assertion. So the notion of refutation depends on the notion of contradiction. Finally, the concept of satisfiability has been very important in modern mathematics. Around the turn of the century, several mathematicians and logicians deduced contradictions from mathematical theories in use at the time. Ever since, mathematicians have been extremely cautious about the satisfiability of their theories, and have sought, whenever possible, proofs that theories are satisfiable. This concern has led to some of the most important developments in twentieth-century logic, mathematics and computer science.

Problems

Evaluate these sets of sentences as satisfiable or contradictory.

1. The yard isn't white unless it's snowing. It's not snowing. But the yard is white.
2. If a student's GPA is very high, he or she will get into a good graduate school. Frank's GPA is not very high. Nevertheless, he'll get into a good graduate school.
3. John is a good guitarist. John is also an accountant, but not a good one.
4. Everyone who can cook a good chicken kung pao knows the value of hot peppers. Some who know the value of hot peppers don't themselves like hot food. Anybody who can cook a good chicken kung pao likes hot food.
5. If Marsha takes a job with a state commission, she'll gain much experience in new areas, although she won't get to travel. If she takes a job with a private company, she'll get to travel, and she'll be paid well, although she won't gain much experience outside her area. Marsha won't be paid well, but she will get to travel.
6. If the Court's decision here is consistent with the decision in Yick Wo v. California, it will hold that statistical arguments alone can suffice to establish discrimination. If, however, it is compatible with the decision in several recent cases, it will hold that establishing discrimination requires something beyond purely statistical argumentation.
7. I like this painting, even though I don't think it's very good. I like everything that Elmer likes, and Elmer likes every painting that's good.
8. Many Indo-European languages are descended from Latin. All languages descended from Latin developed a word for yes from the Latin sic (meaning thus). But few Indo-European languages have a word for yes developed from sic.
9. No drugs are approved for use without careful screening. Careful screening takes years. A few drugs in great demand, however, are approved for use in less time.
10. Few communist parties in Europe seek to identify themselves with the Soviet party. Parties seeking to identify themselves with the Soviets have a difficult time becoming part of coalition governments. Almost all European communist parties find it difficult, however, to become part of coalition governments.
11. * Stocks of companies with high debt-equity ratios are fairly risky. If a stock is fairly risky, it must reward investors with better-than-average returns, or they will eschew the risk. But many stocks that fail to reward investors with better-than-average returns are those of companies with high debt-equity ratios.


12. * People have a right to life. Fetuses are not people. If something has a right to life, it is wrong to kill it. Abortion is the killing of a fetus. Abortion is wrong.
13. * Few contemporary composers write anything that could reasonably be called twelve-tone compositions. If so, then atonal music is defunct. But atonal principles of composition still exert some influence on contemporary composers. And nothing that still exerts influence is defunct.
14. * Many football stars never graduate from the colleges where they first become famous. Most of these colleges insist that almost all their football players receive degrees. These schools are telling the truth.
15. * Most actresses begin their careers as successful models. Every woman who begins her career as a successful model is very glamorous. Nevertheless, few actresses are very glamorous.
16. * Many well-known American novels deal with the character of a specific region of the country. Every well-known American novel, of course, portrays a certain conception of America itself. Nonetheless, many novels that portray a conception of America do not deal with any specific region of the country.
17. * My barber, who lives and works in town, shaves every man in town who doesn't shave himself. Furthermore, my barber doesn't shave anyone in town who does shave himself.
18. ** If 3 were an even number, then 4 would be odd. But, if 3 were even, then 6 would be even, and if 6 were even, then 4 would be even too.
19. ** If God exists, it's surely true that He exists necessarily. It's possible that God exists. It's also possible that He doesn't exist.
20. ** Because you promised, you ought to take your brother to the zoo. But, since you also have duties as club treasurer, you have an obligation to go to the club meeting. And you can't do both.

True or false? Explain.

21. If a set of sentences is satisfiable, no member of that set implies a contradictory sentence.
22. If no member of a set implies a contradictory sentence, that set is satisfiable.
23. Some satisfiable sets of sentences imply contradictions.
24. Some satisfiable sets of sentences imply no contingent sentences.
25. Every contradictory set of sentences implies every contradiction.
26. No satisfiable sets of formulas imply every sentence.
27. Some contradictory sets of sentences imply every sentence.
28. If A implies B, then the set consisting of A and B together is satisfiable.
29. If the set consisting of just A together with B is contradictory, then A implies that B is false.
30. Any argument with a contradictory set of premises is valid.
31. Arguments with satisfiable sets of premises have satisfiable conclusions.


32. Every satisfiable set of sentences contains at least one true sentence.
33. Every contradictory set of sentences contains at least one false sentence.
34. Any set consisting of all valid sentences is satisfiable.
35. Any set consisting of all contingent sentences is satisfiable.

The Englishman William of Ockham (1285-1349), perhaps the most influential philosopher and logician of the fourteenth century, recorded eleven rules of logic in a chapter of his Summa Totius Logicae. Ten of these use concepts we've already developed. Say whether each is true, given the definitions of this chapter, and explain why.

36. The false never follows from the true.
37. The true may follow from the false.
38. Whatever follows from the conclusion of a valid argument follows from its premises.
39. The conclusion of a valid argument follows from anything that implies the argument's premises.
40. Whatever is consistent with the premises of a valid argument is also consistent with the argument's conclusion.
41. Whatever is inconsistent with the conclusion of a valid argument is also inconsistent with the argument's premises.
42. The contingent does not follow from the valid.
43. The contradictory does not follow from the satisfiable.
44. Anything whatsoever follows from the contradictory.
45. The valid follows from anything whatsoever.

Chapter 2

Sentences

Sentential logic examines the relationships between sentences that pertain to reasoning. This chapter will focus on only a portion of that logic. First, attention will be restricted to sentences that are either true or false, such as

1. Nome is in Alaska.
2. No other university in the United States is as old as Harvard.
3. If Michael Jackson had been born a woman, he'd have wanted to marry himself.

We will ignore sentences such as

4. Ouch!
5. Shut the door.
6. Does anybody really know what time it is?

which cannot be true or false, though they may be classified as appropriate or inappropriate, wise or unwise, etc. Second, we'll develop a theory only of the connections between sentences that are truth-functional. These connections have a rather neat logical character. The truth values of compound sentences formed by way of them depend entirely on the truth values of the smaller sentences they connect. The result of forming these connections is thus completely predictable on the basis of the truth values of the smaller sentences alone.

2.1 The Language of Sentential Logic

Sentences, we've seen, can be parts of other sentences. Sentences that appear within other sentences are often called embedded or subordinate clauses. In this section we'll discuss a symbolic language designed to clarify the structure of some of these embedding mechanisms. The language of sentential logic does not treat complemented verbs (indeed, their logic is extremely complex), but it can represent many of the expressions that operate on one or more sentences to produce a new sentence.

Definition 2.1 An n-ary sentence connective is a word or phrase that forms a single, compound sentence from n component sentences.


Sentence connectives constitute the chief object of study for sentential logic. Some examples of singulary connectives are not, maybe, of course, possibly, necessarily, and (somewhat controversially) auxiliary verbs such as may, can, could, might, must, should, etc. Examples of binary connectives are and, but, however, although, if, or, unless, though, before and because. Some of these connectives behave very predictably, in a certain sense, while others are more complex. Valid arguments never lead us from truths to falsehoods; in every circumstance in which the premises are true, the conclusion is true as well. In trying to construct a theory of validity, therefore, it seems natural to focus on the concepts of truth and falsehood. When I say that some connectives are predictable, I mean that their effects on truth values are predictable on the basis of truth values. That is, predictable connectives form compound sentences whose truth values are a function of the truth values of their components. Predictable connectives are called truth-functional.

Definition 2.2 An n-ary sentence connective is truth-functional if and only if the truth values of the n component sentences always completely determine the truth value of the compound sentence formed by means of the connective.

To put this another way, if a connective is truth-functional, then compounds formed from it match in truth value whenever the truth values of their components match. Knowing whether the component sentences are true or false is all one needs to know the truth value of the compound. Consider, for instance, the sentence 'It's not snowing.' This contains the component 'It's snowing.' Suppose that this component is true. Then the compound is false; if it's true that it's snowing, it's false that it's not snowing. Suppose that the component is false. If it's false that it's snowing, then it's true that it's not snowing, so the compound is true. The truth value of 'It's snowing,' in other words, completely determines the truth value of 'It's not snowing.' Nothing about this depends on any special feature of the sentence 'It's snowing.' Any sentence in this role should produce much the same result. 'Not' is thus a truth-functional sentence connective. In contrast, consider 'after.' This connective is not truth-functional. To see why, consider a sentence such as 'George resigned after the commissioner stopped the trade.' This contains the components 'George resigned' and 'The commissioner stopped the trade.' Suppose that both these sentences are true. That does not determine whether the compound sentence is true. George, after all, may have resigned before the commissioner nixed the trade. Note that, if either or both of the component sentences are false, the compound must be false; if George didn't resign, then he didn't resign after the commissioner stopped the trade. If the commissioner didn't stop the trade, then did George resign after he stopped the trade? Here, matters are less clear, but the compound sentence, if not false, at least is not true, and is highly misleading. So the fact that the truth values of the components sometimes suffice to determine the compound's truth value isn't enough; they must suffice in every case if the connective is to be truth-functional. We can summarize these examples readily with the help of a small table. Consider the possible truth values of the sentences involved, and see if, in every case, the truth value of the compound is fully determined. This results in the tables:

It's snowing | It's not snowing
T | F
F | T


George resigned | The commissioner stopped the trade | George resigned after the commissioner stopped the trade
T | T | ?
T | F | F, or misleading
F | T | F
F | F | F

The letters T and F here abbreviate true and false. A connective is truth-functional just in case it's possible to fill in every entry under the compound sentence with a determinate truth value. If, as with 'after,' there is any row that can't be filled in with either T or F, then the connective is not truth-functional.
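The tabular test can be mechanized: a connective is truth-functional exactly when it can be represented as a function from component truth values to a truth value, so that every row of its table gets a determinate entry. A small Python sketch (my own illustration; the name `truth_table` is invented):

```python
from itertools import product

def truth_table(connective, arity):
    """Print one row per combination of component truth values, and
    return the column of truth values the connective yields."""
    rows = list(product([True, False], repeat=arity))
    column = [connective(*row) for row in rows]
    for row, out in zip(rows, column):
        print(" ".join("T" if v else "F" for v in row),
              "|", "T" if out else "F")
    return column

truth_table(lambda a: not a, 1)       # the table for 'not': T -> F, F -> T
truth_table(lambda a, b: a and b, 2)  # the table for 'and': T only in the T T row
# No such function exists for 'after': even when both components are
# true, the truth value of the compound remains undetermined.
```

The last comment is the point of the 'after' example: there is no way to write a function like these for 'after', because the T T row cannot be filled in.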

Problems

Are these connectives truth-functional? Why, or why not?

1. because
2. or
3. before
4. nevertheless
5. may
6. in order that
7. in spite of
8. regardless whether
9. implies
10. should
11. it's improbable that
12. it's a logical truth that
13. can
14. could
15. maybe
16. it's obvious that
17. it's surprising that
18. when
19. provided that
20. if


2.2 Truth Functions

We can think of sentence connectives as functions that take sentences as inputs and yield compound sentences as outputs. Truth-functional connectives yield outputs whose truth values depend solely on the truth values of the inputs. For each truth-functional connective, then, there is a corresponding function from truth values into truth values. They take as inputs the truth values of the component sentences and yield the truth value of the compound sentence. Such functions are called truth functions.

Definition 2.3 An n-ary truth function is a function taking n truth values as inputs and producing a truth value as output.

There are four singulary truth functions. Such a function takes a single truth value as input, and produces a truth value as output. There are only two candidates for inputs (truth and falsehood) and, similarly, only two candidates for outputs. One of the four singulary truth functions takes both truth and falsehood into truth; another takes both into falsehood. One takes each value into itself; another takes them into each other. That's all the possible singulary functions. There are 16 binary truth functions, and, in general, for each n, 2^(2^n) n-ary truth functions. Since we may make n as large as we like, there are infinitely many truth functions all together. How can we formulate a theory to describe this infinite array of functions? Luckily, it's not very difficult. I'll present a few commonly used truth functions. Any truth function at all can be defined in terms of them alone. In fact, as I'll explain, a single binary truth function suffices to define every truth function in this infinite collection. The first function I'll define is singulary. It is called negation, and I'll use the symbol ¬ to represent it. (This symbol has no name in the literature; I'll call it the hoe.) In this definition, A is any sentence or formula. (Formulas will correspond to sentences in our symbolic language.)

A | ¬A
T | F
F | T

Negation transforms the truth value of the component sentence into its opposite. We've already seen an English connective that has this effect: the logical particle not. Other English expressions having much the same impact are it is not the case that, it's false that, no, never, and, often, the prefixes un-, dis-, a-, im-, etc. The second function is binary. Called conjunction, I'll represent it with the ampersand, &.

A | B | (A & B)
T | T | T
T | F | F
F | T | F
F | F | F

A conjunction is true just in case both its components, called conjuncts, are true. English expressions functioning in this way include most grammatical conjunctions: and, both . . . and, but, though and although. The third function is also binary. Represented by ∨, the wedge, it's called disjunction.

A | B | (A ∨ B)
T | T | T
T | F | T
F | T | T
F | F | F

A disjunction is true just in case either of its components, called disjuncts, is true. English expressions corresponding to this function are or, either . . . or and unless. The fourth function, again binary, is represented by the arrow, →, and is called the conditional:

A | B | (A → B)
T | T | T
T | F | F
F | T | T
F | F | T

Here, for the first time, the order of the components makes a difference. The first component of a conditional is its antecedent; the second is its consequent. A conditional is true just in case it doesn't have a true antecedent and false consequent. English expressions having, roughly, the force of the conditional truth function are B if A, if A then B, A only if B, B so long as A, B provided that A, 13 assuming that A, B on the condition that A, A is a sufficient condition for B, and 13 is a necessary condition for A. Finally, the biconditional is a binary truth function, symbolized by ~ ·

A  B   (A ↔ B)
T  T      T
T  F      F
F  T      F
F  F      T

Biconditionals are true just in case their components agree in truth value. English expressions such as if and only if, when and only when, is a necessary and sufficient condition for, and just in case correspond to the biconditional.

The symbols ¬, &, ∨, → and ↔ are in common use. Unfortunately, however, there is no standard logical notation. This table shows other symbols that have been used as logical connectives:

Truth Function    Our Symbol    Other Symbols
Negation          ¬P            −P, ~P, P′, P̄
Conjunction       P & Q         P ∧ Q, PQ, P · Q, KPQ
Disjunction       P ∨ Q         APQ
Conditional       P → Q         P ⊃ Q, CPQ
Biconditional     P ↔ Q         P ≡ Q, P ~ Q, EPQ
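Since truth functions take truth values to truth values, they can be sketched directly as Boolean functions. Here is a minimal illustration in Python (the function names are ours, chosen for readability; True and False play the roles of T and F):

```python
# The five truth functions defined above, as Python Boolean functions.

def negation(a):          # the hoe: flips the value
    return not a

def conjunction(a, b):    # the ampersand: true only when both inputs are
    return a and b

def disjunction(a, b):    # the wedge: true when either input is
    return a or b

def conditional(a, b):    # the arrow: false only for true antecedent, false consequent
    return (not a) or b

def biconditional(a, b):  # the double arrow: true when the values agree
    return a == b
```

For instance, conditional(False, False) returns True, matching the last row of the conditional's table.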

These five truth functions have enough expressive power to allow us to construct any truth function from them. Indeed, not even all five are necessary. Negation and conjunction alone suffice; so do negation and disjunction, or negation and the conditional. These sets of truth functions, {¬, &}, {¬, ∨}, and {¬, →}, are functionally complete. Two binary truth functions are functionally complete all by themselves. They are called Sheffer functions, after Henry Sheffer (1883-1964), a Harvard logician. (Actually, they were discovered by


Logic, Sets and Functions

the American philosopher Charles Sanders Peirce around 1880, but went unnoticed until Sheffer reproduced the discovery in 1913.) These functions are Sheffer's stroke (|) and nondisjunction (↓):

A  B   (A | B)   (A ↓ B)
T  T      F         F
T  F      T         F
F  T      T         F
F  F      T         T
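To see why a single Sheffer function suffices, one can check exhaustively that negation and conjunction (themselves a functionally complete set) are definable from the stroke alone. A sketch, under the standard definitions ¬A = (A | A) and (A & B) = ((A | B) | (A | B)):

```python
def stroke(a, b):
    # Sheffer's stroke (NAND): false only when both inputs are true
    return not (a and b)

def neg(a):
    # negation defined from the stroke: ¬A = (A | A)
    return stroke(a, a)

def conj(a, b):
    # conjunction defined from the stroke: (A & B) = ((A | B) | (A | B))
    return stroke(stroke(a, b), stroke(a, b))

# Exhaustive check over all truth-value inputs: the definitions match
# the negation and conjunction tables exactly.
for a in (True, False):
    assert neg(a) == (not a)
    for b in (True, False):
        assert conj(a, b) == (a and b)
```

Since {¬, &} is functionally complete, so is {|} by itself; the same exhaustive-check style works for nondisjunction.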

English correlates of Sheffer's stroke are not both . . . and, is incompatible with, and is inconsistent with. The chief correlate of nondisjunction is neither . . . nor. The existence of these functions is quite intriguing; logic gates equivalent to them (called NAND and NOR gates) have been extremely important in the development of computer logic design. Nevertheless, we will generally restrict our sentential language to the five connectives first presented.

The English correlates to truth functions are often only approximately equivalent in meaning to them. The correlates to negation express that function quite faithfully, except that English seems to admit some intermediate, "fuzzy" ranges, while the negation truth function does not. If Uncle Fred has only a small amount of hair on his head, for example, it may not be quite correct to say that he's bald, but it's not exactly false either. Cousin Fran may not really be happy these days, but she's not really unhappy either. And so on. Another problem arises with sentences involving attitudes, which sometimes seem to have presuppositions. If Roger Staubach never had to serve time in Huntsville prison, then neither 'Roger Staubach regrets that he had to serve time in Huntsville prison' nor 'Roger Staubach doesn't regret that he had to serve time in Huntsville prison' seems to be true. Similarly, if Yogi Berra has never thought much about Spinoza, then neither 'Yogi Berra believes that Spinoza was a great philosopher' nor 'Yogi Berra doesn't believe that Spinoza was a great philosopher' seems true.

The correlates of conjunction are very faithful to it, except that and sometimes seems to express a temporal ordering. There is a difference in meaning between 'Heidi got married and got pregnant' and 'Heidi got pregnant and got married', even though the definition of conjunction indicates that these should be true in exactly the same circumstances.
Also, English treats conjunction as a multigrade connective, that is, one that can take two, three, or more sentences and combine them into a single compound sentence. The logical & always links two sentences, but in English we can say 'John brought the mustard, Sally brought the pickles, and I brought the hot dogs.'

The correlates to disjunction are also quite close to the logical definition. Or, too, appears to be multigrade in English. The biggest surprise is that unless has the same meaning, logically, as or. Consider the sentence: 'The patient will die unless we operate.' This is equivalent to 'Either we operate, or the patient will die.' According to the definition, this should be true if we operate, and true if the patient dies. Yet the English sentence seems to assert that there is a connection, usually causal, between the operation and the patient's chances for survival. We wouldn't normally count the sentence true just because the operation was performed, or just because the patient died. In fact, a surgeon who said this when he knew that the patient was going to die in any case seems to be unethical.

This becomes even more acute with the conditional. At first it seems surprising that A only if B and If A, then B are both correlates of the conditional (A → B). But both these sentences seem equivalent to the earlier examples and to each other: 'If we don't operate, the patient will die' and 'The patient will survive only if we operate.' These, of course, are not quite the same: one is (¬A → B) while the other is (¬B → A). Switching the negations and the English connectives, furthermore, yields sentences that ought to be equivalent to these, logically speaking, but sound bizarre: 'If the patient survives, we operate' and 'We don't operate only if the patient dies.' The


first makes it sound as if we operate only when the patient is out of danger, and the second makes it sound as if we operate as long as we have a live body to operate on. The conditional doesn't capture these differences, and for the same reason that disjunction deviates from unless. Whatever follows the English words if and unless states a condition, usually causal, on the other component. So there is a strong feeling that whatever happens to make the condition true should happen before what happens to make the other component true. The then of if . . . then, in other words, has some temporal meaning, and our logical rendering of the conditional can't capture it. The same problem, of course, occurs doubly with the biconditional.

These problems with the truth-functional rendering of the conditional have been recognized since the third century B.C., when Philo of Megara (a classmate of Zeno, the founder of Stoicism) first proposed such an analysis. Diodorus, Philo's teacher, held that conditionals involve necessity, and Chrysippus, the third head of the Stoic school and widely acclaimed as the greatest logician of his time, held that the sort of necessity involved is specifically logical, as opposed, for instance, to causal necessity. The controversy among their followers became so intense that Callimachus wrote that "even the crows on the roofs caw about the nature of conditionals." Philo's treatment has the advantage of being truth-functional. It is also the weakest analysis that preserves the inference patterns that everyone agrees should characterize the conditional. Consequently, many other analyses of the conditional can be defined in terms of the Philonian conditional together with another connective, such as necessity. Finally, Philo's truth-functional construal of the conditional has proved valuable for expressing sentences such as 'All humans are mortal' in quantificational logic.

Problems

Explain why a connective in each of these sentences deviates somewhat from the truth function with which it is generally correlated.

1. Mary fell down and got up.
2. My heart rate approaches 200 if and only if I do heavy exercise.
3. Only if you come to me on your knees will I let you out of the contract.
4. Give me a place to stand, and I will move the earth. (Archimedes)
5. If the moon smiled, she would resemble you. (Sylvia Plath)
6. Why, if 'tis dancing you would be, there's brisker pipes than poetry. (A. E. Housman)
7. It is not the case that if God is dead, life is meaningless.
8. I'll leave only if you have somebody to take my place.
9. If Mike clears his throat once more, I'll strangle him.
10. Fame and rest are utter opposites. (Richard Steele)

2.3 A Sentential Language

Aristotle, the first logician, was also the first logician to use symbols. The Stoics made the practice common, using numbers to stand for sentences. Nevertheless, no one developed a fully symbolic logical language until the nineteenth century, when George Boole (1815-1864) saw that logic could profit from mathematical analysis. (His greatest work, in fact, he entitled The Mathematical Analysis of Logic.) In this section we'll present a symbolic language that permits ready evaluation of


arguments depending on sentence connectives. The syntax of this language is fairly simple. The vocabulary falls into three basic categories: sentence letters (or variables), connectives, and grouping indicators.

Vocabulary

Sentence letters: A, B, C, . . . , Z, with or without numerical subscripts
Connectives: ¬, &, ∨, →, and ↔
Grouping Indicators: ( and )

The syntactic rules that allow us to combine these elements to form meaningful strings of the language can be presented in several ways. First, we'll define the notion of a meaningful string, a formula, as logicians usually do. Then we'll present some phrase structure rules that allow us to see the similarity between the syntax of this artificial language and the syntax of a natural language such as English.

Formation Rules

Any sentence letter is a formula.
If A is a formula, then ¬A is a formula.
If A and B are formulas, then (A & B), (A ∨ B), (A → B) and (A ↔ B) are too.
Every formula can be constructed by a finite number of applications of these rules.

In presenting the above definitions and rules we've assumed that sentence connectives and other vocabulary items of our symbolic language are names of themselves. (A → B) thus refers to the formula resulting from concatenating a left parenthesis, the formula A, an occurrence of the conditional connective, the formula B, and a right parenthesis. Notice that we can't refer to this formula by using ordinary quotation: '(A → B)' signifies only the string of symbols quoted, i.e., a left parenthesis, followed by the calligraphic letter A, followed by an occurrence of the conditional, followed by the calligraphic letter B, followed by a right parenthesis. This is not a formula of our sentential language; calligraphic letters never appear as sentence letters. It is, instead, a part of the language we use to talk about the sentential language. We'll call calligraphic letters such as A and B schematic letters, and strings such as (A → B) schemata.

We will adopt three conventions for using our symbolic language. These conventions allow us to work with the language more easily. First, we'll count brackets, [ and ], and braces, { and }, as poorly drawn parentheses, letting them act as grouping indicators when that makes a formula more readable. Second, we'll allow ourselves to drop the outside parentheses of a formula, since they do no further work. One can imagine them drawn in so lightly that we can't see them. Thus, we'll count these pairs of symbol strings as equally acceptable as formulas, even though only those on the left are formulas, strictly speaking.
(P → Q)                     P → Q
(((P → Q) → P) → P)         {(P → Q) → P} → P

We'll close this section with two definitions that will prove extremely useful throughout the entire text. An occurrence of a connective forms a compound sentence from one or more component sentences. The connective occurrence itself, together with these components, constitutes the scope of the connective occurrence in a given formula. The connective occurrence with the largest scope is the main connective of the formula.

Definition 2.4 The scope of a connective occurrence in a formula is the connective occurrence itself, together with the components (and any grouping indicators) it links together in the formula.

Thus the scope of these connective occurrences, in these formulas, is:

¬  in  (P → ¬Q) :        ¬Q
¬  in  ¬(P → Q) :        ¬(P → Q)
∨  in  (P ∨ (Q & R)) :   (P ∨ (Q & R))
&  in  (P ∨ (Q & R)) :   (Q & R)

Definition 2.5 The main connective of a formula is the connective occurrence in the formula with the largest scope. The main connective is always the connective occurrence having the entire formula as its scope.

Problems

Classify each of these as (a) an official formula, (b) a conventional abbreviation of a formula, or (c) neither. If you answer (a) or (b), identify the main connective.

1. P & ¬Q

2. ¬P ∨ ¬Q

3. ¬(P & ¬Q)

4. (¬P ∨ ¬Q)

5. (&P ∨ &Q)

6. (P ∨ Q) → R

7. P ∨ Q → R

8. (P ∨ Q → R)

9. P ∨ (Q → R)

10. (P ∨ Q) → R)

11. ((P ∨ Q) ↔ A)

12. ¬((P ∨ (Q ∨ R)) & Q

13. ¬[P → (R → ¬Q) ↔ ¬P]

14. M → [R ↔ {K ∨ L}]

15. ¬((P → (Q → R)) ∨ P

16. The rules for forming formulas of sentential logic admit no syntactic ambiguity. How could these rules be altered to admit ambiguity? Give an example of a set of rules allowing syntactic ambiguity, and specify an ambiguous formula.
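The formation rules are recursive, so they translate directly into a recursive membership test. As a sketch in Python, using an encoding of our own rather than the book's official string notation: formulas are nested tuples, with sentence letters as strings, negations as ("~", A), and binary compounds as (op, A, B). This sidesteps parsing, which the official notation would require.

```python
BINARY = {"&", "v", "->", "<->"}  # ASCII stand-ins for &, ∨, →, ↔

def is_formula(f):
    # Rule 1: any sentence letter is a formula.
    if isinstance(f, str):
        return f[:1].isalpha() and f[:1].isupper()
    # Rule 2: if A is a formula, then ¬A is a formula.
    if isinstance(f, tuple) and len(f) == 2 and f[0] == "~":
        return is_formula(f[1])
    # Rule 3: if A and B are formulas, so are the four binary compounds.
    if isinstance(f, tuple) and len(f) == 3 and f[0] in BINARY:
        return is_formula(f[1]) and is_formula(f[2])
    # Closure: nothing else is a formula.
    return False

def main_connective(f):
    # In this tree encoding, the main connective is simply the
    # operator at the top of the tree; a bare letter has none.
    return None if isinstance(f, str) else f[0]
```

For example, is_formula(("->", ("v", "P", "Q"), "R")) is True and its main connective is "->", while a stray ("&",) fails the test, just as (&P ∨ &Q) fails the official rules.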


2.4 Translation

We can use the symbolic language we've constructed to help evaluate arguments in natural language only if we can translate from English into the symbolic language. Luckily, this is not usually very difficult. To translate an English discourse into symbolic notation:

• Identify sentence connectives; replace them with symbolic connectives.
• Identify atomic sentential components of the English sentences, and replace each distinct component with a distinct sentence letter.
• Use the structure of the English sentence to determine grouping.

Several factors complicate these steps. The first step, identify and replace sentence connectives, relies on the correlation we have outlined between truth functions and certain English sentence connectives. As we've noted, these correlations are not always exact. So some linguistic sensitivity needs to accompany this process.

Step two, identify and replace atomic sentences, is complicated chiefly by the fact that English arguments try not to be repetitive or dull. Rarely will an author use even a component sentence twice in the same passage without varying the wording of it. This means that deciding when two atomic sentence components are "distinct" or "the same" is no easy matter. Here, again, linguistic sensitivity is needed. Components should be judged "the same" only if they have basically the same meaning.

The third step, determining the grouping, is, in a sense, the trickiest of all. Our symbolic language avoids ambiguity by using parentheses as grouping indicators. English doesn't use parentheses in this way, and so natural language sentences are sometimes ambiguous. But English offers a number of devices for making grouping clear. One is the use of commas. The English sentence 'John will come and Fred will leave only if you sign' has no very clear grouping.
But, by placing a comma judiciously, we can make it clear that we intend and to be the main connective: 'John will come, and Fred will leave only if you sign.' Basically, a comma emphasizes a break in the sentence, stressing the presence of two different phrases being combined in the sentence. Commas therefore tend to suggest that the nearest connective has some priority.

Second, English offers coordinate phrases such as either . . . or, both . . . and, and if . . . then. These can make the grouping clear by identifying the components clearly. 'Either Bill brought Mary and Susan brought Sam, or Susan brought Bob' makes the intended grouping clear. The coordinated connective takes priority over any connective appearing within the coordinated sections. Note how much clearer the above sentence is than 'Bill brought Mary and Susan brought Sam or Susan brought Bob'.

Third, English allows a device that logicians have called "telescoping" and that linguists call "conjunction reduction". 'Susan brought Sam or Susan brought Bob', for example, can be "reduced" to the shorter sentence 'Susan brought Sam or Bob'. Similarly, 'Fred likes Wanda and Kim likes Wanda' can be reduced to 'Fred and Kim like Wanda'. This too can clarify grouping: we could group the last paragraph's sentence in another way by saying 'Bill brought Mary and Susan brought Sam or Bob'.

Problems

Translate each of these sentences into our symbolic language. If any translate only with difficulty, explain why.

1. Kindness is in our power, but fondness is not. (Samuel Johnson)
2. My father taught me to work, but not to love it. (Abraham Lincoln)


3. And April's in the west wind, and daffodils. (John Masefield)
4. But if you wisely invest in beauty, it will remain with you all the days of your life. (Frank Lloyd Wright)
5. If you get simple beauty and naught else, you get about the best thing God invents. (Robert Browning)
6. A man can hardly be said to have made a fortune if he does not know how to enjoy it. (Vauvenargues)
7. It is not poetry, if it make no appeal to our passions or our imagination. (Samuel Taylor Coleridge)
8. The forces of a capitalist society, if left unchecked, tend to make the rich richer and the poor poorer. (J. Nehru)
9. Honesty pays, but it don't seem to pay enough to suit a lot of people. (K. Hubbard)
10. If you don't get what you want, it is a sign either that you did not seriously want it, or that you tried to bargain over the price. (R. Kipling)
11. Honesty is praised, but it starves. (Latin proverb)
12. It does not do to leave a live dragon out of your calculations, if you live near him. (J. R. R. Tolkien)
13. I am an idealist. I don't know where I'm going but I'm on my way. (Carl Sandburg)
14. Size is not grandeur, and territory does not make a nation. (Thomas H. Huxley)
15. And the light shineth in the darkness; and the darkness comprehendeth it not. (John 1:5)
16. If you are patient in one moment of anger, you will escape 100 days of sorrow. (Chinese proverb)
17. It is only by painful effort, by grim energy and resolute courage, that we move on to better things. (Theodore Roosevelt)
18. If we are to preserve civilization, we must first remain civilized. (Louis St. Laurent)
19. Miracles sometimes occur, but one has to work terribly hard for them. (Chaim Weizmann)
20. Happiness is not the end of life, character is. (Henry Ward Beecher)
21. A sense of duty is useful in work but offensive in personal relations. (Bertrand Russell)
22. Loafing needs no explanation and is its own excuse. (Christopher Morley)
23. So then neither is he that planteth any thing, neither he that watereth; but God that giveth the increase. (I Corinthians 3:7)
24. In cases of difficulty and when hopes are small, the boldest counsels are the safest. (Livy)
25. The mind itself, like other things, must sometimes be unbent; or else it will be either weakened or broken. (Philip Sidney)
26. I could not love thee, dear, so much / Loved I not honor more. (Richard Lovelace)


27. Nothing will ever be attempted if all possible objections must first be overcome. (Jules W. Lederer)
28. A man doesn't need brilliance or genius; all he needs is energy. (Albert M. Greenfield)
29. It's not good enough that we do our best; sometimes we have to do what's required. (Winston Churchill)
30. Action does not always bring happiness, but there is no happiness without it. (Benjamin Disraeli)

2.5 Validity

Our tactic for analyzing English arguments and sentences is twofold. First, we will translate the sentences or arguments into our symbolic language. Second, we will evaluate the results for validity, satisfiability, etc. The clarity and convenience of the symbolic language pays big dividends whenever logical problems become at all complicated.

So far, we have set up our language and looked at ways of translating English into it. This solves only part of the problem, however. We want to be able to assess the validity, or satisfiability, or whatever, of the logical formulas we obtain through translation. But the notions of validity, etc., as we've defined them, make no sense when applied to formulas. To see why, consider a simple formula, P. Is this formula valid? That is, is it true in every possible world? It doesn't even make sense to speak of P as true or false in our world, since it is not a sentence with a determinate meaning, but simply a sentence letter. The same thing happens to arguments. Sequences of sentences, when translated into a symbolic language, become sequences of formulas, or argument forms.

Definition 2.6 An argument form is a finite sequence of formulas. The last is the conclusion

formula; the others are premise formulas.

Neither premise nor conclusion formulas can be true directly in any possible world. Consequently, we need to reformulate our basic semantic concepts for formulas. The first person to do this successfully was Bernard Bolzano (1781-1848), a professor at the University of Prague.

Formulas are not directly true or false, in this or any other possible world. They are, however, true or false under some interpretation. If we decide that P represents the sentence 'Frank Langella plays a good Dracula', then we can assess the truth value of this sentence, and, so, speak of P as either true or false under that interpretation. Clearly, P could represent either a true or a false sentence, so it should be a contingent formula. The formula P → P, on the other hand, will represent a truth, whether P stands for 'John is smiling', 'Bruce Wayne is Batman' or 'Ice cream and strawberries are a tasty combination'. This formula, then, should be valid: it comes out true no matter how it is interpreted.

In speaking of "interpretations" here, we mean two things. First, we can interpret a sentence letter by specifying that it represents a certain English sentence. Second, we can interpret it more directly, from the perspective of sentential logic, by simply assigning it truth or falsehood. All the analysis we are currently prepared to do uses nothing about a sentence beyond its truth value. So we can interpret a sentence letter, from the point of view of the logic of sentences, by saying whether the sentence letter is true or false. We can interpret a formula, in general, by saying whether the sentence letters it contains are true or false.

Definition 2.7 An interpretation of a sentence letter is an assignment of a truth value to it. An interpretation of a formula of sentential logic is an assignment of truth values to its sentence


letters. An interpretation of an argument form or set of formulas in sentential logic is an assignment of truth values to all the sentence letters in the argument form or set.

Now we can easily formulate definitions of validity and other semantic notions for formulas and argument forms. They correspond very closely to the definitions for sentences and arguments.

Definition 2.8 A formula is valid if and only if it's true on every interpretation of it. A formula is contradictory if and only if it's false on every interpretation of it, satisfiable if and only if it's true on at least one interpretation of it, and contingent if and only if it's neither valid nor contradictory.

Definition 2.9 An argument form is valid if and only if there is no interpretation of it making its premise formulas all true and its conclusion formula false.

Definition 2.10 A set of formulas S implies a formula A if and only if there is no interpretation of S together with A making every member of S true but A false.

Definition 2.11 Two formulas are equivalent if and only if they agree in truth value on every interpretation of them.

Definition 2.12 A set of formulas is satisfiable if and only if, on some interpretation of it, every member of the set comes out true; the set is contradictory otherwise.

These definitions are almost exactly the same as those for sentences and arguments, but with "possible worlds" replaced by "interpretations".

Though our method of translating sentences and arguments into a symbolic language is fundamental to modern logic, one must be aware of its limitations. If a translation is a good translation, and if the argument form, say, that results is valid, then the original argument must have been valid. But, even if the translation is good, an invalid argument form doesn't show that the original argument was invalid. Many English arguments depend on logical relationships that sentential logic doesn't capture.
An appropriate form for such an argument in sentential logic, then, may be invalid, even though the argument is valid. Because of the nature of our method, therefore, "valid" verdicts merit more trust than "invalid" verdicts. It's always possible that, to establish the validity of an argument, we would have to bring out bigger guns than are available in the logical theory we're working in.
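Because a formula has only finitely many sentence letters, it has only finitely many interpretations, so Definitions 2.8 and 2.9 can be checked mechanically. A sketch in Python, using an illustrative encoding of our own (not the book's notation): sentence letters are strings, negations are ("~", A), and binary compounds are (op, A, B) with ASCII operators:

```python
from itertools import product

def letters(f):
    # collect the sentence letters occurring in a formula
    if isinstance(f, str):
        return {f}
    return set().union(*[letters(part) for part in f[1:]])

def value(f, interp):
    # Definition 2.7: an interpretation assigns truth values to letters;
    # the connective tables then determine the value of the whole formula.
    if isinstance(f, str):
        return interp[f]
    if f[0] == "~":
        return not value(f[1], interp)
    a, b = value(f[1], interp), value(f[2], interp)
    return {"&": a and b, "v": a or b,
            "->": (not a) or b, "<->": a == b}[f[0]]

def interpretations(formulas):
    # every assignment of truth values to the letters involved
    ls = sorted(set().union(*[letters(f) for f in formulas]))
    for vals in product([True, False], repeat=len(ls)):
        yield dict(zip(ls, vals))

def valid(f):
    # Definition 2.8: true on every interpretation
    return all(value(f, i) for i in interpretations([f]))

def valid_form(premises, conclusion):
    # Definition 2.9: no interpretation makes all premises true
    # and the conclusion false
    return all(value(conclusion, i)
               for i in interpretations(premises + [conclusion])
               if all(value(p, i) for p in premises))
```

For instance, valid(("->", "P", "P")) is True, modus ponens passes valid_form, and the fallacy of affirming the consequent, valid_form([("->", "P", "Q"), "Q"], "P"), fails.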

Problems

Classify these sentences as true or false, and explain.

1. Valid argument forms having valid conclusion formulas are valid.
2. Some argument forms with contradictory premise formulas aren't valid.
3. There is a formula that implies every other formula.
4. There is a formula that is equivalent to every other formula.
5. No satisfiable sets of formulas imply every formula.
6. Any formula that follows from a satisfiable formula is satisfiable.
7. Any formula that implies a contingent formula is not valid.
8. Any formula that follows from a contingent formula is contingent.


9. Any formula that follows from a valid formula is valid.

10. Any formula that implies a valid formula is valid.
11. All contradictory formulas imply one another.
12. All contingent formulas imply one another.
13. All valid formulas imply one another.
14. If a set of formulas is satisfiable, no member of that set implies a contradictory formula.
15. If no member of a set of formulas implies a contradictory formula, that set is satisfiable.
16. If a set of formulas implies no contradictory formula, that set is satisfiable.
17. Some argument forms with contradictory formulas as conclusion formulas are valid.
18. No formula implies its own negation.
19. Any formula that implies its own negation is contradictory.
20. Any formula implied by its own negation is valid.

Bolzano calls an argument form exact or adequate if and only if (1) it is valid and (2) the conclusion formula does not follow from any proper subset of the premise formulas. (That is, omitting any premise formula or formulas makes the argument form invalid.) In such an argument form, the premise formulas exactly imply the conclusion formula. Are these assertions about Bolzano's concept true or false? Explain.

21. No exact argument form has a valid formula as conclusion.
22. Some exact argument forms have valid formulas as premises.
23. Some exact argument forms have contradictory conclusion formulas.
24. Some exact argument forms have contradictory premise formulas.
25. Some exact argument forms have contingent conclusions.
26. Every formula exactly implies itself.
27. Some formula exactly implies every formula.
28. Some formula is exactly implied by every formula.
29. Sets of contingent formulas exactly imply only contingent formulas.
30. No formula exactly implies its own negation.


2.6 Truth Tables

The language of sentential logic has the power to express the forms of very many English arguments. Merely expressing these arguments, however, is only a part of the task. To evaluate arguments in natural language, we must also find ways of evaluating argument forms as valid or invalid. The same point holds, of course, for sentences and sets of sentences. The strategy underlying the use of symbolic logic has two components: translation into a symbolic language, and evaluation of formulas, argument forms, etc., in that symbolic language. So far in this chapter we have discussed only the first component.

In the remaining three sections of this chapter, we will present a method for testing sentential argument forms for validity. The same method can also test for satisfiability, equivalence, implication, and validity of formulas. The method we're about to present, that of truth tables, is simple but powerful. Although it becomes cumbersome when dealing with complex problems, it can in principle answer any question we might raise about validity, implication, satisfiability, etc., within the bounds of sentential logic. To make this more precise: truth tables constitute a decision procedure for all the logical properties and relations we've discussed.

Definition 2.13 A decision procedure for a property P is a mechanical, infallible method for determining whether any given object has the property P. (Similarly, a decision procedure for a relation R is a mechanical, infallible method for determining whether any given objects stand in the relation R.)

The truth table technique is mechanical in the sense that no particular ingenuity is required to implement it. One needs only to follow the rules. In this respect, constructing a truth table is like dividing two numbers, or performing some other elementary arithmetical operation, but very unlike painting a portrait, writing a short story or getting a job. The truth table technique is also infallible: it always yields a correct answer after a finite period of time (at least, if it is constructed correctly). One can think of mechanical and infallible procedures in another way: these are the procedures that a computer can implement. Theoretically, a computer can perform any mechanical procedure. If the procedure is not infallible, however, the computer may run on forever without producing an answer. Practically speaking, therefore, mechanical and infallible procedures, often called effective procedures, are uniquely suited to computerization. Because there is a decision procedure for validity and the other basic semantical notions of sentential logic, the set of valid formulas, the set of satisfiable formulas, the set of valid argument forms, etc., are all said to be decidable.

Definition 2.14 A set is decidable if and only if there is a decision procedure for membership in the set.

To put this another way, a set is decidable just in case there is a mechanical, infallible test for determining whether any given object belongs to the set or not. Any finite set is decidable, because it's possible to list its members. To test a given object for membership, one can simply search through the list. This tactic isn't any help when the set is infinite, however. So only when we deal with infinite sets do we need to worry about decidability. Having a decision procedure for a set, or a theory, since a theory is a set of sentences together with all their logical consequences, tells us a great deal about that set or theory. The question whether a decision procedure exists for various theories and sets has thus taken on substantial importance in 20th-century logic. In fact, this question, when applied to a particular theory or set, has been given a special name: the decision problem for that theory or set.


Earlier in this chapter, we defined five connectives that form the heart of our symbolic language. These definitions themselves have the form of small truth tables:

A   ¬A
T    F
F    T

A  B   (A & B)   (A ∨ B)   (A → B)   (A ↔ B)
T  T      T         T         T         T
T  F      F         T         F         F
F  T      F         T         T         F
F  F      F         F         T         T
These definitions specify the values of complex formulas on the basis of the values of their components. It is fairly easy to see that this allows us to compute the truth value of a formula of any length, given the values of the atomic formulas, i.e., sentence letters, appearing in the formula. (Actually, in the history of logic, this was not so easy to see: Boole and Gottlob Frege, the German mathematician and logician who first invented quantification theory, both defined connectives using tables like those above in the nineteenth century. But not until around 1920 did Ludwig Wittgenstein, J. Lukasiewicz and Emil L. Post recognize, independently, that the tabular definitions suffice for a computation of the truth values of complex formulas.)

To take a simple example, suppose that P and Q are true, but R is false. What is the truth value of the formula ((P → Q) → R)? The main connective of this formula is the second conditional. The antecedent of this conditional is another conditional, both of whose components are true; the definition of the conditional indicates that this smaller conditional is true. The larger conditional, then, has a true antecedent, but its consequent, R, is false. The definition of the conditional tells us that the larger conditional is therefore false.

Approaching this problem more systematically, we might list the values of the sentence letters first, and then proceed to generate the values of subformulas until we reach the formula as a whole. The structure of the formula, then, would lead us to the following table, and, perhaps, to the more compressed table below it:

((P → Q) → R)            P  Q  R   (P → Q)   ((P → Q) → R)
    /      \             T  T  F      T            F
(P → Q)     R
  /   \
 P     Q

P  Q  R   ((P → Q) → R)
T  T  F     T  T  T  F  F

A boldface letter in the tables represents the truth value of the entire formula above it. Notice that, in constructing these tables, we begin with sentence letters. We then assign values to items one level up, and work our way, gradually, to the top level, where the formula itself appears. This has the effect that we work from inside parentheses out. Whenever two subformulas are equally "deep" inside parentheses, it makes no difference which we attack first. Our definitions thus allow us to compute the value of a formula, given some interpretation assigning values to its sentence letters. This is the central idea behind truth tables.
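The inside-out computation just described is easy to mechanize. The following Python sketch is our illustration, not the book's: the tuple encoding and the helper name `value` are assumptions made for the example. It represents formulas as nested tuples and evaluates them recursively, mirroring the bottom-up procedure.

```python
# Formulas as nested tuples: a sentence letter is a string, a negation is
# ('not', A), and a binary formula is (op, A, B) with op in and/or/if/iff.

def value(formula, interp):
    """Compute the truth value of a formula under an interpretation,
    given as a dict assigning True or False to each sentence letter."""
    if isinstance(formula, str):            # atomic case: look up the letter
        return interp[formula]
    op = formula[0]
    if op == 'not':
        return not value(formula[1], interp)
    left = value(formula[1], interp)        # work from inside parentheses out
    right = value(formula[2], interp)
    if op == 'and':
        return left and right
    if op == 'or':
        return left or right
    if op == 'if':                          # conditional: false only on T -> F
        return (not left) or right
    if op == 'iff':                         # biconditional: agreement in value
        return left == right
    raise ValueError('unknown connective: %r' % op)

# The worked example: ((P -> Q) -> R) with P and Q true, R false.
f = ('if', ('if', 'P', 'Q'), 'R')
print(value(f, {'P': True, 'Q': True, 'R': False}))   # False, as computed above
```

The recursion visits each subformula exactly once, so the cost is proportional to the length of the formula, just as the text's remark about formulas "of any length" suggests.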

Problems

Calculate the truth values of these formulas on the interpretations listed.

1. (¬P ∨ Q) ∨ (P ∨ ¬R) (P and Q false; R true)
2. (P → (Q ↔ R)) → (Q ∨ ¬R) (P and R true; Q false)
3. ((R&Q) → ¬P) & (¬Q ↔ P) (P true; Q and R false)
4. ¬((P → ¬Q) ∨ (Q ∨ ¬R)) (P false; Q and R true)
5. (P ↔ ¬R) → (R → Q) (P, Q and R true)
6. ¬(P ∨ Q) & ¬(P → R) (P, Q and R false)
7. (P ∨ (Q&R)) → (P&Q) (P and Q true; R false)
8. (P&Q) ↔ ¬R (P and R true; Q false)
9. ¬(¬(¬P & ¬Q) & ¬R) (P true; Q and R false)
10. ¬((Q ∨ ¬P) → ¬R) (P false; Q and R true)
11. (P ↔ ¬(Q ↔ R)) ↔ ¬P (P and Q false; R true)
12. (P ∨ ¬Q) ↔ (P ∨ ¬R) (P and R false; Q true)
13. (¬(P&R) → ¬Q) → ¬Q (P, Q and R false)
14. ¬((P → ¬Q) ↔ ¬R) (P, Q and R true)
15. (P → ¬R) ↔ ¬(P&Q) (P true; Q and R false)

2.7 Truth Tables for Formulas

Given an interpretation, we can now compute the truth value of a formula on that interpretation. This in itself is interesting only if we have some reason for singling out a particular interpretation, say, because we think it corresponds to reality, given some translation between English and our symbolic language. Nevertheless, it becomes far more interesting when we realize that the validity of a formula amounts to its truth on every interpretation of it. Though no individual interpretation may seem very intriguing, therefore, the set of all possible interpretations merits some attention. A truth table is, in essence, a computation of the truth value of a formula under each of its possible interpretations. If the formula is valid, then the outcome of the truth table technique will be a string of Ts. If it's contradictory, the result will be a string of Fs. If the formula is contingent, finally, the result will be a string containing both Ts and Fs. Truth tables amount to nothing more than several simple tables, of the kind we saw in the last section, done at once. But they allow us to evaluate any formula of sentential logic as valid, contradictory or contingent. A truth table for a formula consists of four elements: (1) a listing of the sentence letters of the formula; (2) the formula itself; (3) a list of all possible interpretations of the formula; and (4) a computation of the truth value of the formula on each interpretation. We will write truth tables in this configuration, which is very similar to the one we developed for evaluating a formula on a single interpretation:

Sentence letters          Formula
List of interpretations   Computation

The top of a truth table lists the formula to be evaluated, preceded by the sentence letters it contains. Under the sentence letters, the table lists all possible interpretations of them. If there is one sentence letter, then there are obviously only two interpretations. One assigns truth to the letter, and the other assigns falsehood. If there are two letters, then there are four interpretations; three letters have eight possible interpretations; and so on. In general, n sentence letters can be given 2ⁿ interpretations. There is a simple method for listing all these possibilities, and in a standard order: count backwards in base 2 from 2ⁿ − 1 to 0, replacing 0 and 1 with F and T, respectively. Alternatively, imagine an array of the following kind, which is based on the notion that the truth value of any sentence letter is independent, logically speaking, of the truth value of any other. The second letter, for instance, could be either true or false, whether the first letter happened to be true or happened to be false.

            T                       F
           / \                     / \
          T   F                   T   F
         / \ / \                 / \ / \
        T  F T  F               T  F T  F

(each new letter splits every branch in two, whatever values the letters above it have taken)

This table suggests that the listings of possible interpretations, for cases involving one, two, three or four sentence letters, should look like this:

T      T T      T T T      T T T T
F      T F      T T F      T T T F
       F T      T F T      T T F T
       F F      T F F      T T F F
                F T T      T F T T
                F T F      T F T F
                F F T      T F F T
                F F F      T F F F
                           F T T T
                           F T T F
                           F T F T
                           F T F F
                           F F T T
                           F F T F
                           F F F T
                           F F F F

If the formula contains more than four sentence letters, the table will have 32 or more rows, one for each possible interpretation of the formula. In such cases, the method works, but is so cumbersome that other techniques are probably better suited to the task. To perform the computation, do precisely what we did to compute the value of a formula on a single interpretation. It's often easiest to think vertically rather than horizontally in doing this. That is, we could simply do what we did in the last section 2ⁿ times, or we could think things through as we did in the last section but, where we filled in a value for a subformula there, fill in a column of values for that subformula on all interpretations. The latter strategy is generally more efficient.
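The standard ordering of interpretations can also be generated mechanically. Here is a possible Python sketch (the helper name `interpretations` is ours, not the book's); listing `True` before `False` in `itertools.product` yields exactly the count-backwards-in-base-2 order described above.

```python
from itertools import product

def interpretations(letters):
    """Yield all interpretations of the given sentence letters, as dicts,
    in the standard order: the first row is all T, the last all F."""
    for values in product([True, False], repeat=len(letters)):
        yield dict(zip(letters, values))

rows = list(interpretations(['P', 'Q']))
# The four interpretations of two letters, in order: TT, TF, FT, FF.
print([tuple(r.values()) for r in rows])
```

For n letters this produces 2ⁿ rows, which is why the method becomes unwieldy past four or five sentence letters.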


To break this down step-by-step: first, copy the interpretations for each letter under its occurrences in the formula. This gives us values for the bottom nodes of the formula's phrase structure tree. Then, begin searching for subformulas one level up from the bottom of the tree; look, that is, for subformulas as far inside parentheses as possible. Negations of single sentence letters, although they might not be at this level of the tree, are always safe as well, since the value of the negated letter depends on nothing but the value of the letter itself. To systematize this process, we can decide to compute values for negations of single sentence letters first. Then, compute values for subformulas, working from inside parentheses out. This brings us up the phrase structure tree until, finally, we reach the formula as a whole. The last computation should be for the main connective of the entire formula. Truth tables thus follow a "bottom-up" strategy; they begin with the letters at the bottom of the phrase structure tree and work up the tree to compute values for more complex units. Under each sentence letter and connective of the formula, a completed table will have a column of Ts and Fs. These represent the truth values the formula or subformula has under each interpretation of it. The column for the entire formula itself, the last to be filled in, is the table's final column. It specifies the truth value of the formula on each interpretation.

Definition 2.15 The final column of a truth table is the column under the main connective of the formula at the top of the table.

By now, it may be obvious how to use truth tables to evaluate formulas for validity, contradictoriness, etc. A valid formula is true on every interpretation of it; the final column of a table for it, therefore, should contain all Ts. A contradictory formula is false on every interpretation of it; the final column of a table for it, therefore, should contain all Fs. Satisfiable formulas, which are true on at least one interpretation of them, give rise to tables whose final columns contain at least one T. The final columns of tables for contingent formulas, finally, contain both Ts and Fs. To summarize:

Formula         Final Column
Valid           All Ts
Contradictory   All Fs
Contingent      Ts and Fs
Satisfiable     At least one T

It should now be clear how truth tables function as a decision procedure for the various logical properties of formulas. Constructing a table is mechanical, and tables always have a finite size.

2.8 Examples

To see how this works in practice, let's take an example. Is the formula (P ∨ (Q → P)) valid, contradictory, or contingent? It says something like "if Q is true, so is P, unless of course P is true anyway". This doesn't sound like either a logical truth or a contradiction, so a plausible guess would be that it's contingent. We can set up the table by listing the sentence letters in the formula, followed by the formula, and then listing all possible interpretations:

P  Q   (P ∨ (Q → P))
T  T
T  F
F  T
F  F

At this point, we copy the truth value assignments under the relevant letters in the formula:

P  Q   (P  ∨  (Q  →  P))
T  T    T      T      T
T  F    T      F      T
F  T    F      T      F
F  F    F      F      F


Now we begin moving up the "grammatical tree" (which linguists call the phrase structure tree), working from negations of single sentence letters (in this case, there aren't any) to other subformulas, from inside parentheses out. In this case, then, we begin with the conditional:

P  Q   (P  ∨  (Q  →  P))
T  T    T      T  T  T
T  F    T      F  T  T
F  T    F      T  F  F
F  F    F      F  T  F

Finally, we conclude by computing the value for the larger disjunction. Note that this gives us the final column; the ∨ is the main connective of the formula, so the truth values we are computing are those of the formula as a whole.

P  Q   (P  ∨  (Q  →  P))
T  T    T  T   T  T  T
T  F    T  T   F  T  T
F  T    F  F   T  F  F
F  F    F  T   F  T  F

The final column contains both Ts and Fs, so the formula is indeed contingent. Let's look at another, more complex example. The formula ¬((P → Q) ∨ (Q → R)) says, roughly, that the conditionals "if P, then Q" and "if Q, then R" are both false. This may sound as if it ought to be contingent also. To find out, we begin a truth table, writing the three sentence letters in the formula, followed by the formula itself, and then listing all possible interpretations, copying these down under the occurrences of the sentence letters in the formula:

P  Q  R   ¬ ((P  →  Q)  ∨  (Q  →  R))
T  T  T       T      T      T      T
T  T  F       T      T      T      F
T  F  T       T      F      F      T
T  F  F       T      F      F      F
F  T  T       F      T      T      T
F  T  F       F      T      T      F
F  F  T       F      F      F      T
F  F  F       F      F      F      F

Now, we begin to compute the values of subformulas, working our way up the phrase structure tree. There are no negations of single sentence letters (the negation here is, in fact, the main connective), so begin as far inside parentheses as possible, with the two conditionals. We can do these in either order, ending up with the table:

P  Q  R   ¬ ((P  →  Q)  ∨  (Q  →  R))
T  T  T       T  T  T      T  T  T
T  T  F       T  T  T      T  F  F
T  F  T       T  F  F      F  T  T
T  F  F       T  F  F      F  T  F
F  T  T       F  T  T      T  T  T
F  T  F       F  T  T      T  F  F
F  F  T       F  T  F      F  T  T
F  F  F       F  T  F      F  T  F

Computing the truth values of the two conditionals puts us in a position to calculate the values of the disjunction, which would be immediately above them on a phrase structure tree:


P  Q  R   ¬ ((P  →  Q)  ∨  (Q  →  R))
T  T  T       T  T  T   T   T  T  T
T  T  F       T  T  T   T   T  F  F
T  F  T       T  F  F   T   F  T  T
T  F  F       T  F  F   T   F  T  F
F  T  T       F  T  T   T   T  T  T
F  T  F       F  T  T   T   T  F  F
F  F  T       F  T  F   T   F  T  T
F  F  F       F  T  F   T   F  T  F

Notice that the disjunction, surprisingly, is valid; it is true on every interpretation. Finally, then, we use the definition of negation to calculate the values of the entire formula:

P  Q  R   ¬  ((P  →  Q)  ∨  (Q  →  R))
T  T  T   F     T  T  T   T   T  T  T
T  T  F   F     T  T  T   T   T  F  F
T  F  T   F     T  F  F   T   F  T  T
T  F  F   F     T  F  F   T   F  T  F
F  T  T   F     F  T  T   T   T  T  T
F  T  F   F     F  T  T   T   T  F  F
F  F  T   F     F  T  F   T   F  T  T
F  F  F   F     F  T  F   T   F  T  F

Since the formula is false on every row, that is, on every interpretation of it, it is contradictory.
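As a mechanical cross-check of this example, a few lines of Python (our illustration, not part of the text) can compute the final column of ¬((P → Q) ∨ (Q → R)) over all eight interpretations:

```python
from itertools import product

def implies(a, b):
    """The conditional: false only when the antecedent is true and the
    consequent false."""
    return (not a) or b

# Final column of ~((P -> Q) v (Q -> R)), in the standard row order.
final_column = [not (implies(p, q) or implies(q, r))
                for p, q, r in product([True, False], repeat=3)]
print(final_column.count(False))   # 8: false on every row, so contradictory
```

The inner disjunction is true on every row (as the table shows), so its negation is false on every row.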

Problems

Determine whether these formulas are valid, contradictory or contingent.

1. P
2. P & ¬P
3. P ∨ ¬P
4. ¬(¬P → P)
5. P → P
6. P → ¬P
7. P → (Q → P)
8. (¬P → P) → P
9. ¬P → (P → Q)
10. P → (Q&P)
11. (P → Q) ∨ (Q → P)
12. (P → Q) & ¬(Q → P)
13. (P → Q) → (Q → P)
14. (P ∨ Q) → (P&Q)
15. ((P & ¬Q) → Q) → ¬P

16. (P & ((P → Q) & (P → ¬Q)))
17. (P ∨ Q) ↔ ¬(¬P & ¬Q)
18. (P&Q) → (P → Q)
19. (Q ∨ R) → ((R → P) ∨ (Q → P))
20. (P&Q) → (P ∨ Q)
21. (P & (Q ∨ R)) ↔ ((P&Q) ∨ (P&R))
22. (P ∨ (Q&R)) ↔ ((P ∨ Q) & (P ∨ R))
23. (P → (Q ∨ R)) ↔ ((P → Q) ∨ (P → R))
24. (P ∨ (Q → R)) ↔ ((P ∨ Q) → (P ∨ R))
25. (P ↔ (Q → R)) ↔ ((P ↔ Q) → (P ↔ R))

2.9 Truth Tables for Argument Forms

Truth tables thus serve readily as decision procedures for the validity, contingency, and contradictoriness of formulas. They also function as decision procedures for argument forms. Recall that an argument form is valid just in case every interpretation making the premise formulas true makes the conclusion formula true as well. Truth tables let us calculate the truth values of formulas under each of their interpretations, so it is easy to set up a truth table to give us the information needed to evaluate an argument form. Actually, there are two ways of going about this. One might set up a table on which one computed the values of each of the premise formulas, and the conclusion formula, separately, and then learn to interpret these tables as evaluating argument forms. An argument form would be valid, of course, just in case no interpretation made the premise formulas all true but the conclusion formula false. Alternatively, one could find a single formula that "expressed" the content of the argument form, and evaluate it for validity. If the formula were set up correctly, then its validity would guarantee the validity of the argument form. In this section we'll present a method that uses the latter strategy. We'll first translate an argument form into a formula, the argument form's associated formula, and then use the technique of the last section to determine whether that formula is valid. If the associated formula is valid, the argument form itself will be valid. If the associated formula isn't, then neither is the argument form. The associated formula of an argument form is a conditional, with the conclusion formula as its consequent and the conjunction of the premise formulas as its antecedent. That is:

Definition 2.16 The associated formula of an argument form is the formula that results from (1) conjoining the premise formulas of the argument form, in order, and (2) linking that conjunction to the conclusion formula by means of a conditional.

Below are several argument forms, together with their associated formulas:

Argument Forms

(1) P               (2) P → Q            (3) (P ∨ Q) → ¬(Q&R)
    P → Q               Q ↔ R                Q
    ∴ Q                 ∴ (R&Q) → P          ∴ ¬R

Associated Formulas

1. (P & (P → Q)) → Q
2. ((P → Q) & (Q ↔ R)) → ((R&Q) → P)
3. [((P ∨ Q) → ¬(Q&R)) & Q] → ¬R

At this point it's easy to evaluate the argument forms for validity. An argument form is valid if and only if its associated formula is valid. So the associated formulas can be tested for validity, just as any other formula can be. This results in the tables:

P  Q   (P  &  (P  →  Q))  →  Q
T  T    T  T   T  T  T    T  T
T  F    T  F   T  F  F    T  F
F  T    F  F   F  T  T    T  T
F  F    F  F   F  T  F    T  F
(Valid)

P  Q  R   ((P  →  Q)  &  (Q  ↔  R))  →  ((R  &  Q)  →  P)
T  T  T     T  T  T   T   T  T  T    T    T  T  T   T  T
T  T  F     T  T  T   F   T  F  F    T    F  F  T   T  T
T  F  T     T  F  F   F   F  F  T    T    T  F  F   T  T
T  F  F     T  F  F   F   F  T  F    T    F  F  F   T  T
F  T  T     F  T  T   T   T  T  T    F    T  T  T   F  F
F  T  F     F  T  T   F   T  F  F    T    F  F  T   T  F
F  F  T     F  T  F   F   F  F  T    T    T  F  F   T  F
F  F  F     F  T  F   T   F  T  F    T    F  F  F   T  F
(Invalid)

P  Q  R   [((P  ∨  Q)  →  ¬  (Q  &  R))  &  Q]  →  ¬  R
T  T  T      T  T  T   F  F   T  T  T    F  T   T  F  T
T  T  F      T  T  T   T  T   T  F  F    T  T   T  T  F
T  F  T      T  T  F   T  T   F  F  T    F  F   T  F  T
T  F  F      T  T  F   T  T   F  F  F    F  F   T  T  F
F  T  T      F  T  T   F  F   T  T  T    F  T   T  F  T
F  T  F      F  T  T   T  T   T  F  F    T  T   T  T  F
F  F  T      F  F  F   T  T   F  F  T    F  F   T  F  T
F  F  F      F  F  F   T  T   F  F  F    F  F   T  T  F
(Valid)

The first and third tables have all Ts in their final columns, so they indicate that the associated formulas they are testing are valid. The validity of these formulas, furthermore, demonstrates the validity of the argument forms with which they are associated. The second table has both Ts and Fs in its final column. The associated formula and its corresponding argument form, therefore, are invalid. When, as in this case, the truth table shows an argument form to be invalid, it also specifies an interpretation making the premise formulas true and the conclusion formula false. The table, in other words, not only indicates that there is such an interpretation, but tells us what it is. If there are several, it specifies them all. To see under what circumstances the premise formulas would be true while the conclusion formula would be false, look at those rows of the table which have an F in the final column. In the example above, there is one such row: the row on which P is false and Q and R are both true. The table thus spells out an interpretation making the premise formulas true and the conclusion formula false, namely, the interpretation assigning falsehood to P and truth to Q and R.


On rows having an F in the final column, the associated formula comes out false. But a conditional comes out false only when the antecedent is true and the consequent false. That means that the associated formula is false only when the conclusion formula is false, and the antecedent formula, the conjunction of the premise formulas, is true. But a conjunction is true just in case all the conjuncts are true. So, on any interpretation making the associated formula false, the premise formulas must all be true, and the conclusion formula must be false. That is why the associated formula method works: the formula is valid when and only when the argument form itself is valid. In fact, Aristotle stated all the arguments and argument forms he discussed as conditionals; historically, then, the associated formulas preceded the argument forms with which they are associated. The Stoics were the first to connect the validity of inferences and the necessary truth of the associated conditionals explicitly.
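The associated-formula method is itself mechanical, so it too can be sketched in code. In this hypothetical Python helper (the name `argument_valid` and its interface are ours, not the book's), premises and conclusion are given as functions from interpretations to truth values; the helper looks for a row where the conjunction of the premises is true while the conclusion is false, exactly the rows where the associated conditional comes out false.

```python
from itertools import product

def argument_valid(letters, premises, conclusion):
    """An argument form is valid iff its associated formula, the conditional
    from the conjunction of the premises to the conclusion, is true on every
    interpretation of the sentence letters."""
    for row in product([True, False], repeat=len(letters)):
        i = dict(zip(letters, row))
        antecedent = all(p(i) for p in premises)   # conjunction of premises
        if antecedent and not conclusion(i):       # the conditional is false here
            return False
    return True

# Argument form (1): P; P -> Q; therefore Q.
print(argument_valid(['P', 'Q'],
                     [lambda i: i['P'], lambda i: (not i['P']) or i['Q']],
                     lambda i: i['Q']))            # True: the form is valid
```

Affirming the consequent (P → Q; Q; therefore P) fails this test, with the row P false, Q true as the counterexample the table would display.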

Problems

Translate these arguments into our symbolic language, and evaluate the corresponding argument forms as valid or invalid.

1. Sondra mailed the grant proposal I placed on her desk. So she either mailed the grant application or threw it away.
2. You are either a knave or a fool. You are a knave, so you're no fool.
3. If I'm right, then I'm a fool. But if I'm a fool, I'm not right. Therefore, I'm no fool.
4. If I'm right, then I'm a fool. But if I'm a fool, I'm not right. Therefore, I'm not right.
5. Unless I'm mistaken, I'm a fool. But if I am a fool, I must be mistaken. So I'm mistaken.
6. If Einstein's theory of relativity is correct, light bends in the vicinity of the sun. Light does indeed bend in the vicinity of the sun. It follows that Einstein's theory is correct.
7. The thief is in either Tewkesbury or Bristol. He is not in Tewkesbury; therefore he is in Bristol.
8. Aristotle was a brilliant thinker. If his theory of remembering is right, then modern psychological accounts are wrong. So either modern accounts of memory are wrong, or Aristotle wasn't so brilliant after all.
9. Jessica meows just in case she is hungry. She is meowing, but she isn't hungry. Therefore the end of the earth is at hand.
10. If Socrates died, he died either while he was living or while he was dead. But he did not die while living; moreover, he surely did not die while he was already dead. Hence, Socrates did not die.
11. Nothing can be conceived as greater than God. If God existed in our imaginations, but not in reality, then something would be conceivable as greater than God (namely, the same thing, except conceived as existing in reality). Therefore, if God exists in our imaginations, He exists in reality.
12. You believe this only if you're a turkey. You don't believe this, so you are no turkey.
13. If you have a cake, just looking at it will make you hungry; if looking at it makes you hungry, you will eat it. So you can't both have your cake and fail to eat it.
14. A man cannot serve both God and Mammon. But if a man does not serve Mammon, he starves; if he starves, he can't serve God. Therefore a man cannot serve God.


15. Either we ought to philosophize or we ought not. If we ought, then we ought. If we ought not, then also we ought (to justify this view). Hence in any case we ought to philosophize. (Aristotle)

The Stoic logician Chrysippus (280-207 B.C.) regarded these five inference schemata as basic to sentential logic. Which of these are valid in our logic?

16. A → B; A; ∴ B.
17. A → B; ¬B; ∴ ¬A.
18. ¬(A & B); A; ∴ ¬B.
19. A ∨ B; A; ∴ ¬B.
20. A ∨ B; ¬B; ∴ A.

21. Cicero, the famous Roman orator, summarized Stoic logic by citing seven principles: the five of Chrysippus above, a repeat of the third, and ¬(A & B); ¬A; ∴ B. Is this valid in modern sentential logic?

Sextus Empiricus, a sceptic who wrote in the third century A.D., preserves some theorems that the Stoics deduced from their basic principles. Which of these are valid?

22. A → (A → B); A; ∴ B.
23. (A & B) → C; ¬C; A; ∴ ¬B.
24. B; A → ¬B; ∴ ¬A.
25. A → A; ¬A → A; ∴ A.

2.10 Implication, Equivalence and Satisfiability

As we've seen, argument form validity is tantamount to implication. A set of formulas implies a given formula just in case every interpretation making every member of the set true makes the given formula true as well. This is exactly the relation that holds between the premise formulas of an argument form and the conclusion formula. Consequently, the method of associated formulas also works for implication problems. To find out whether a set of formulas {A1, ..., An} implies a formula B, simply form the conditional ((A1 & ... & An) → B), and test it for validity. The conditional will be valid just in case the implication holds. If only two formulas are involved, this test is even simpler: A implies B just in case the conditional (A → B) is valid. This fact can be summarized in a slogan: implication is the validity of the conditional. Equivalence, of course, is just implication in both directions. It would be easy, then, to use truth tables as a test for equivalence: form two associated formulas, and test both for validity. A formula A is equivalent to a formula B just in case both A → B and B → A are valid. But there is an easier way. A and B are equivalent just in case they have the same truth value on every interpretation of them. There is a truth function that comes out true just in case its components agree in truth value: the biconditional. So, A is equivalent to B if and only if the biconditional A ↔ B is valid. We can summarize this by saying that equivalence is the validity of the biconditional. To show that P is equivalent to ¬¬P, for example, we can test the biconditional P ↔ ¬¬P for validity:


P    P  ↔  ¬  ¬  P
T    T  T  T  F  T
F    F  T  F  T  F

The final column of this table contains all Ts, so the formula is valid. That means that P and ¬¬P agree in truth value on every interpretation of them, so they are equivalent. Finally, truth tables also serve as a decision procedure for satisfiability. We have already seen this with respect to individual formulas. Even when we are concerned with the satisfiability of sets of formulas, however, truth tables offer a simple test. A set of formulas is satisfiable if and only if its members are all true on some interpretation of them. Of course, if every member of a set is true on a given interpretation, the conjunction of the members must be true on that interpretation as well. So a set is satisfiable just in case the conjunction of the members of the set has a true interpretation. If the conjunction is contradictory, the set must be contradictory; if the conjunction is satisfiable, so is the set. This, too, may be put into a slogan: satisfiability (of a set) is the satisfiability of the conjunction. Consider, for instance, the set of formulas {¬Q, P ∨ Q, P → Q}. Is this set satisfiable? To find out, we can form the conjunction of the members of the set, and test it for satisfiability:

P  Q   (¬  Q  &  (P  ∨  Q))  &  (P  →  Q)
T  T    F  T  F   T  T  T    F   T  T  T
T  F    T  F  T   T  T  F    F   T  F  F
F  T    F  T  F   F  T  T    F   F  T  T
F  F    T  F  F   F  F  F    F   F  T  F

The final column here consists of all Fs, so the formula is contradictory, false on every interpretation of it. But the conjunction would be true if every member of the set of formulas were true. So this indicates that there is no interpretation making every member of the set true. The contradictoriness of the conjunction implies the contradictoriness of the set of formulas. The truth table technique, therefore, is a simple but remarkably powerful tool for solving problems in sentential logic. It serves as a decision procedure for argument form validity, formula validity, implication, equivalence, formula satisfiability and set satisfiability. It does all this, furthermore, in a clear and easily understandable way. A truth table, in effect, surveys all possible interpretations of a formula, argument form, etc., and computes a truth value for it on each interpretation. This clarity, of course, comes at a price. When the formula or argument form becomes very complex, the number of possible interpretations may be very large. In the next chapter, therefore, we'll turn toward a method that verifies the validity of formulas and argument forms without running through every possible way of assigning truth values to sentence letters.
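Each slogan in this section corresponds to a one-line test over all interpretations. The sketch below uses hypothetical helpers of our own devising (the names `implies`, `equivalent`, and `satisfiable` are assumptions, not the book's notation), with formulas given as functions from interpretations to truth values:

```python
from itertools import product

def rows(letters):
    """All interpretations of the given sentence letters, as dicts."""
    for values in product([True, False], repeat=len(letters)):
        yield dict(zip(letters, values))

def implies(letters, a, b):
    """A implies B iff the conditional A -> B is valid."""
    return all((not a(i)) or b(i) for i in rows(letters))

def equivalent(letters, a, b):
    """A and B are equivalent iff the biconditional A <-> B is valid."""
    return all(a(i) == b(i) for i in rows(letters))

def satisfiable(letters, formulas):
    """A set is satisfiable iff the conjunction of its members is."""
    return any(all(f(i) for f in formulas) for i in rows(letters))

# P is equivalent to ~~P; the set {~Q, P v Q, P -> Q} is unsatisfiable.
print(equivalent(['P'], lambda i: i['P'], lambda i: not not i['P']))   # True
print(satisfiable(['P', 'Q'], [lambda i: not i['Q'],
                               lambda i: i['P'] or i['Q'],
                               lambda i: (not i['P']) or i['Q']]))     # False
```

The three tests share the same structure: quantify over all interpretations and check a simple condition, which is exactly what a truth table's final column records.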

Problems

Determine whether the formulas in each pair are equivalent. If they are not, say whether either formula implies the other.

1. P → Q and P ↔ (P&Q)
2. ¬(P ∨ Q) and ¬P ∨ ¬Q
3. ¬(P&Q) and ¬P & ¬Q
4. ¬(P → Q) and ¬P → ¬Q
5. ¬(P ↔ Q) and ¬P ↔ ¬Q
6. ¬(P ↔ Q) and ¬P ↔ Q
7. ¬(P → Q) and P & ¬Q


8. ¬(P&Q) and ¬P ∨ ¬Q
9. ¬(P ∨ Q) and ¬P & ¬Q
10. P&Q and (P ∨ Q) & (P ↔ Q)
11. P&Q and (P ∨ Q) & (P → Q)
12. P&Q and (P ∨ Q) & (Q → P)
13. P ∨ Q and ¬P → Q
14. P ∨ Q and ¬Q → P
15. P → Q and P → (P&Q)
16. P → Q and Q ↔ (P ∨ Q)
17. P ↔ Q and (P → Q) & (Q → P)
18. P ↔ Q and (P&Q) ∨ (¬P & ¬Q)
19. P ↔ Q and ¬P ↔ ¬Q
20. P ↔ Q and ¬(P ↔ ¬Q)

Determine whether these sets of formulas are satisfiable.

23. {P → (Q ∨ R), P → ¬Q, P → ¬R}
24. {P → Q, P → ¬Q}
25. {Q ↔ (P → (Q ∨ ¬R)), ¬(P → (Q ∨ R)), R}
26. {P, Q → ¬R, R}
27. {P → ¬Q, (R&P) → Q, (R&¬P) → P}
28. {¬(P → (Q&¬R)), P ∨ (Q&¬R), R → (P ∨ Q)}
29. {(P ∨ ¬Q) ↔ R, ((R → P) → R) → Q, ((Q → R) → Q) → P}


Consider the following formulas: P ∨ Q, P&Q, P → Q, P ↔ Q, ¬P, and ¬Q. Where A and B are different formulas from among this collection, there are 30 possible statements of the form A implies B. Of these, only five are true. Show that these are the five.

36. P&Q implies P ∨ Q
37. P&Q implies P → Q
38. P&Q implies P ↔ Q
39. P ↔ Q implies P → Q
40. ¬P implies P → Q

Chrysippus held as a basic valid argument form P ∨ Q; P; ∴ ¬Q, which has the associated formula ((P ∨ Q) & P) → ¬Q. Subsequent logicians advanced a number of principles which we would today consider invalid. Do any of these follow from the formula associated with Chrysippus's argument form?

41. ((P → ¬Q) & P) → ¬(P → Q) (Stoics)
42. ((¬P → Q) & ¬Q) → ¬(P → Q) (Stoics)
43. (¬(P&Q) & ¬P) → Q (Cicero)
44. (P → ¬Q) ↔ ¬(P → Q) (Boethius)
45. ((P → (Q → R)) & (Q → ¬R)) → ¬P (Boethius)

Chapter 3

Natural Deduction

The method of truth tables can evaluate arguments in a theoretically clear and practically efficient way. But it provides little insight into how people construct arguments and, especially, extended arguments and proofs. In this chapter we'll develop a system designed to simulate people's construction of arguments. Although it forces arguments into rather rigid structures, it is natural in the sense that it comes close, in certain respects, to the way people argue, particularly in legal, scientific and philosophical contexts. It comes even closer to the way mathematicians prove theorems.

3.1 Natural Deduction Systems

A natural deduction system is a collection of rules of inference. The central notion of a natural deduction system is that of proof. Some proofs, called hypothetical proofs, begin with assumptions, or hypotheses. The assumptions serve as premises for the argument found in the proof; the conclusion depends on these assumptions. Such proofs show that the conclusion is true, not outright, but if the assumptions are true. Other proofs, however, use no assumptions; they show that their conclusions are true outright. A proof in a natural deduction system is a sequence of lines; on each line is a formula. Each formula in a proof must either be an assumption or derive from formulas on previously established lines by a rule of inference. In our formulation, the formula on the topmost Show line of a proof is its conclusion; the proof is a proof of that formula from the assumptions. Formulas proved from no assumptions are theorems. A theorem of a natural deduction system, then, is any formula that can be proved from no hypotheses in the system. Rules of inference are either simple or complex. Simple rules allow us to write down formulas having certain shapes in a proof if other formulas of certain kinds are on already-established lines in that proof. For example, we'll have a rule that lets us write A or B if we've already established A&B. Complex rules, in contrast, allow us to write down a formula of a certain shape in a proof if some other proof is completed. We'll have a rule, for instance, that allows us to write ¬A if we've proved a contradiction from the assumption A. Another complex rule allows us to assert a conditional formula A → B if we can, in a subordinate proof, assume A and derive B. Because our system has some complex rules, proofs will sometimes appear within other proofs. A proof appearing within another is subordinate to it. The larger, superordinate proof will make use of the information in the subordinate proof by way of a complex rule.
Sometimes the larger proof will simply take over the conclusion of its smaller partner. In other


circumstances, the larger pro of will use the fact that the sub ordinate was able t o prove a c onclusion from a given assumption to state an assertion that doesn't itself depend on this assumption. An assertion within a proof established by a subordinate pro of is called a lemma. As our talk of "shapes" suggests , natural deduction systems are purely syntactic. It's p ossible to verify that a proof is successful without any reference to the meanings of the symb ols in the pro of. Nevertheless, we use the rules of inference we do because, taken t ogether, they all ow us to prove fr om a set of assumptions only th ose formulas that follow from the set. The rules themselves are syntactic, based on the shapes of formulas, but their justification is semantic. For the most part , the deduction system of this b o ok has two rules for each connective. One rule tells us h ow t o prove a formula with that connective as main c onnective. In short, it tells us how to introduce formulas of that form into pro ofs. The other rule tells us how to use the information encoded in a formula having that connective as main connective. That is, it tells us how to exploit formulas of that kind in pro ofs. For this reas on , the basic rules of the system will largely fall into two groups: introduction rules and exploitation ( or eliminati on) rules. Most c onnectives will have one rule of each sort. Before discussing these rules in detail, we need to know more ab out the kinds of proofs that the system all ows. Pro ofs will be able to appear inside other pro ofs. The sub ordinate pro ofs will fulfill various functions. In this chapter , there will be three such functions. Accordingly, there will be three methods of proof. Two of them are really c omplex connective-introduction rules. The remaining, " pure" method is that of direct pro of. All pro ofs are sequences of lines which are structured in certain ways. As we'll write them, proofs will have three columns. 
The middle column will consist of a sequence of formulas; some may be preceded by the word Show. The left column will number these formulas. The right column will provide justifications for the formulas. Thus, if a formula on a given line derives from previously established formulas by a rule of inference, the right column will say what rule of inference and what earlier lines were used. If the formula is an assumption, the right column will say so. Only when the formula derives from an entire subordinate proof will the right column be empty.

A direct proof begins with premises (or, if it appears within another proof, formulas deduced from earlier lines) and proceeds to its conclusion. We'll begin a direct proof, once any premises or earlier lines are recorded, by stating what we want to show:

n.      Show A

Lines that we hope to establish by doing proofs, that is, theorems and lemmas, will always have this form. The left column contains a line number; the right column is empty. The formula we hope to prove is prefaced by the word Show to indicate that we haven't proved it yet; so far, the information recorded on the line is just wishful thinking.

How can we make our wishes reality? Clearly, by proving A. If we can prove A from what we are given, then we can go back to line n and cross out the Show that signalled that the following formula was, at that point, only fantasy. The proof allowing us to do this will provide our explanation for cancelling the Show. It will follow line n immediately. To show graphically what lines constitute the proof, we'll draw a bracket encompassing those lines to the left of the formulas in the proof. A successful, completed direct proof will therefore look like this:

Direct Proof

n.      Show A      (cancelled)
        | ⋮
n + m.  | A


Natural Deduction

Obtaining A allows us to complete the direct proof only if two conditions are satisfied. First, A, on line n + m, must not already be enclosed in another set of brackets. Second, there must be no uncancelled Show statements in the area to be bracketed. We want to show that A follows from what has gone before. If A, on line n + m, were already enclosed in a bracket, or if we would be enclosing any uncancelled Show statements by drawing a new bracket to complete the proof, we could not be sure that A followed from what we were given. We would know only that A followed from what we were given together with some additional assumptions, for other proof methods introduce assumptions into the proof. Neither of the following, then, counts as a legitimate instance of direct proof:

n.      Show P
        | ⋮
        | Show Q
        |   | ⋮
k.      |   | P

WRONG (left)

n.      Show P
        | ⋮
        | Show Q
        | ⋮
k.      | P

WRONG (right)

On the left, we are trying to use the occurrence of 'P' on line k to complete the direct proof of 'P'. But 'P', on that line, is already enclosed in another set of brackets. Because the proof of 'Q' may have begun with an added assumption, we have no guarantee that 'P' follows from the formulas above line n alone. On the right, there is an uncancelled Show statement within the bracket. This too may have let in added assumptions. Since the bracketed lines constitute a proof that provides the justification for line n, no other justification is needed. The right column for line n can thus remain empty.

3.2 Rules for Negation and Conjunction

In this section we'll develop the rules of inference for negation and conjunction. First, however, we'll introduce a simple rule that allows us to construct hypothetical proofs. The assumption rule says that you may begin a proof by listing premises or assumptions. That is, before the very first Show line, you may write premises. The conclusion you derive will depend on these premises. Thus:

Assumption (A)

n.      A           A

Here line n must precede the first Show line in the proof. Suppose that we want to prove 'Q' hypothetically from 'P → Q' and 'P ∨ Q'. We would begin the proof by writing the premises, and then the Show line containing the conclusion:

1.      P → Q       A
2.      P ∨ Q       A
3.      Show Q

The rule of conjunction exploitation asks what use we can make of the information encoded in a conjunction. If a conjunction is true, in other words, what follows? Clearly, the truth of both conjuncts. If A&B is true, then both A and B must be true. The rule of conjunction exploitation thus takes two forms, since the truth of a conjunction implies the truth of both conjuncts:


Conjunction Exploitation (&E)

n.      A&B
n + m.  A           &E, n

Conjunction Exploitation (&E)

n.      A&B
n + m.  B           &E, n

Hereafter, we'll abbreviate rules having two forms by using parentheses. We can write &E as

Conjunction Exploitation (&E)

n.      A&B
n + m.  A (or B)    &E, n

This rule, often called Simplification, tells us that, from a conjunction, we can prove either or both conjuncts. The conjuncts may be written on any later line. When this rule is applied, we write a conjunct together with the explanation that the line comes by application of &E, conjunction exploitation, to the formula on line n. (The line just above the formula that results from applying this rule serves to separate the formula deduced from what must be present earlier; it won't appear in actual proofs.) To apply this rule to a formula, & must be the formula's main connective. We can move from 'P&(Q → R)' to 'Q → R', but we cannot go from '(P&Q) → R' to 'P → R'. Let's do a simple proof using this rule. Let's assume '((P&Q)&¬R)' and prove 'Q'. We begin with an assumption and a Show statement.

1.      ((P&Q)&¬R)      A
2.      Show Q

Now, we can exploit the conjunction to derive the smaller conjunction '(P&Q)'. Note that we can't leap inside to derive 'Q' directly: the connective to which we apply the rule must always be the main connective of the formula.

1.      ((P&Q)&¬R)      A
2.      Show Q
3.      (P&Q)           &E, 1

The right column tells us that the formula on line 3 comes from line 1 by applying conjunction exploitation. At this stage, we can easily apply that same rule again to obtain 'Q', which is what we want in order to finish the proof:

1.      ((P&Q)&¬R)      A
2.      Show Q          (cancelled)
3.      | (P&Q)         &E, 1
4.      | Q             &E, 3
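Although the deduction system itself is purely syntactic, the justification for &E is semantic, and we can spot-check it by surveying truth-value assignments. The following Python sketch is our own illustration, not part of the text's system; the `entails` helper is an assumed name. It confirms that &E is truth-preserving, and that the inference fails when & is not the main connective:

```python
from itertools import product

def entails(premises, conclusion, letters):
    """Brute-force semantic check: no valuation makes every premise
    true while making the conclusion false."""
    for values in product([True, False], repeat=len(letters)):
        v = dict(zip(letters, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

# &E: from A & B we may infer either conjunct.
assert entails([lambda v: v["A"] and v["B"]], lambda v: v["A"], ["A", "B"])
assert entails([lambda v: v["A"] and v["B"]], lambda v: v["B"], ["A", "B"])

# But (P & Q) -> R does not entail P -> R: the & here is not the
# main connective, and the inference is semantically invalid.
assert not entails([lambda v: not (v["P"] and v["Q"]) or v["R"]],
                   lambda v: not v["P"] or v["R"],
                   ["P", "Q", "R"])
print("&E is truth-preserving")
```

The failed third inference is exactly the move from '(P&Q) → R' to 'P → R' that the rule forbids.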

The rule of conjunction introduction tells us how to prove conjunctions. It's a very simple rule, since there is an extremely obvious strategy for proving a conjunction: prove each conjunct. If you know that A is true, and that B is true, you can conclude that A&B is true as well. The rule thus states that from the two formulas A and B you can derive A&B:

Conjunction Introduction (&I)

n.      A
m.      B
p.      A&B         &I, n, m

The right column simply indicates that A&B comes from applying conjunction introduction to the formulas on lines n and m. The order in which A and B appear in the proof makes no difference. We could just as easily have concluded B&A. To see how this might be used in a proof, let's show that '(P&Q)' allows us to derive '(Q&P)'. Again we begin by using the assumption rule and writing a Show line:

1.      (P&Q)       A
2.      Show (Q&P)

Now, we need to show 'Q&P'. To do this, we need to separate out the two conjuncts. We can therefore derive them separately, using conjunction exploitation, and put them back together in the other order, using conjunction introduction:

1.      (P&Q)       A
2.      Show (Q&P)  (cancelled)
3.      | P         &E, 1
4.      | Q         &E, 1
5.      | (Q&P)     &I, 4, 3
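The same kind of brute-force check works for &I, including the observation that the order of the conjuncts is irrelevant. This Python sketch is our own illustration; the `valid_argument` helper is an assumed name, not the book's:

```python
from itertools import product

# Each formula is represented as a function from a valuation (a dict
# assigning True/False to sentence letters) to a truth value.
def valid_argument(premises, conclusion, letters):
    """True iff every valuation making all premises true makes the
    conclusion true as well."""
    return all(
        conclusion(dict(zip(letters, vals)))
        for vals in product([True, False], repeat=len(letters))
        if all(p(dict(zip(letters, vals))) for p in premises)
    )

# &I: from A and B, infer A & B -- and the order makes no difference.
A = lambda v: v["A"]
B = lambda v: v["B"]
assert valid_argument([A, B], lambda v: v["A"] and v["B"], ["A", "B"])
assert valid_argument([A, B], lambda v: v["B"] and v["A"], ["A", "B"])

# The worked example: from P & Q we may derive Q & P.
assert valid_argument([lambda v: v["P"] and v["Q"]],
                      lambda v: v["Q"] and v["P"], ["P", "Q"])
print("&I is truth-preserving")
```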

One negation rule serves as both an introduction and an exploitation rule. Because it always introduces or exploits two negation symbols at once, we'll refer to it as '¬¬'. It says, basically, that two consecutive negation signs "cancel each other out": ¬¬A is equivalent to A. First formulated by the Stoics, this rule is often called Double Negation. One can add two consecutive negation signs, or delete them, without affecting truth values.

Negation Introduction (¬¬)

n.      A
n + p.  ¬¬A         ¬¬, n

Negation Exploitation (¬¬)

n.      ¬¬A
n + p.  A           ¬¬, n

We can express this in a more compact form by writing a double line between A and ¬¬A. A double line indicates that the rule is invertible: one can go from what is above the lines to what is below them, or vice versa. The rule works in both directions. So:


Double Negation (¬¬)

n.      A           ¬¬, m
        ═════
m.      ¬¬A         ¬¬, n

To illustrate the use of this rule, let's show that we can derive '¬¬P&¬Q' from '¬¬(P&¬¬¬Q)'.

1.      ¬¬(P&¬¬¬Q)      A
2.      Show (¬¬P&¬Q)   (cancelled)
3.      | (P&¬¬¬Q)      ¬¬, 1
4.      | P             &E, 3
5.      | ¬¬P           ¬¬, 4
6.      | ¬¬¬Q          &E, 3
7.      | ¬Q            ¬¬, 6
8.      | (¬¬P&¬Q)      &I, 5, 7
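Because ¬¬A and A always agree in truth value, the double negation rule is invertible. A quick Python check of the equivalence, and of the worked derivation above, can be run by hand (this sketch is our own illustration, not part of the text):

```python
# Double negation: not-not-A has the same truth value as A,
# which is why the rule works in both directions.
for A in (True, False):
    assert (not (not A)) == A

# The worked example: every valuation making the premise
# not-not-(P & not-not-not-Q) true also makes the conclusion
# (not-not-P & not-Q) true.
for P in (True, False):
    for Q in (True, False):
        premise = not (not (P and (not (not (not Q)))))
        conclusion = (not (not P)) and (not Q)
        if premise:
            assert conclusion
print("double negation checks out")
```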

Another, more powerful negation rule is indirect proof, which is essentially another negation introduction rule. The indirect proof rule is complex. It says that we can write ¬A if we can derive a contradiction from the assumption that A. Indirect proofs always introduce assumptions, called assumptions for indirect proof (AIPs), from which they try to prove contradictions. What point is there to that? If an assumption leads to a contradiction, it must be false. Consequently, indirect proofs always establish the negations of their assumptions.

An indirect proof begins with a statement of what we want to prove: a formula prefaced by the word Show. This formula will have a negation as its main connective. The proof will then make an assumption. The assumption will be the same as the formula we're trying to establish, but with the main negation omitted. An indirect proof will thus begin:

n.      Show ¬A
n + 1.  A           AIP

To complete the proof, it's necessary to prove a contradiction. The contradiction doesn't have to relate directly to the assumption; we do not, in other words, have to use the assumption A to prove A&¬A, or anything else containing A. Furthermore, we don't have to get our contradiction into a single formula; two formulas, one of which is the negation of the other, suffice. What we want, then, is to prove a formula B and also its negation ¬B. A completed indirect proof, then, looks like this:

Indirect Proof

n.      Show ¬A     (cancelled)
n + 1.  | A         AIP
        | ⋮
n + p.  | B
        | ⋮
n + q.  | ¬B

It makes no difference which of B and ¬B is proved first. Notice that, once the proof is complete, we cancel the word Show to indicate that we've established what, earlier, we had merely hoped for. Again, neither B nor ¬B may already be enclosed in brackets on lines n + p and n + q, and we may enclose no uncancelled Show statements when we draw a bracket to complete the proof. To take an example, this rule allows us to prove the thesis that a contradiction implies anything. To see this, take a contradiction such as 'P&¬P', and a completely unrelated formula 'Q'. We can show that, from 'P&¬P', we can derive 'Q'. So we begin with an assumption and a Show line:


1.      (P&¬P)      A
2.      Show Q

Now, how can we get to 'Q'? There seems to be no way to get it by a direct proof, since it bears no relation to our assumption. Our only choice, at this point, is to use indirect proof. This bit of reasoning is not restricted to this example. Throughout this text we'll resort to indirect proof when nothing else suggests itself as a good proof technique. Of course, as we have stated the rule of indirect proof, we can prove only negated formulas. But our rule for negation tells us that 'Q' and '¬¬Q' are equivalent. Instead of proving 'Q', then, we can prove '¬¬Q', and then use the rule for double negation to obtain 'Q'. This strategy leads to the proof:

1.      (P&¬P)      A
2.      Show Q          (cancelled)
3.      | Show ¬¬Q      (cancelled)
4.      | | ¬Q          AIP
5.      | | P           &E, 1
6.      | | ¬P          &E, 1
7.      | Q             ¬¬, 3
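Semantically, the thesis that a contradiction implies anything holds vacuously: no row of the truth table makes 'P&¬P' true, so every such row (all zero of them) makes 'Q' true. A short Python check of this, offered as our own illustration:

```python
from itertools import product

# Collect every valuation of P and Q that makes the premise P & not-P true.
rows_where_premise_true = [
    (P, Q)
    for P, Q in product([True, False], repeat=2)
    if P and not P
]

# The premise is true on no row at all, so the entailment to Q
# (or to anything else) holds vacuously.
assert rows_where_premise_true == []
print("P & not-P entails Q, vacuously")
```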

Notice that, in this proof, we used a line that contained the word Show; that's how we got from '¬¬Q' to 'Q'. Yet we can't use such lines all the time. If we could use Show lines anytime we wanted, we could easily prove anything. Consider, for instance, this "proof" that pigs fly:

1.      Show Pigs fly
2.      | ¬¬Pigs fly        ¬¬, 1   Error
3.      | Pigs fly          ¬¬, 2

Clearly we've got to restrict the circumstances in which we can use such lines. We'll say that any line we can use in a proof at a given point is free at that point. In sentential logic, every line is free, except (1) lines that begin with an uncancelled Show; and (2) lines that are imprisoned within a bracket. Lines beginning with an uncancelled Show contain formulas that we haven't yet proved. All we can say is that we hope to prove them. So the information on those lines is inaccessible to us in the proof. That's why the proof that pigs fly seems silly; we used what we wanted to prove in order to prove it. Once a Show has been cancelled, on the other hand, we've proved the formula, so we can use it throughout the rest of the proof.

It's only a little harder to see the point of the second restriction. Lines that are enclosed in a completed bracket or "prison" may depend on a particular assumption. Both indirect and conditional proofs introduce assumptions; the assumptions themselves, and the following lines that depend on them, are true only given those assumptions. We can't assume that they are true in general. Above, for example, we introduced a contradiction, 'P&¬P', as an assumption. We proceeded to show that, if this were true, then anything would be true. Here, and in subordinate proofs, we wouldn't want to say outright that the contradiction was true; we would merely pretend that it was true. The contradiction, in other words, would serve only as a dialectical premise.
If this proof were part of a larger proof, it would be a terrible mistake to come back and claim that, in general, we could prove 'P&¬P' because it was introduced earlier, within a subordinate proof, as an assumption. The only free lines, then, are those that are neither prefaced with an uncancelled Show nor imprisoned within a bracket (i.e., a completed proof). These "proofs", consequently, make grave errors:


1.      Show ((P&¬P) → P)
2.      | (P&¬P)        ACP
3.      | Show ¬P
4.      | | P           &E, 2
5.      | | ¬P          &E, 2
                        Error

1.      Show (P&¬P)
2.      | P             &E, 1   Error
3.      | ¬P            &E, 1   Error
4.      | (P&¬P)        &I, 2, 3

In summary, formulas preceded by an uncancelled Show have not yet reached legal age, and those within a completed prison are sentenced to life. Both kinds of formulas are inaccessible.

We've already seen one rule in this section, the assumption rule, that is structural in that it makes no reference to any particular connectives. We'll now introduce another structural rule, which says that whatever is true, is true. It allows you to repeat yourself. The Reiteration rule allows us to repeat a formula that appeared earlier in the proof, so long as the line containing that formula is still free.

Reiteration (R)

n.      A
n + p.  A           R, n

This rule supposes, in effect, that whatever we've shown to be true is still true. Repeating what we've established can never get us into trouble. To see how this rule works, consider a simple argument: It's not true that Congress will cut military spending and refuse to raise taxes, because Congress will raise taxes. We can symbolize this as

P
∴ ¬(Q&¬P)

We begin a proof to demonstrate the validity of the argument form by using the rule of assumption to introduce the premise, and then trying to show the conclusion.

1.      P               A
2.      Show ¬(Q&¬P)

The first rule of thumb for constructing proofs is this: choose a proof method by looking at the main connective of the formula being proved. In trying to prove a negation, use indirect proof.

1.      P               A
2.      Show ¬(Q&¬P)
3.      | (Q&¬P)        AIP

We can apply conjunction exploitation to obtain '¬P', and then get a contradiction by reiterating the premise:


1.      P               A
2.      Show ¬(Q&¬P)    (cancelled)
3.      | (Q&¬P)        AIP
4.      | ¬P            &E, 3
5.      | P             R, 1

In general, reiteration is useful chiefly for completing indirect and other subordinate proofs.
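We can also confirm semantically that the argument form just proved is valid, with P for "Congress will raise taxes" and Q for "Congress will cut military spending". A brute-force sketch of our own, not part of the text:

```python
from itertools import product

# The argument form: premise P, conclusion not-(Q & not-P).
for P, Q in product([True, False], repeat=2):
    if P:                         # every row making the premise true...
        assert not (Q and not P)  # ...also makes the conclusion true
print("P entails not-(Q & not-P)")
```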

Problems

Using deduction, show that each individual formula is provable, and that the conclusion of each argument form is provable from the premises.

1. P&Q ∴ Q
2. P&(Q&R) ∴ Q
3. ¬(P&¬P)
4. ¬(P&(Q&¬P))
5. ¬((P&¬Q)&(Q&¬P))
6. ¬P&¬Q ∴ ¬(P&Q)
7. (P&Q)&R ∴ P&(Q&R)
8. P&¬Q; ¬R&S ∴ S&P
9. ¬(P&Q); Q ∴ ¬P
10. Q ∴ ¬(¬P&¬Q)
11. P ∴ ¬(Q&¬(P&Q))
12. ¬((P&¬Q)&R) ∴ ¬((P&R)&¬Q)
13. ¬(P&¬Q); P ∴ Q
14. ¬(¬P&Q); ¬(P&Q) ∴ ¬Q
15. P&Q; ¬(Q&¬R); ¬(P&¬S) ∴ R&S
16. ¬(¬(¬(P&¬Q)&¬P)&¬P)
17. P&Q; ¬(¬R&Q); ¬(R&¬S) ∴ P&S
18. * ¬(P&¬Q); ¬(Q&¬P); ¬(P&Q); ¬(¬P&¬Q) ∴ R
19. * ¬(¬S&Q); ¬(P&(¬Q&¬R)); ¬(R&¬S) ∴ ¬(¬S&P)
20. * ¬(P&R); ¬(¬(P&Q)&¬P) ∴ R → ¬S


3.3 Rules for the Conditional and Biconditional

The rule of conditional exploitation is very simple. Many axiomatic systems, in fact, have only it as a rule of inference. Often called modus ponens, this rule sanctions the inference from 'P' and 'P → Q' to 'Q'. It thus stands behind arguments such as 'If you're smart, you'll do well at logic; you're smart; so you'll do well at logic.'

Conditional Exploitation (→E)

n.      A → B
m.      A
p.      B           →E, n, m

To illustrate this rule, let's show that we can derive 'P&Q' from the hypotheses 'R → Q' and 'R&P'.

1.      (R → Q)     A
2.      (R&P)       A
3.      Show (P&Q)  (cancelled)
4.      | R         &E, 2
5.      | P         &E, 2
6.      | Q         →E, 1, 4
7.      | (P&Q)     &I, 5, 6
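A truth-table survey confirms that both →E itself and the worked derivation are semantically valid. In this sketch of our own, the conditional is rendered by its truth table, false only when the antecedent is true and the consequent false:

```python
from itertools import product

implies = lambda a, b: (not a) or b  # truth table of the conditional

# Modus ponens: from A -> B and A, infer B.
for A, B in product([True, False], repeat=2):
    if implies(A, B) and A:
        assert B

# The worked example: from R -> Q and R & P, derive P & Q.
for P, Q, R in product([True, False], repeat=3):
    if implies(R, Q) and (R and P):
        assert P and Q
print("->E checks out")
```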

Conditional introduction is a complex rule, which constitutes the method of conditional proof. A conditional proof, like an indirect proof, may use some premises or earlier lines of a proof, but it doesn't have to. It always establishes a conditional formula, that is, a formula with a conditional as its main connective. It begins with a statement of what we want to prove, and proceeds to make an assumption, called the assumption for conditional proof (ACP). This assumption is always the antecedent of the conditional we're trying to establish. A conditional proof, then, begins as follows:

n.      Show A → B
n + 1.  A           ACP

The method mimics actual arguments for conditional statements in English. To argue for a conclusion such as If the fetus is a person, it has a right to life, we can begin by supposing that the fetus is a person, and seeing what follows. If we can show that it follows from the assumption that the fetus has a right to life, then we have established the original conditional. Summarizing: to show that if A, then B, assume A and try to show that B follows. A successful conditional proof has this form:

Conditional Proof

n.      Show A → B      (cancelled)
n + 1.  | A             ACP
        | ⋮
n + p.  | B


Once again, obtaining the subordinate conclusion B allows us to cancel the Show above it and count the conditional statement as established. A proof proves its topmost Show line whenever the Show is cancelled. At that point, the conditional proof is complete. Drawing a bracket around the proof indicates that it is complete, and also allows us to see easily what lines constitute the justification for the conditional statement. Just as in direct and indirect proofs, we may not draw the bracket if B, on line n + p, appears inside another bracket, or if we would be enclosing an uncancelled Show statement. To take an example, we can use conditional proof to show that 'R → Q' is derivable from 'P&Q'.

1.      (P&Q)           A
2.      Show (R → Q)    (cancelled)
3.      | R             ACP
4.      | Q             &E, 1

Another example points out that '(P → ¬Q) → (Q → ¬P)' is provable:

1.      Show ((P → ¬Q) → (Q → ¬P))      (cancelled)
2.      | (P → ¬Q)          ACP
3.      | Show (Q → ¬P)     (cancelled)
4.      | | Q               ACP
5.      | | Show ¬P         (cancelled)
6.      | | | P             AIP
7.      | | | ¬Q            →E, 2, 6
8.      | | | Q             R, 4

Here, an indirect proof is subordinate to a conditional proof that is, in turn, subordinate to another conditional proof.

The rules for the biconditional are straightforward. First, consider biconditional introduction. Under what circumstances may one introduce a biconditional into a proof? What does one have to do, that is, to establish the truth of a biconditional? Recall that a biconditional is so called because it amounts to two conditionals. This is crucial in devising a proof strategy. Mathematicians, for example, tend to prove biconditionals in two steps. They do the "left-to-right direction" and the "right-to-left direction" separately. In other words, they prove two conditionals in order to establish the biconditional. Our rule for biconditional introduction similarly requires two conditionals:

Biconditional Introduction (↔I)

n.      A → B
m.      B → A
p.      A ↔ B       ↔I, n, m

The rule for exploiting biconditionals rests on the fact that a biconditional asserts that two sentences have the same truth value. If we know the truth of a biconditional, and also the truth of one of its components, we can deduce the truth of the other.

Biconditional Exploitation (↔E)

n.      A ↔ B
m.      A (or B)
p.      B (or A)    ↔E, n, m


Notice that this rule differs from conditional exploitation by allowing us to deduce A from B or vice versa. If we have one component, we can derive the other; it makes no difference which appears on which side of the biconditional. The rule of conditional exploitation, however, works only in one direction. If we have the antecedent of the conditional, we can obtain the consequent. But we can't go from the consequent to the antecedent. To show how these rules work, let's show that 'P ↔ Q' is derivable from '(P → Q)&(Q → P)', and vice versa.

1.      (P ↔ Q)         A
2.      Show ((P → Q)&(Q → P))      (cancelled)
3.      | Show (P → Q)              (cancelled)
4.      | | P           ACP
5.      | | Q           ↔E, 1, 4
6.      | Show (Q → P)              (cancelled)
7.      | | Q           ACP
8.      | | P           ↔E, 1, 7
9.      | ((P → Q)&(Q → P))         &I, 3, 6

1.      ((P → Q)&(Q → P))           A
2.      Show (P ↔ Q)                (cancelled)
3.      | (P → Q)       &E, 1
4.      | (Q → P)       &E, 1
5.      | (P ↔ Q)       ↔I, 3, 4
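That the derivation goes through in both directions reflects a semantic fact: 'P ↔ Q' and '(P → Q)&(Q → P)' have identical truth tables. A short Python check, offered as our own illustration:

```python
from itertools import product

implies = lambda a, b: (not a) or b  # truth table of the conditional
iff = lambda a, b: a == b            # truth table of the biconditional

# P <-> Q agrees with (P -> Q) & (Q -> P) on every row, which is why
# each formula is derivable from the other.
for P, Q in product([True, False], repeat=2):
    assert iff(P, Q) == (implies(P, Q) and implies(Q, P))
print("a biconditional amounts to two conditionals")
```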

Problems

Theophrastus (371-286), a pupil of Aristotle, cited these principles as hypothetical syllogisms. Use natural deduction to demonstrate their validity.

1. P → Q; Q → R ∴ P → R
2. P → Q; Q → R ∴ ¬R → ¬P
3. P → Q; ¬P → R ∴ ¬Q → R
4. P → Q; ¬P → R ∴ ¬R → Q
5. P → R; Q → ¬R ∴ P → ¬Q
6. P ↔ (Q&P) ∴ P → Q

Show, using deduction, that these argument forms are valid.

7. P → Q; ¬(P → R) ∴ ¬(Q → R)
8. P → Q; P → R ∴ P → (Q&R)
9. P ↔ Q; P ↔ R ∴ Q ↔ R
10. P → Q; ¬Q ∴ ¬P
11. P ↔ Q; ¬P ∴ ¬Q
12. (P → Q) → P ∴ P
13. P → Q ∴ P ↔ (P&Q)
14. P&¬Q; R → (R&Q); ¬R → S ∴ S
15. S → (R&P); Q → (¬R&¬P₁) ∴ (Q&S) → P₂
16. P → (Q&(R → ¬P)); ¬Q ↔ ¬R ∴ ¬P
17. (P → Q) → (R → S); ¬(P&Q) → S ∴ ¬S → ¬R
18. * (S&¬R) → ¬P; (Q → ¬S) ↔ ¬P ∴ P ↔ (R&(S&Q))
19. * P ↔ Q ∴ ¬P ↔ ¬Q
20. * (P ↔ Q) ↔ (P ↔ R); ¬((¬P&¬Q)&¬R) ∴ P ↔ (Q ↔ R)

3.4 Rules for Disjunction

Like most connectives, disjunction has both an introduction and an exploitation rule. The introduction rule is very simple. It says that one may introduce a disjunction into a proof if one has already obtained either disjunct.

Disjunction Introduction (∨I)

n.      A (or B)
n + p.  A ∨ B       ∨I, n

To see how this rule works in practice, let's try to prove "the law of the excluded middle", 'P ∨ ¬P'. We can't prove either disjunct separately, since neither 'P' nor '¬P' is valid, so we need to use indirect proof. This means we need to prove '¬¬(P ∨ ¬P)'. Furthermore, introducing the assumption for indirect proof leaves us with very little to work with. We should be able to prove a contradiction, but it's not obvious how we can get anything out of '¬(P ∨ ¬P)'. So, we can begin by trying to show '¬P'. (After all, from a contradiction, anything should follow.)

1.      Show (P ∨ ¬P)           (cancelled)
2.      | Show ¬¬(P ∨ ¬P)       (cancelled)
3.      | | ¬(P ∨ ¬P)           AIP
4.      | | Show ¬P             (cancelled)
5.      | | | P                 AIP
6.      | | | (P ∨ ¬P)          ∨I, 5
7.      | | | ¬(P ∨ ¬P)         R, 3
8.      | | (P ∨ ¬P)            ∨I, 4
9.      | (P ∨ ¬P)              ¬¬, 2

This shows how useful disjunction introduction is, even in proving a fairly simple theorem.

Disjunction exploitation is perhaps the most complicated rule in our entire proof system. How can we exploit the information encoded in a disjunction? That is, if we know a disjunction, how can we use it to obtain some conclusion? This is what a mathematician encounters in doing a "proof by cases". Often, one can say only that there are several possibilities, and then examine each individually. If the conclusion one is seeking holds in every case, then one can conclude, since those were the only possibilities, that the conclusion holds in general. So, if we have a disjunction, and ways of getting from each disjunct to a conclusion, then we can obtain that conclusion. This leads us to the rule often called constructive dilemma:

Disjunction Exploitation (∨E)

n.      A ∨ B
m.      A → C
p.      B → C
q.      C           ∨E, n, m, p

Faced with a disjunction, then, we must usually prove our conclusion in each of the two cases the disjunction presents. We've got to "get down to cases." To see how disjunction exploitation works, let's derive 'Q' from 'P ∨ Q' and 'P → Q'.

1.      (P ∨ Q)         A
2.      (P → Q)         A
3.      Show Q          (cancelled)
4.      | Show (Q → Q)  (cancelled)
5.      | | Q           ACP
6.      | Q             ∨E, 1, 2, 4

Note that we can prove 'Q → Q' in just one step, since the antecedent and consequent are the same.

Problems

Show that the conclusion formulas of these argument forms can be proved from the premise formulas.

1. P ∨ Q; ¬P ∴ Q
2. P ∨ Q; P → R; Q → S ∴ R ∨ S
3. (P&Q) ∨ (¬P&¬Q) ∴ P ↔ Q
4. ¬P ∨ ¬Q ∴ Q → ¬P
5. ¬P ∨ ¬R ∴ ¬(P&R)
6. P ∨ Q; ¬P ∨ ¬Q ∴ P ↔ ¬Q
7. (P ∨ Q) ∨ R ∴ P ∨ (Q ∨ R)
8. P&(Q ∨ R) ∴ (P&Q) ∨ (P&R)
9. (R&¬P) ∨ (Q&R) ∴ (P → Q)&R
10. P ∨ Q; R ∨ S; ¬(P ∨ S) ∴ (Q&R) ∨ T
11. P; ¬S ∨ ¬P; P ↔ R ∴ ¬S&R
12. P ∨ Q; P → R; ¬R ∴ Q
13. ¬P ∨ Q; ¬Q → ¬R; R ∴ ¬P ∨ (P&Q)
14. (¬P&¬Q) ∨ R; ¬R ∴ ¬P
15. ¬S ∨ (S&P); (S → P) → R ∴ R
16. P → ¬Q; ¬P ∨ R; Q; (Q&R) → P ∴ ¬R
17. ¬(P&¬Q) ∨ ¬P ∴ P → Q
18. P&S; P → (¬S ∨ R) ∴ R
19. ¬P ∨ Q; ¬Q; ¬P → R ∴ R
20. P → Q; R → P ∴ ¬R ∨ Q
21. R → P; ¬R → Q; Q → S ∴ P ∨ S
22. P → ¬Q; R; R → (Q ∨ ¬S) ∴ S → ¬P
23. P → (¬R ∨ S); P → ¬S ∴ P → ¬R
24. P → ¬Q; ¬P → ¬R; R ∨ ¬S ∴ ¬Q ∨ ¬S
25. P&¬S; R → S; P → (Q ∨ R) ∴ Q
26. * P&Q; P → (S ∨ R); ¬(R&Q) ∴ S
27. * P&Q; R&¬S; Q → (P → P₁); P₁ → (R → (S ∨ Q₁)) ∴ Q₁
28. * P ∨ (Q ∨ S); S₁&¬S₂; ¬(¬S₁ ∨ S₂) → ¬P; (S → R)&¬R ∴ Q
29. * P&(¬Q&¬P₁); P → (S → R); S → (R ↔ (P₁ ∨ Q)) ∴ ¬S
30. * S → P; (S&P) → Q; R → S₁; R ∨ S ∴ Q ∨ S₁
31. * P ∨ (R ∨ Q); (R → S₁)&(Q → S₂); (S₁ ∨ S₂) → (P ∨ Q); ¬P ∴ Q
32. * P → (S&R); (R ∨ ¬S) → (Q&Q₁); Q₁ ↔ Q₂ ∴ P → Q₂
33. * (P&Q) ∨ (Q&R) ∴ ¬Q → S
34. * (P ∨ Q)&R; Q → S ∴ ¬P → (R → S)
35. * ¬P ∨ (Q&R); (R ∨ ¬Q) → (S&Q₁); (Q₁&Q₂) ∨ ¬(Q₁ ∨ Q₂) ∴ ¬P ∨ Q₂
36. * ¬(¬P&Q)&(P ↔ ¬Q) ∴ P&(Q → R)

3.5 Derivable Rules

The series of proof methods and rules we've adopted allows us to prove the validity of any valid argument form in sentential logic. That is to say, the system is complete: every valid argument form can be proved valid in the system. This guarantees that we have enough rules. Our proof system is also sound in the sense that every argument form we can show to be valid is in fact valid. Our rules, in other words, never lead us astray. To speak in terms of formulas rather than argument forms: the system is complete, in that every valid formula is a theorem, and also sound, in the sense that every theorem is valid. The provable formulas are exactly the valid formulas. So this system matches precisely our semantics for the sentential connectives. Every aspect of the meanings of the connectives we've captured in some rule, proof method, or combination of these. Establishing the soundness and completeness of this chapter's natural deduction system requires a fairly sophisticated proof in our metalanguage; such a proof is possible, but we won't attempt it here.

To cover all of sentential logic, therefore, we need no more rules or proof techniques. Nevertheless, this section will present some added rules and methods. Everything we can prove with them is still valid. But they are all derivable rules, since they force us to accept nothing new about the logical connectives. They are shortcuts; they abbreviate series of proof lines that we could write in terms of our basic rules. Though they are theoretically unnecessary, then, they save a great deal of time and effort.

First, we can develop a new proof method, or, really, an amendment to a basic proof method. An indirect proof shows that something is false by showing that the assumption that it is true leads to a contradiction. As we've introduced it, this means that the conclusion of an indirect proof is always a negation. But it's easy to extend this method to any formula. If we want to show that a formula A is true, we can assume that A is false by assuming ¬A, and showing that a contradiction follows. So our extended method of indirect proof will work as follows:

Indirect Proof (Second Form)

n.      Show A      (cancelled)
n + 1.  | ¬A        AIP
        | ⋮
n + p.  | B
        | ⋮
n + q.  | ¬B

Using this form eliminates at least one application of negation exploitation.

The first derivable rules pertain to negated formulas. Negations are somewhat difficult to exploit in our basic system, since we can eliminate negations only two at a time. It's very useful, therefore, to have ways of simplifying formulas that begin with a negation. These derivable rules all give equivalents for negated formulas. Because they do give equivalents, they are all invertible; they work in either direction. The first two are often called DeMorgan's Laws.

Negation-Conjunction (¬&)

n.      ¬(A&B)          ¬&, m
        ═════
m.      ¬A ∨ ¬B         ¬&, n

Negation-Disjunction (¬∨)

n.      ¬(A ∨ B)        ¬∨, m
        ═════
m.      ¬A&¬B           ¬∨, n

Negation-Conditional (¬→)

n.      ¬(A → B)        ¬→, m
        ═════
m.      A&¬B            ¬→, n

Negation-Biconditional (¬↔)

n.      ¬(A ↔ B)        ¬↔, m
        ═════
m.      ¬A ↔ B (or A ↔ ¬B)      ¬↔, n
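Each of these four rules links a negated formula to an equivalent one, which is what licenses their invertibility. A brute-force Python check of all four equivalences, offered as our own sketch:

```python
from itertools import product

implies = lambda a, b: (not a) or b  # truth table of the conditional
iff = lambda a, b: a == b            # truth table of the biconditional

for A, B in product([True, False], repeat=2):
    # DeMorgan's Laws.
    assert (not (A and B)) == ((not A) or (not B))
    assert (not (A or B)) == ((not A) and (not B))
    # Negation-Conditional.
    assert (not implies(A, B)) == (A and not B)
    # Negation-Biconditional, in both of its forms.
    assert (not iff(A, B)) == iff(not A, B) == iff(A, not B)
print("each negation rule links equivalent formulas")
```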


Closely related to these rules are several others that define a connective in terms of other connectives. The first allows us to transform disjunctions into conditionals, and vice versa:

Conditional-Disjunction (→∨)

n.      A → B           →∨, m
        ═════
m.      ¬A ∨ B          →∨, n

The next allows us to characterize the biconditional in terms of the conditional:

Conditional-Biconditional (→↔)

n.      A ↔ B
m.      A → B (or B → A)        →↔, n
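The conditional-disjunction rule, like the negation rules, links equivalents: 'A → B' and '¬A ∨ B' have the same truth table, while the conditional-biconditional rule is truth-preserving in one direction only. A short check of both facts (our own illustration):

```python
from itertools import product

implies = lambda a, b: (not a) or b
iff = lambda a, b: a == b

for A, B in product([True, False], repeat=2):
    # ->v is invertible: A -> B and not-A v B agree on every row.
    assert implies(A, B) == ((not A) or B)
    # -><-> runs one way: a biconditional yields either conditional...
    if iff(A, B):
        assert implies(A, B) and implies(B, A)

# ...but a single conditional does not yield the biconditional:
# A = False, B = True makes A -> B true and A <-> B false.
assert implies(False, True) and not iff(False, True)
print("->v is invertible; -><-> is one-way")
```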

Next, four rules say that the order and grouping of subformulas is irrelevant in continued conjunctions and continued disjunctions. The first says that the order of conjuncts makes no difference: A&B is equivalent to B&A. It thus indicates that conjunction is commutative.

Commutativity of Conjunction (&C)

n.      A&B         &C, m
        ═════
m.      B&A         &C, n

The second of this group says that the grouping of conjuncts makes no difference or, in other words, that conjunction is associative.

Associativity of Conjunction (&A)

n.      (A&B)&C     &A, m
        ═════
m.      A&(B&C)     &A, n

The third and fourth say the same for disjunctions.

Commutativity of Disjunction (∨C)

n.      A ∨ B       ∨C, m
        ═════
m.      B ∨ A       ∨C, n

Associativity of Disjunction (∨A)

n.      (A ∨ B) ∨ C     ∨A, m
        ═════
m.      A ∨ (B ∨ C)     ∨A, n


Finally, four rules abbreviate commonly used proof steps. The first is a variation of conditional exploitation; the second, a variation of biconditional exploitation; the third, a variation of disjunction exploitation. All these variations allow negations to function readily, without detours. The fourth expresses the principle that anything follows from a contradiction.

Conditional Exploitation* (→E*)

n.      A → B
m.      ¬B
p.      ¬A          →E*, n, m

Sometimes the above rule is called modus tollens.

Biconditional Exploitation* (↔E*)

n.      A ↔ B
m.      ¬A (or ¬B)
p.      ¬B (or ¬A)      ↔E*, n, m

Disjunction Exploitation* (∨E*)

n.      A ∨ B
m.      ¬A (or ¬B)
p.      B (or A)        ∨E*, n, m

Disjunction exploitation* is occasionally called disjunctive syllogism.

Contradiction (!)

n.      A
m.      ¬A
p.      B           !, n, m
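Since these shortcuts are derivable, they must also be truth-preserving, and a truth-table survey confirms it for the three starred rules (our own Python sketch, not part of the text):

```python
from itertools import product

implies = lambda a, b: (not a) or b
iff = lambda a, b: a == b

for A, B in product([True, False], repeat=2):
    # ->E* (modus tollens): A -> B and not-B yield not-A.
    if implies(A, B) and not B:
        assert not A
    # <->E*: a biconditional plus one negated side yields the other negation.
    if iff(A, B) and not A:
        assert not B
    # vE* (disjunctive syllogism): A v B and not-A yield B.
    if (A or B) and not A:
        assert B
print("the starred shortcut rules are truth-preserving")
```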

In addition to these derivable rules, we'll adopt a principle of Replacement. In general, we can apply rules only to formulas with the appropriate main connectives. For instance, we can apply conjunction exploitation only to conjunctions- formulas with & as their main connectives. But invertible rules are justified by the equivalence of the formulas they link. And if we replace a subformula of any formula with an equivalent subformula , we obtain a formula equivalent to the original. If 'P' and ',,P' are equivalent, for instance , then so are 'P ---> Q ' and ' ,-,p ------> Q '. Consequently, we can apply invertible rules to subformulas as well as formulas. So we can use -,-, to move from ',-,p' to 'P', but also from '-,-,p ---> Q' to 'P ---> Q '. The derivable rules are often tremendous time-savers, as attempts to show that they are derivable from the basic rules will demonstrate. We 'll close this chapter by summarizing strategy hints . Overall proof strategies derive , most significantly, from what one is trying to prove , and, secondarily, from what one already has. This table contains some of the most important strategies. In all cases, a direct proof is easiest when it


can be achieved. But, if it's not obvious how to prove the conclusion directly, then use the strategies listed.

Proof Strategies

If you are trying to get:

1. ¬A, try using indirect proof.
2. A&B, try proving A and B separately.
3. A ∨ B, try indirect proof.
4. A → B, try using conditional proof.
5. A ↔ B, try to prove the two conditionals A → B and B → A.

If you have:

1. ¬A, try using it with other lines that have A as a part, or use a derivable rule.
2. A&B, try using &E to get A and B individually.
3. A ∨ B, try (i) getting the negation of one disjunct, and using ∨E* to get the other, or (ii) using ∨E by taking each case separately.
4. A → B, try (i) getting A and then reaching B by →E, or (ii) getting ¬B and then reaching ¬A by →E*.
5. A ↔ B, try (i) getting either component and then reaching the other by ↔E, or (ii) getting the negation of either component and then the negation of the other by ↔E*.

These strategies indicate how to construct proofs of various kinds. They can serve as a helpful guide in a wide variety of situations. Sometimes, however, the obvious ploys may not work. When this happens, there are two "safety valves": strategies that work well when pressure is high. First, when in doubt, use indirect proof. Anything provable can be proved with an indirect proof. Second, within an indirect proof, if it's not clear what to try to prove, choose a sentence letter, and try to prove it. The assumption for indirect proof should lead to a contradiction, so absolutely anything should follow. No matter what letter you select, therefore, you should be able to prove it.
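Though these rules belong to a system of natural deduction, their validity is a semantic fact that can be checked mechanically. The following sketch is an illustrative aside, not part of our formal system; the helper `valid` and the encodings of the connectives as Python operations are our own.

```python
from itertools import product

def valid(premises, conclusion, letters):
    """An argument form is valid iff no assignment of truth values
    to its sentence letters makes every premise true and the
    conclusion false."""
    for values in product([True, False], repeat=len(letters)):
        v = dict(zip(letters, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

# ->E* (modus tollens): A -> B; not-B; therefore not-A
mt = valid([lambda v: (not v['A']) or v['B'],   # A -> B
            lambda v: not v['B']],              # not-B
           lambda v: not v['A'],                # therefore not-A
           'AB')

# vE* (disjunctive syllogism): A v B; not-A; therefore B
ds = valid([lambda v: v['A'] or v['B'],         # A v B
            lambda v: not v['A']],              # not-A
           lambda v: v['B'],                    # therefore B
           'AB')

print(mt, ds)  # True True
```

The same helper exposes invalid forms: denying the antecedent (A → B; ¬A ∴ ¬B) fails, since the assignment making A false and B true satisfies both premises while falsifying the conclusion.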

Problems

Construct a deduction to show that each of these arguments is valid.

1. If you are ambitious, you'll never achieve all your goals. But life has meaning only if you have ambition. Thus, if you achieve all your goals, life has no meaning.

2. God is that, the greater than which cannot be conceived. If the idea of God exists in our understanding, but God does not exist in reality, then something is conceivable as greater than God. If the idea of God exists in our understanding, therefore, God exists in reality.

3. God is omnipotent if and only if He can do everything. If He can't make a stone so heavy that He can't lift it, then He can't do everything. But if He can make a stone so heavy that He can't lift it, He can't do everything. Therefore, either God is not omnipotent, or God does not exist.


4. If the objects of mathematics are material things, then mathematics can't consist entirely of necessary truths. Mathematical objects are immaterial only if the mind has access to a realm beyond the reach of the senses. Mathematics does consist of necessary truths, although the mind has no access to any realm beyond the reach of the senses. Therefore the objects of mathematics are neither material nor immaterial.

5. If the President pursues arms limitations talks, then, if he gets the foreign policy mechanism working more harmoniously, the European left will acquiesce to the placement of additional nuclear weapons in Europe. But the European left will never acquiesce to that. So either the President won't get the foreign policy mechanism working more harmoniously, or he won't pursue arms limitations talks.

6. If we introduce a new product line, or give an existing line a new advertizing image, then we'll be taking a risk, and we may lose market share. If we don't introduce a new product line, we won't have to make large expenditures on advertizing. So, if we don't take risks, we won't have to make large expenditures on advertizing.

7. If we can avoid terrorism only by taking strong retaliatory measures, then we have no choice but to risk innocent lives. But if we don't take strong retaliatory measures, we'll certainly fall prey to attacks by terrorists. Nevertheless, we refuse to risk innocent lives. Consequently, terrorists will find us, more and more, an appealing target.

8. If God is all powerful, He is able to prevent evil. If He is all good, He is willing to prevent evil. Evil does not exist unless He is both unwilling and unable to prevent it. If God exists, He is both all good and all powerful. Therefore, since evil exists, God does not.

9. My cat does not sing opera unless all the lights are out. If I am very insistent, then my cat sings opera; but if I either turn out all the lights or howl at the moon, I am very insistent indeed. I always howl at the moon if I am not very insistent. Therefore, my lights are out, I am very insistent, and my cat is singing opera.

10. If we continue to run a large trade deficit, then the government will yield to calls for protectionism. We won't continue to run a large deficit only if our economy slows down or foreign economies recover. So, if foreign economies don't recover, then the government will resist calls for protectionism only if our economy slows down.

11. If companies continue to invest money here, then the government will sustain its policies. If they don't invest here, those suffering will be even worse off than they are now. But if the government sustains its policies, those suffering will be worse off. Thus, no matter what happens, the suffering will be worse off.

12. We cannot both maintain high educational standards and accept almost every high school graduate unless we fail large numbers of students when (and only when) many students do poorly. We will continue to maintain high standards; furthermore, we will placate the legislature and admit almost all high school graduates. Of course, we can't both placate the legislature and fail large numbers of students. Therefore, not many students will do poorly.

Construct deductions to demonstrate the validity of these argument forms.

13. P ∨ Q; ¬P ∴ Q

14. P; ¬P ∴ Q

15. P → Q; ¬Q ∴ ¬P

16. P ↔ Q; ¬P ∴ ¬Q

17. P ↔ Q; ¬Q ∴ ¬P

18. P ↔ Q ∴ P → Q

19. P ↔ Q ∴ Q → P

20. P&Q ∴ Q&P

21. (P&Q)&R ∴ P&(Q&R)

22. P ∨ Q ∴ Q ∨ P

23. (P ∨ Q) ∨ R ∴ P ∨ (Q ∨ R)

Use deduction to solve each of these problems.

24. Holmes and Watson question three suspects: Peters, Quine and Russell. Hearing that their responses conflict, Holmes declares, "If Peters and Quine are telling the truth, then Russell is lying." Watson seemingly assents, saying, "Indeed, at least one of them is telling us a falsehood." Irritated, Holmes insists, "That's not all, my dear Watson! We know that Russell is the trickster, if the other two are telling us the truth!" Show that Holmes' irritation is unjustified by showing that his original statement is equivalent to Watson's.

25. Jones, feeling upset about the insecurity of the Social Security system, sighs that he faces a dilemma: "If taxes aren't raised, I'll have no money when I'm old. If taxes are raised, I'll have no money now." Smith, ever the even-tempered one, reasons that neither of Jones' contentions is true. Jones answers, "Aha! You've contradicted yourself!" Show that Smith's assertion that both Jones' claims are false is indeed contradictory.

26. Roger, a hapless accounting major, is trying to analyze a problem on an accounting exam. He needs to figure the tax liability of a corporation engaged in overseas shipping. Some of the fleet counts as American for tax purposes, and some does not. Poor Roger recalls the definition in the Tax Code of an American vessel as running like this: "Something counts as an American vessel if and only if (1) it is either numbered or registered in the U.S., or (2) if it is neither registered nor numbered in the U.S., and is not registered in any foreign country, then either its crew members are all U.S. citizens, or they are all employees of U.S. corporations." Show that this is the wrong definition, by showing that it implies that if something is registered in a foreign country, it counts as an American vessel.

27. On the way to the barber shop (adapted from Lewis Carroll): you are trying to decide which of three barbers, Allen, Baker and Carr, will be in today. You know Allen has been sick, and so reason that (1) if Allen is out of the shop, his good friend Baker must be out with him. But, since they never leave the shop untended, (2) if Carr is out of the shop, then, if Allen is out with him, Baker must be in. Show that (1) and (2) imply (a) that not all three are out; (b) that Allen and Carr are not both out; (c) that, if Carr and Baker are in, so is Allen.

Show that each of our derivable rules is in fact derivable from the basic rules, by using only basic rules to prove valid these argument forms.

28. * ¬(P&Q) ∴ ¬P ∨ ¬Q

29. * ¬P ∨ ¬Q ∴ ¬(P&Q)


30. * ¬(P ∨ Q) ∴ ¬P&¬Q

31. * ¬P&¬Q ∴ ¬(P ∨ Q)

32. * ¬(P → Q) ∴ P&¬Q

33. * P&¬Q ∴ ¬(P → Q)

36. * P → Q ∴ ¬P ∨ Q

37. * ¬P ∨ Q ∴ P → Q

Show that the following argument forms are valid by using deduction.

38. * ¬(P → ¬Q); R → (¬P ∨ ¬Q); (R ∨ S) ↔ (Q ∨ R); (¬Q&Q₁) ∨ (S → P); ¬(¬R → ¬P) ∴ ¬S ∨ Q

41. * P ↔ (Q ↔ R); (¬Q ↔ S) → ¬P; P ∨ Q₁ ∴ (R&S) ↔ Q₁

42. * (P ↔ Q) ↔ R; ¬(P ↔ ¬R); (Q ∨ ¬S) → S₁ ∴ S₁

43. * ¬S → ¬S₁; (S&S₁) → (P ↔ Q); ¬(¬P ∨ Q) ∴ S₁ → ¬S₁

44. * (Q ↔ ¬P) → ¬R; (¬Q&S) ∨ (P&Q₁); (S ∨ Q₁) → R ∴ P → Q

45. * (P&¬R) ↔ (S ∨ ¬Q); S₁&((¬S&¬R) ↔ P); (S₁ → Q) ∨ (S₁ → R) ∴ Q&R

Chapter 4

Quantifiers

Aristotle's theory of the syllogism treats only arguments having a very restricted form. Every sentence must have the structure 'All F are G', 'No F are G', 'Some F are G' or 'Some F are not G', where F and G are general terms, expressions true or false of individual objects. Every syllogistic argument must have two such sentences as premises and one as conclusion, with the terms meshing in just the right way. As a result, syllogistic logic covers a rather limited domain. Sentential logic, by taking sentences as basic units of analysis, oversees a broader realm. It can handle arguments with any number of premises and sentences of any length and with any degree of complexity. But it too suffers from narrow horizons. Just as the theory of the syllogism can't solve problems characteristic of sentential logic, so sentential logic fails to solve syllogistic problems. Consider even a simple example:

1. All cows are mammals. All mammals are animals. ∴ All cows are animals.

This argument is surely valid. Yet sentential logic cannot explain why. It has no choice but to construe (1) as

2. C; M ∴ A

which plainly fails to be valid. The same happens with any syllogism. The validity of arguments such as (1) depends on the structure within sentences, not on the structure relating distinct sentences. No theory that declines to analyze what sentential logic calls "atomic" sentences can hope to account for syllogistic reasoning. The split between syllogistic and sentential logic, which began in Greece in the third or fourth century B.C., persisted for more than two thousand years. Neither theory could account for arguments that the other took as paradigms of correct reasoning. In the nineteenth century, sentential logic became symbolic, attaining a fairly high level of mathematical sophistication. The symbolization of logic paved the way to a unification of the sentential and syllogistic realms. Two logicians working independently, Gottlob Frege and Charles Sanders Peirce, overcame the ancient divergence between syllogistic and sentential logic in 1879. They introduced symbols representing determiners, such as 'all', 'some', 'no', 'every', 'any', etc. Frege and Peirce used two symbols: the universal quantifier, which we will write '∀' (Peirce's work has 'Π'), and the existential quantifier, '∃' (in Peirce, 'Σ'). The universal quantifier corresponds roughly to the English 'all', 'every' and 'each'; the existential quantifier, to the English 'some', 'a' and 'an'. To see how introducing quantifiers proved to be the crucial move in unifying sentential and syllogistic logic, notice that the theories use different basic units. Sentential logic treats sentences lacking connectives as unanalyzed, while syllogistic logic so treats general terms. Frege and Peirce combined a scheme relying on general terms with another relying on sentences by making general terms behave, in essence, as if they were sentences. Their move enabled sentential tools to work on syllogistic and, as it turned out, far more complex arguments.

4.1 Constants and Quantifiers

Sentential logic is limited precisely because it takes sentences as its basic, unanalyzed units of explanation. To expand this logic to include syllogistic arguments, we must look inside "atomic" sentences to see how they are put together. In general, they consist of a main or subject noun phrase and a main verb phrase. We can gain some insight into the consolidation of sentential and syllogistic logic by examining the character of verb phrases. Consider a few examples:

3. (a) is a man
(b) knows some people who live in Oklahoma City
(c) sleeps very soundly
(d) kicked the ball into the end zone
(e) thought that Yosemite would be more fun to visit
(f) gave Fred a copy of the letter

All verb phrases are general terms in the sense of syllogistic logic. They are true or false of individual objects. Pick any object you like; it will either be a man or not be a man. It will either sleep very soundly or not sleep very soundly. It will be true that either it gave Fred a copy of the letter or it didn't. Alone, verb phrases and other general terms are not true or false. But we can think of them as yielding a truth value when combined with an object. Objects of which the verb phrase or general term is true satisfy it; the verb phrase or general term applies to them. Equivalently, we can think of verb phrases and other general terms as classifying objects into two categories: those of which they are true, and those of which they are false. The first category, the set of objects of which the verb phrase or general term is true, is called its extension. Verb phrases combine with noun phrases to form sentences. Since verb phrases are expressions that are true or false of particular objects, noun phrases must specify an object, or a group of objects, and say something about the application of the verb phrase to them. Noun phrases take several forms. In this chapter, we'll concentrate on two of them. First, noun phrases may simply pick out a single object by naming it. A sentence containing a proper name as its subject will be true if the verb phrase is true of that object, and false otherwise. Each of these sentences results from combining a proper name with a verb phrase from (3):

4. (a) Socrates is a man
(b) Maria knows some people who live in Oklahoma City
(c) Mr. Hendley sleeps very soundly
(d) Barr kicked the ball into the end zone
(e) Penelope thought that Yosemite would be more fun to visit
(f) Nate gave Fred a copy of the letter

To suggest that (4)a. is true just in case the verb phrase 'is a man' is true of Socrates, we could write it in the form (5)a., or, more simply, (5)b.:

5. (a) is a man(Socrates)
(b) Man(Socrates).

We can read (5)b. as saying that man applies to, or is true of, Socrates. Our formal language will thus need at least two kinds of symbol to represent sentences such as (4)a. First, we'll use lower case letters from the beginning of the alphabet, with or without numerical subscripts, to represent proper names; we'll call these symbolic names individual constants or, more simply, constants. Second, we'll use upper case letters from the middle of the alphabet, with or without numerical subscripts, as predicate constants, or more simply, predicates. Each predicate comes with a number assigned to it; predicates are called n-ary if they are assigned the number n. Every predicate yields a truth value when combined with a certain number of objects. The assigned number indicates of how many objects at once the predicate is true or false. Recall that a general term is true or false of individual objects. We will therefore symbolize simple general terms such as man with singulary predicates, that is, predicates to which 1 is assigned. These predicates are true or false of single objects, and produce sentences when combined with a single proper name. Other predicates yield sentences only when combined with two names; they are true or false of two objects taken together. Such predicates are assigned the number 2, and are called binary. They are useful for symbolizing, among other things, transitive verbs. Respect, for example, applies not to objects taken individually, but to objects taken in pairs. It requires a direct object. We can ask whether Robin respects Julia, but not simply whether Robin respects. Letting 'a' symbolize Socrates, and 'M' symbolize man, we can symbolize 'Socrates is a man' by

6. Ma,

which is merely a symbolic version of (4)a. 'Ma' is a formula of quantificational logic. One way of building formulas, then, is to combine individual constants with predicates.
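The idea that a predicate yields a truth value when combined with the right number of objects can be made concrete. In the sketch below (an illustration of ours, not the book's notation), a predicate is modeled by its extension: a set of objects for a singulary predicate, a set of ordered pairs for a binary one.

```python
# Model predicates by their extensions. The domain and names below
# are invented for illustration.
domain = {'socrates', 'robin', 'julia', 'fido'}

M = {'socrates', 'robin', 'julia'}   # singulary: x is a man (person)
R = {('robin', 'julia')}             # binary: x respects y

def applies(predicate, *objects):
    """A predicate combined with the right number of objects
    yields a truth value."""
    key = objects[0] if len(objects) == 1 else objects
    return key in predicate

print(applies(M, 'socrates'))        # Ma: True
print(applies(R, 'robin', 'julia'))  # Robin respects Julia: True
print(applies(R, 'julia', 'robin'))  # order matters for binary predicates: False
```

Note that the binary predicate is true of the pair (Robin, Julia) but not of (Julia, Robin); pairs are taken in order, just as 'Robin respects Julia' and 'Julia respects Robin' are distinct sentences.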
A second sort of noun phrase we'll consider in this chapter consists of a determiner, such as every or some, together with a common noun such as man or truck. The common noun may be modified by adjectives, adjectival phrases, prepositional phrases, or relative clauses. Whatever its grammatical structure, however, the modified noun will constitute a general term. Noun phrases of this more complex sort, when combined with the verb phrases in (3), yield the sentences

7. (a) One reporter who covered the match is a man
(b) A few friends know some people who live in Oklahoma City
(c) Every endomorph sleeps very soundly
(d) Several prospects kicked the ball into the end zone
(e) A taxi driver thought that Yosemite would be more fun to visit
(f) Nobody gave Fred a copy of the letter

Nobody, in (7)f., is a special case; the word itself contains both a determiner and a general term, and is equivalent to no person. We can gain some insight into the structure of the sentences in (7) by examining some related sentences that contain subject noun phrases that, while complex, consist only of determiners and the rather colorless general terms thing and object. To begin, consider the sentence

8. Something is missing.

To symbolize this sentence, we can't use any individual constant for something. We don't want to say that missing applies to any object in particular. The sentence says that missing applies to some object. So, using our earlier strategy, we could try writing

9. Missing(something).

But it is extremely useful to think of missing, a general term, as true or false of objects. That, after all, is what motivated us to introduce the pattern of (9) in the first place. We would like Missing(Ralph) to mean that Missing is true of Ralph. The name Ralph picks out a particular entity. But, if we write Missing(something), although we can continue to read it as saying that Missing is true of something, the something picks out no object in particular. The form of (9) tempts us to ask, "Well, then, what does something stand for?" Medieval logicians grappled with this problem for centuries without producing a fully satisfactory solution. The problem, according to modern logic, is that this is the wrong question. We don't want to say that Missing is true of the object denoted by something, but rather that Missing is true of some object. Quantification theory allows us to say, in effect, that x is missing is true for some object x. So, instead of thinking of a complex noun phrase such as something as picking out some object or objects satisfying a general term, we can indicate that the general term is true of x for some value of x. The existential quantifier has exactly this function. To express the idea that x is missing is true for some x, we write the symbolic equivalent of

10. (for some x)(x is missing),

which is

11. ∃xMx.

We may read (11) as saying the following:

12. (a) for some x, x is missing
(b) some x is of such a kind that x is missing
(c) there is an x of such a kind that x is missing
(d) an x is of such a kind that x is missing.

In good English, these become

13. (a) something is missing
(b) there is something missing
(c) an object is missing.

Notice that the words thing and object serve in English much as variables such as x serve in quantification theory. Variables link quantifiers to the predicates they accompany. This strategy, then, requires two new sorts of symbol. First, we must introduce individual variables, or, more simply, variables. Variables will be lower case letters from the end of the alphabet, again with or without numerical subscripts. They act, in quantificational logic, much as variables for numbers such as 'n' or 'x' act in number theory or algebra. They stand for no objects in particular; instead, they mark places where names of particular objects could go. Second, the language of quantification theory must include quantifiers: '∃', the existential quantifier, and '∀', the universal quantifier.
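Over a finite domain the two quantifiers behave exactly like generalized disjunction and conjunction, which Python's any and all compute. The sketch below is our illustration, not part of the formal language; the domain and predicate are invented.

```python
# Over a finite domain, the existential quantifier behaves like
# `any` and the universal quantifier like `all`.
domain = ['key', 'wallet', 'phone']   # an invented domain
missing = {'wallet'}                  # extension of M, 'x is missing'

def exists(pred, dom):
    return any(pred(x) for x in dom)  # there is some x with pred(x)

def forall(pred, dom):
    return all(pred(x) for x in dom)  # every x has pred(x)

Mx = lambda x: x in missing

print(exists(Mx, domain))  # ∃xMx, 'something is missing': True
print(forall(Mx, domain))  # ∀xMx, 'everything is missing': False

# The choice of variable is immaterial: ∃yMy makes the same claim.
My = lambda y: y in missing
print(exists(My, domain) == exists(Mx, domain))  # True
```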

To take another simple example, suppose we want to say, recalling the title of a popular song, that everything is beautiful. To say that Pittsburgh is beautiful, or that Ingrid Bergmann is beautiful, we can introduce the predicate 'B' and the individual constants 'p' and 'b' and write 'Bp' and 'Bb'. To say that everything is beautiful, however, we need to say that 'x is beautiful' is true for every object x. The universal quantifier does just this. We can prefix the quantifier to 'Bx', writing the symbolic equivalent of

14. (for every x)(x is beautiful),

or

15. ∀xBx.

(15) says

16. (a) for every x, x is beautiful
(b) for all x, x is beautiful
(c) for each x, x is beautiful
(d) for any x, x is beautiful
(e) every x is of such a kind that x is beautiful
(f) all x are of such a kind that x is beautiful
(g) each x is of such a kind that x is beautiful
(h) any x is of such a kind that x is beautiful

or, in plain English,

17. (a) everything is beautiful
(b) all things are beautiful
(c) each object is beautiful
(d) any object is beautiful.

All, every, each and, usually, any thus receive the same symbolization. In English, these words differ subtly but significantly in meaning. Note, for example, that though 'any object is beautiful' seems to mean just what 'everything is beautiful' does, 'anything is beautiful' sounds strange. Quantification theory cannot capture all the differences between these determiners, but it can capture some, as a later section of this chapter will show. The theory, despite its idealizations, succeeds surprisingly well in explaining the validity or invalidity of English arguments depending on these determiners. Variables, which in effect represent English words such as thing and object, have no meanings independent of their symbolic context. In (11) and (15), 'M' and 'B' represent missing and beautiful, respectively. They cannot be interchanged without changing the translation manual relating English sentences to their symbolic representations. Variables, in contrast, can be interchanged, with very few restrictions, without altering meaning. Something is missing could just as well be symbolized by

18. ∃yMy

as by (11), and Everything is beautiful could just as well be represented by

19. ∀zBz

as by (15). This is true because the variables 'x', 'y' and 'z' themselves have no significance. They serve only to link quantifiers to predicates, and mark places where constants could be placed. Any sort of mark could do the same job.


4.2 Categorical Sentence Forms

Sentences such as something is missing or everything is beautiful carry one only so far. Most of the time, we want to say something about, for example, some people or every frog, not about just something or everything. We need to be able to handle sentences with subject noun phrases that contain more complicated general terms. If we could do this, it would be easy to represent any sentence having one of the four classic categorical forms in quantification theory. Consider first universal affirmative sentences, having the structure

20. All F are G.

We might want to represent, for instance,

21. All frogs swim.

We already know that 'Sa' can represent, say, Albert swims, and that

(a) ∃x(Fx → Gx) → ∀x∃yHxy; (∃x(Fx → Gx) → ∀x∃yHxy)
(b) ∀z∀w∃t(Fzt&Gwz) ↔ P; (∀z∀w∃t(Fzt&Gwz) ↔ P)

Third, we'll allow brackets and braces to count as sloppily drawn parentheses when that increases the readability of a formula. Fourth, because we'll usually drop predicate superscripts, we'll avoid using the same capital letter as both a monadic and a polyadic predicate, or as both a sentence letter and a predicate letter, within the same formula; though

51. ∃x∀y(Fxy → Fa)

is correctly formed, using 'F' to represent two English expressions at once tends to be confusing; the formula, properly written out, would be '∃x∀y(F²xy → F¹a)', which makes it clear that there are two different predicates. Fifth, lower case letters can be constants or variables; the vocabulary specifies that 'a' through 's' are constants, and 't' through 'z' are variables. We'll maintain this usage strictly only for variables, which are similar enough to constants in function to cause confusion. Several points about formulas of Q deserve mention. Note that only variables may appear with quantifiers; '∃aFa' is not a formula. Neither is '∀p(p → q)' nor '∃FFa'. Intuitively, individual constants and variables take objects as values. In Q we can quantify over objects, speaking about all objects of a certain kind, or some objects of that kind, etc. We cannot do the same with sentences or predicates. Because it allows quantification over individuals alone, Q is a system of first-order quantification, sometimes called first-order logic. Other logical theories, called higher-order logics, do allow quantification over sentences and predicates, but at the price of substantial complication.

Problems

Evaluate each of the following as (a) an official formula of Q, (b) a conventional abbreviation of a formula of Q, or (c) neither of the above.


3. Fx → Fy
4. Fa → Fc
5. (Fx → Fy)
6. (F¹a → F¹b)
7. ∃xFx → Fx
8. ∃x(Fx → Fx)
9. ∃xF¹x → F¹a
12. ∀x∀yFxy → Fyx
13. ∀x∀yF²xy → F²ab
14. ∀x∀yFxy → Fa
15. ∀x∀y(Fxy → Fyx)
16. ∀x∀yFxy → ∀x∀yFyx
17. (∀x∀yF²xy → ∀x∀yF²yx)
18. ∀x∀yFxy → Fy
19. ∀xFx → ∃xFx
20. ∀x(Fx → ∃xFx)
21. (∀xF¹x → ∃xF¹x)
22. ∃x∀yGy
23. ∀x∀y∀b(Fxy ∨ Fyb)
24. ∀x∃F(Fx → Fa)
25. G¹y
26. G²b
28. ∀xGxy
29. ∀yG¹y → G¹z
30. ∀y(G¹y → G¹z)
31. ∀yGy → Gy
32. ∀y(G¹y ↔ G¹y)
33. ∀xF²xy → ∀yG²yx
34. ∀x(Fxy → ∀yGyx)
35. ∀x∀y(Fxy → Gyx)
36. ∀y(∀xFxy → Gyx)
37. ∀x∀y(∃zFyz ↔ (Gy&Hzx))
38. ∀x(∀y∃zF²yz ↔ (G¹y&H²zx))
39. ∀x∀y∃z(Fyz ↔ (Gy&Hzy))
40. (∀x∀y∃z(Fyz&Gx) ↔ Hzy)

Taking each expression below as A, write, where possible, (a) A[c/d], (b) A[d/c], (c) A[x/c], (d) A[y/d], and (e) A[y/x], and say whether the result in each case is a formula. (Count abbreviations of formulas as formulas.) If the substitution is impossible, say so.

41. Hcd
42. Hcc
43. Hcx
44. Hxy
45. ∀xFx ↔ Gc
46. ∀x(Fx ↔ Gc)
47. ∀xFxc ↔ ∃xFdx
48. ∀xFxc ↔ ∃yFdy
49. ∀x(Fxc ↔ ∃yFdy)
50. Fxc ↔ ∃yFdy
51. ∃xFxc&∀xFxd
52. Fxc&∀xFxd
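The substitutions these problems ask for are purely mechanical. As an illustrative aside, and on the reading of A[c/d] as the result of replacing every occurrence of 'd' in A with 'c' (an assumption of ours; check it against the text's definition), substitution in this chapter's one-letter notation is a character replacement:

```python
# Our own helper, not the book's definition: since each constant or
# variable here is a single letter, A[new/old] can be computed as a
# character-for-character replacement.
def substitute(formula, new, old):
    """Return formula with every occurrence of the term `old`
    replaced by the term `new`."""
    return formula.replace(old, new)

print(substitute('Hcd', 'c', 'd'))  # Hcd[c/d] = Hcc
print(substitute('Hcd', 'x', 'c'))  # Hcd[x/c] = Hxd, an open formula
# Substituting a variable where a quantifier already binds it
# produces capture: the new 'x' falls under the initial ∀x.
print(substitute('∀x(Fx ↔ Gc)', 'x', 'c'))  # ∀x(Fx ↔ Gx)
```

The last line illustrates why some substitutions yield a formula that no longer says anything about the constant involved: the substituted variable is captured by the quantifier.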

4.5 Translation

With the addition of polyadic predicates, quantification theory has the power to express and evaluate a very large group of sentences and arguments. In this section we'll present a guide to representing English sentences in the theory. To translate even a simple sentence, we must distinguish its grammatical subject from its grammatical predicate. Because the word predicate takes on a different meaning in logic, we've called grammatical subjects subject (or main) noun phrases, and grammatical predicates main verb phrases. As we've indicated, it is easy to translate simple sentences such as All men are mortal, Some computers are not reliable, and Nobody admires everybody into Q. But noun and verb phrases may become far more complex. After listing some of the ways in which this can happen, we'll explain how quantification theory can incorporate them.

4.5.1 Noun Phrases

Some noun phrases are easy to handle in Q. Proper names translate as individual constants; common nouns, such as woman and airplane, translate as monadic predicates. But here the simplicity ends.

Determiners

The determiners all, each, any and every translate as universal quantifiers, while some and a(n) generally translate as existentials. We say "generally" because even these rules have exceptions. First, a and an have a generic use, where they seem to talk about typical members of a kind. Thus

59. A whale is a mammal

is not saying that some whales are mammals, but that all are. In these cases, a and an correspond roughly to universal quantifiers. Second, a, an and some all interact with conditionals when they are part of the antecedent.

60. If you steal something, you'll get into trouble

can be translated straightforwardly as a conditional with a quantified antecedent (where a represents you):

61. ∃xSax → Ta.

But, when the consequent contains a word that refers back to something in the antecedent, this doesn't work. We could try to translate

62. If you steal something, you'll pay for it

as

63. ∃xSax → Pax

but this is not a formula; the final occurrence of 'x' is free (not within the scope of an appropriate quantifier phrase). Changing the parentheses so that the quantifier has scope over the entire formula in itself does not help.

64. ∃x(Sax → Pax)

is a formula, but it says the wrong thing. Because of the nature of the conditional, it is equivalent to

65. ∃x(¬Sax ∨ Pax).

But this says that there is an object that either you don't steal or you pay for. And this is true if there is an object you don't steal. But the same does not hold of (62), the truth value of which is not determined by whether or not there are things you don't steal. To represent (62), we must use a universal quantifier with the entire formula as its scope:

66. ∀x(Sax → Pax).

This says that everything you steal, you pay for, which is equivalent to (62). So, in certain cases in which they appear in the antecedent of a conditional, with reference back to the antecedent in the consequent, the determiners a, an and some correspond to universal quantifiers in Q. Notice that If you steal something, you'll pay for it is equivalent to If you steal anything, you'll pay for it. Any translates as a universal quantifier, but with the widest possible scope. Each usually takes wide scope among quantifiers; any, however, demands wide scope over connectives as well. Every and all make no such demand. This explains why the above sentences are not equivalent to If you steal everything, you'll pay for it and If you steal each thing, you'll pay for it. It also explains why any often seems similar to some or a. John didn't see any deer is equivalent to John didn't see a deer, not to John didn't see every deer, because any appears as a universal quantifier to the left of the negation sign.

(for every x, if x is a time and, for some y, I've placed my hopes in y at x, then I've been disappointed at x)
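The difference between (64) and (66) is easy to exhibit in a small model. The sketch below is our illustration ('a' is you, S is steals, P is pays for; the particular domain is invented): the existential reading comes out true merely because something goes unstolen, while the universal reading correctly tracks whether you pay for what you steal.

```python
# A model in which you steal one thing, pay for nothing,
# and one object goes unstolen.
domain = ['bike', 'book']
steals = {'bike'}     # Sax: you steal x
pays_for = set()      # Pax: you pay for x

S = lambda x: x in steals
P = lambda x: x in pays_for

# (64) ∃x(Sax -> Pax): true just because 'book' goes unstolen,
# even though you stole the bike and never paid for it.
wrong = any((not S(x)) or P(x) for x in domain)

# (66) ∀x(Sax -> Pax): correctly false in this situation.
right = all((not S(x)) or P(x) for x in domain)

print(wrong, right)  # True False
```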

Relative clauses, in general, translate quite easily. Only one minor wrinkle ruins their simplicity. Some relative clauses restrict the group of things the noun phrase they modify applies to. If I tell you that everyone I know prefers Mexican to Chinese food, then I am speaking, not of everyone, but just of everyone I know. (74a.) and (74c.) contain such restrictive relative clauses. Other clauses, however, make almost parenthetical comments about their nouns or noun phrases. (74b.) contains such an appositive relative clause. Most relative clauses in actual discourse are restrictive. To tell whether a given clause is restrictive or appositive, ask whether the clause is helping to specify what the sentence is talking about or providing additional information concerning an already determinate topic. English does offer two linguistic hints. F irst, that often signals that a clause is restrictive; which, with some exceptions (for example , the phrases in which and with which), often signals an appositive. Relative clauses often begin with other wh-words , however, or with no special word at all. In these cases, there are no signals. Furthermore, the use of that and which is not very firmly established; these words are unreliable guides. Second, and more reliably, commas often do, and always can, set appositive clauses off from the rest of the sentence. Restrictives, on the other hand, reject commas in this role. So virtually all relative clauses set off by commas are appositive. For those not set off by commas, there is a simple test; try inserting commas. If the result sounds acceptable, the clause is probably appositive. Otherwise, it is restrictive. Restrictives and appositives, in symbolic representations, both connect to the remainder of the formula by conjunction. Most of the time, therefore, it makes no difference to the translation whether a given clause is restrictive or appositive. 
When universal quantifiers are involved, however, and the clause modifies the subject noun phrase, it does matter. Consider these sentences: 76.

(a) All the Democratic candidates for President, who are already campaigning, support labor unions. (b) All the Democratic candidates for President who are already campaigning support labor unions.

The only difference between them is the pair of commas setting off the relative clause in (76a.). In that sentence, the clause is clearly appositive. It asserts that all the Democratic candidates for President support labor unions, and remarks, on the side, as it were, that all those candidates are already campaigning. (76b.), in contrast, does not claim that all the Democratic candidates support labor unions; it asserts only that all those who are already campaigning do so. (76b.) is thus a weaker contention than (76a.). To translate these sentences, first translate the relative clause. Who are already campaigning derives from x is (are) already campaigning, which we can write as 'Cx'. Adopting the obvious representations (and letting 'Lx' correspond to x supports labor unions), then, we can symbolize the sentences in (76) as 77.

(a) ∀x(Dx → Lx) & ∀x(Dx → Cx) (b) ∀x((Dx & Cx) → Lx).

Note that the restrictive clause is conjoined, in effect, to the rest of the subject; the appositive clause, to the rest of the entire sentence.

Prepositional Phrases

Prepositions are rather ordinary English words such as in, to, of, about, up, over, from, and so on. They combine with noun phrases to form prepositional phrases, which act as either adjectives or adverbs: thus up a creek, from Pennsylvania, and in the middle of Three Chopt Road. We'll discuss those acting as adverbs, which translate together with the verbs or adjectives they modify as single units, in a few pages. Here we'll talk about prepositional phrases modifying nouns, which have separate translations. In prepositional phrases that function more or less as adjectives, prepositions relate two noun phrases. They thus translate into Q as dyadic predicates. The representatives of prepositional phrases themselves connect to the symbolizations of the noun phrases they modify by conjunction. Consider these examples: 78.

(a) Everyone from Pittsburgh loves the Steelers. (b) If I don't meet you, I'll be in some jail.

(78a.) contains the prepositional phrase from Pittsburgh. Since from translates into a dyadic predicate, say 'F' (and since the Steelers here functions as the proper name of a team), (78a.) becomes 79. ∀x((Px & Fxp) → Lxs). (78b.) contains the prepositional phrase in some jail, which itself contains a determiner. The conjunction of prepositional phrase to noun phrase, then, occurs within the scope of a quantifier: 80. ¬Hab → ∃x(Jx & Iax)

(where 'a' and 'b' symbolize I and you, respectively) symbolizes (78b.). Prepositional phrases modifying nouns thus translate readily into quantification theory.

4.5.2 Verb Phrases

So far we've discussed how noun phrases and their modifiers translate into quantification theory. Since sentences consist of a subject noun phrase and a verb phrase, however, we also need to explain how to symbolize verb phrases and their modifiers in Q. In any verb phrase, of course, there is a verb. Verbs fall into several categories, depending on their ability to take certain kinds of objects. Some verbs are intransitive; they cannot take objects at all. Fall, walk and die are all intransitive. Transitive verbs take noun phrases as direct objects. Examples are throw, win and send. Some of these, such as give, also take noun phrases as indirect objects. Other verbs take sentences, or grammatical constructions closely related to sentences, as objects. Believe, know and persuade are such clausally complemented verbs. The logic of verbs taking sentential complements remains the subject of much debate. Here, therefore, we'll consider only transitive and intransitive verbs. Note that many verbs fall into more than one category. Eat, for example, can have a noun phrase object (in, for example, We eat spaghetti every Wednesday night), but does not need one (Let's eat out). Believe can take a sentence (I believe that God exists) or a noun phrase (I believed him). Intransitive verbs translate into Q as monadic predicates. John walks, for instance, becomes 'Wj'; everyone who doesn't own a car walks becomes 81. ∀x((Px & ¬∃y(Cy & Oxy)) → Wx). Transitive verbs translate into Q as polyadic predicates. Usually, they become dyadic predicates; 'Lmf' represents Mary loves Fred, etc. Occasionally, however, a verb relates more than two noun phrases. Mike gave John War and Peace, for example, translates as 'Gmjp'. In general, predicates of more than two places prove very useful in symbolizing sentences with indirect objects or adverbial modifiers of certain kinds. This, however, raises the general issue of adverbs.

Adverbial Modifiers

Adverbs, such as quickly, well, anytime and somewhere, modify verbs. They specify how, when or where a certain condition holds or a certain activity occurs. Unfortunately, most adverbs have no direct symbolizations in quantificational logic. Q must represent them, together with the verbs they modify, as single units; expressions such as walks slowly or plays well become predicates such as 'W' or 'P'. Though some logicians have attempted to devise schemes for representing adverbs in Q, none has managed to find a way of translating all adverbs accurately. Some adverbs, however, do translate into quantificational logic. We'll call always, anytime, whenever, wherever, anywhere, sometime, etc., adverbs of quantification. Consider the sentence I like Alfred sometimes. I like Alfred, normally, would become 'Lia'. So how do we represent 'Lia, sometimes'? Some is a determiner. So the sentence is saying, in effect, that, for some times x, I like Alfred is true at x. Instead of 'Lia', then, we need 'Liax', meaning I like Alfred at x. I like Alfred sometimes thus becomes 82. ∃x(Tx & Liax).

Similarly, consider Everywhere I look there are timeshare resorts. This, in essence, amounts to for every x, if x is a place and I look at x then there are timeshare resorts at x, or, in symbolic notation, 83. ∀x((Px & Lix) → ∃y(Ry & Ayx)). Some adverbs of quantification, however, have no correlates in Q. Frequently, which amounts roughly to at many times, and rarely or seldom, which amount roughly to at few times, would translate into Q only if Q had a way of symbolizing many and few. Since quantification theory represents only a few determiners, it can represent only a few adverbs of quantification. Prepositional phrases, as we've seen, can modify nouns. They can also modify verbs. John ran down the street, We're singing in the rain and I'll have a hot dog on a paper plate all contain prepositional phrases functioning adverbially. Just as adverbs, in most cases, do not translate into Q except as parts of verb phrases that become predicates, so prepositions linking noun phrases to verbs or verb phrases translate together with the modified verb or verb phrase. They do not become dyadic predicates in their own right, as they do when modifying noun phrases. Nevertheless, because prepositional phrases contain noun phrases, their symbolic representations are more interesting than those of adverbs. Think about a sentence such as Laura lives on East 72nd Street. This becomes, when symbolized, 'Lle', where 'e' represents East 72nd Street. Lives on translates as a single dyadic predicate. Note that we cannot apply the strategy appropriate to adjectival prepositional phrases; the above sentence is not equivalent to Laura lives and is on East 72nd Street. To take a more complex example, Richard has worked in every division of Reynolds Metals Company contains an adverbial prepositional phrase that itself contains a determiner. Work here is intransitive. It would usually translate into a monadic predicate. But in is a preposition that can combine with the verb for purposes of translation. Instead of using the simple open sentence x works, therefore, we can use x works in y, a dyadic predicate, to obtain 84. ∀x((Dx & Oxr) → Wrx). Notice that of in this sentence does not add another place to the predicate; of Reynolds Metals Company functions adjectivally, modifying division.

4.5.3 Connectives

Quantification theory includes sentential logic. Sentential connectives can link quantified sentences together; they can even inhabit noun and verb phrases. We've discussed some problems that this creates in Chapter 2. Here, we'll recapitulate them briefly and extend them to some more interesting cases. Recall that noun and verb phrases can be joined together by and, or and if not. Chapter 2 recommended a policy of splitting such phrases. Connectives linking noun or verb phrases usually accept transformation into connectives linking sentences. Thus Abraham Lincoln and Calvin Coolidge were Republican Presidents amounts to Abraham Lincoln was a Republican President, and Calvin Coolidge was a Republican President. Similarly, Fred likes hot dogs and hamburgers amounts to Fred likes hot dogs and Fred likes hamburgers. In quantification theory this advice becomes more important in many cases. All lions and tigers are cats is equivalent to All lions are cats and all tigers are cats. But the conjoined noun phrase can tempt us into a translation 85. ∀x((Lx & Tx) → Cx), which says that everything that is both a lion and a tiger is a cat. Of course, nothing is both a lion and a tiger. So this symbolization is incorrect. Separating sentences results in the formula 86. ∀x(Lx → Cx) & ∀x(Tx → Cx),

which captures the meaning of the original. When existential quantifiers are involved, or when the connectives are in the verb phrase, splitting makes little difference. But, in subject noun phrases, it is vital. As in the case of sentential logic, however, we must take care to split only those sentences for which the process preserves meaning. Harry loves knackwurst and sauerkraut may not be equivalent to Harry loves knackwurst and Harry loves sauerkraut; he may love the combination without being very excited about the individual components, or vice versa. Mary and Susan own the entire company is probably not saying that Mary owns the entire company, and that Susan does too, but that they own the entire company between them. Arguments relying on these special conjoined noun or verb phrases lie outside the bounds of quantification theory. Another problem pertains to connectives such as if, only if, etc. They cannot join together two noun phrases or two verb phrases, but they can appear within sentences in ways that do not reduce to simple sentential connection. Consider:


87. A formula is contingent only if it's not valid. We might think of this as a sentence with a formula as subject noun phrase and is contingent only if it's not valid as main verb phrase. The determiner a is clearly functioning generically, so it translates as a universal quantifier. The common noun formula appears as a monadic predicate. (87) thus looks like a complex version of a universal affirmative sentence form. Its symbolization begins with '∀x(Fx → ...'. The main verb phrase contains two connectives, only if and not; the adjectives contingent and valid appear as monadic predicates. The pronoun it refers back to the formula; it acts much like a variable. So the symbolization of (87) turns out to be 88. ∀x(Fx → (Cx → ¬Vx)).

Alternatively, we might think of (87) as containing a connective, only if, joining together two sentences, a formula is contingent and it's not valid. On this approach, (87) resembles (62). It amounts to If a formula is contingent, it's not valid. This we might be tempted to translate as 89. ∃x(Fx & Cx) → ¬Vx,

but, in (89), the final occurrence of 'x' is not in the scope of the existential quantifier. So this is not even a formula. Once again, the solution requires using a universal quantifier with the entire formula as its scope: 90. ∀x((Fx & Cx) → ¬Vx).

This, too, is an acceptable translation of (87). Fortunately, it is equivalent to (88). So the two ways of construing the sentence's structure yield equivalent results. Naturally, connectives can also join together entire sentences that have no troublesome links between them: If we don't hang together, we'll surely all hang separately, Some political parties die out after a short time, but others last for centuries, and Unless everyone leaves, I'll refuse to come out all work as we might expect from sentential logic. Finally, quantification theory contains not only connectives but sentence letters. It might seem that any sentence can translate into Q by using just predicates, constants, variables and quantifiers. But a few very simple sentences, it is raining and it's three o'clock, for example, resist this analysis. It, in these sentences, does not stand for an object, so it would be very odd to translate it is raining as, say, 'Ri'. To see this, try asking, "What is raining?" The question doesn't make very good sense. So sentences containing it in what linguists call its pleonastic use seem to be best translated with simple sentence letters.
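The point about translations (85) and (86) from the lions-and-tigers example can be checked mechanically. Here is a small sketch (our own Python illustration, not part of the text) on a one-object model containing a lion that is not a cat; since nothing in the model is both a lion and a tiger, the faulty translation comes out vacuously true while the correct one comes out false:

```python
# A one-object domain: a lion that is not a cat. Predicates as Python sets.
D = {"leo"}
L, T, C = {"leo"}, set(), set()  # lions, tigers, cats

# For booleans, p <= q behaves as material implication (p -> q).
faulty = all(((x in L) and (x in T)) <= (x in C) for x in D)   # (85)
correct = (all((x in L) <= (x in C) for x in D)
           and all((x in T) <= (x in C) for x in D))           # (86)

print(faulty, correct)  # True False
```

The vacuous truth of (85) on this model is exactly why the conjoined-antecedent translation fails to capture All lions and tigers are cats.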

Problems

Translate the following sentences into Q, exposing as much structure as possible. If any translate only with difficulty, explain why.

1. All men are born good. (Confucius)

2. All that I know, I learned after I was thirty. (Georges Clemenceau)

3. Children are always cruel. (Samuel Johnson)

4. All who remember, doubt. (Theodore Roethke)

5. All big men are dreamers. (Woodrow Wilson)

6. Only the shallow know themselves. (Oscar Wilde)


7. All are not friends that speak us fair. (James Clarke)

8. All's Well That Ends Well. (William Shakespeare)

9. No sound is dissonant which tells of life. (Samuel Taylor Coleridge)

10. If any would not work, neither should he eat. (II Thessalonians 3:10)

11. There is no detail that is too small. (George Allen)

12. Alas! It is delusion all. . . . (George Gordon, Lord Byron)

13. . . . and now nothing will be restrained from them, which they have imagined to do. (Genesis 11:6)

14. All finite things reveal infinitude. (Theodore Roethke)

15. Poets are the unacknowledged legislators of the world. (Percy Bysshe Shelley)

16. . . . we are dust and dreams. (A. E. Housman)

17. Everything that man esteems endures a moment or a day. (William Butler Yeats)

18. To be beloved is all I need, and whom I love, I love indeed. (Samuel Taylor Coleridge)

19. Hope is a delusion; no hand can grasp a wave or a shadow. (Victor Hugo)

20. . . . the things which are seen are temporal; but the things which are not seen are eternal. (II Corinthians 4:18)

21. Some people with great virtues are disagreeable while others with great vices are delightful. (La Rochefoucauld)

22. Nothing which is true or beautiful or good makes complete sense in any immediate context of history. . . . (Reinhold Niebuhr)

23. They also live who swerve and vanish in the river. (Archibald MacLeish)

24. Nothing is done. Everything in the world remains to be done or done over. (Lincoln Steffens)

25. So then neither is he that planteth any thing, neither he that watereth; but God that giveth the increase. (I Corinthians 3:7)

26. Loafing needs no explanation and is its own excuse. (Christopher Morley)

27. Any mental activity is easy if it need not take reality into account. (Marcel Proust)

28. . . . it is not poetry, if it make no appeal to our passions or our imagination. (Samuel Taylor Coleridge)

29. When a man is wrong and won't admit it, he always gets angry. (Thomas Haliburton)

30. My only books were women's looks, and folly's all they've taught me. (Thomas Moore)

31. All things fall and are built again, and those that build them again are gay. (William Butler Yeats)

32. All men have aimed at, found, and lost. . . . (William Butler Yeats)

1 00

Logic, Sets and Functions

33. Great is the hand that holds dominion over man by a scribbled name. (Dylan Thomas)

34. He that stays in the valley shall never get over the hill. (John Ray)

35. It is always the secure who are humble. (G. K. Chesterton)

36. Every country can produce good men. (Gotthold Lessing)

37. A dull axe never loves grindstones. (Henry Ward Beecher)

38. To whom nothing is given, of him nothing can be required. (Henry Fielding)

39. There has never been any 30-hour week for men who had anything to do. (Charles F. Kettering)

40. A thing of beauty is a joy forever. . . . (John Keats)

41. Work is a grand cure of all the maladies that ever beset mankind. (Thomas Carlyle)

42. He that has no patience has nothing at all. (Italian proverb)

43. Every man without passions has within him no principle of action, no motive to act. (Claude Adrien Helvetius)

44. You can't have a better tomorrow if you are thinking about yesterday all the time. (Charles F. Kettering)

45. Nothing will ever be attempted if all possible objections must be first overcome. (Jules W. Lederer)

46. There is a singer everyone has heard. . . . (Robert Frost)

47. All man's friend, no man's friend. (John Wodroephe)

48. To do nothing is in every man's power. (Samuel Johnson)

49. Nobody ever did anything very foolish except from some strong principle. (William Lamb)

50. We receive only what we give. (Samuel Taylor Coleridge)

51. If you build a castle in the air, you won't need a mortgage. (Philip Lazarus)

52. In every work of genius we recognize our own rejected thoughts. (Ralph Waldo Emerson)

53. Nothing is more boring than a man with a career. (Aleksandr Solzhenitsyn)

54. . . . the Bears were good Bears, who did nobody any harm, and never suspected that anyone would harm them. (Robert Southey) (Use a predicate 'B' for is one of the Bears.)

4.6 Interpretations

First, we need to develop a precise notion of what an interpretation in quantificational logic is. Recall that in sentential logic an interpretation is simply an assignment of truth values to atomic formulas, i.e., to sentence letters. Quantificational logic includes sentential logic, so interpretations within it will incorporate such truth value assignments. But quantificational interpretations are more complex.


Definition 4.1 An interpretation M of a set S of formulas of quantificational logic is an ordered pair (D, φ), where D (M's domain, or universe of discourse) is a nonempty set and φ is a function assigning (1) truth values to sentence letters in S, (2) elements of D to constants in S, and (3) sets of n-tuples of elements of D to n-ary predicates in S. An interpretation thus has two components. The first is a set that specifies what objects the formulas in question are talking about. The quantifiers range over this set, in the sense that we construe "for all x" and "for some y" as meaning "for all x in D" and "for some y in D" or, in other words, "for all elements of D" and "for some element of D." The last section already introduced the notion of a universe of discourse in the context of translation. If we are speaking about nothing but people, or automobiles, or income groups, then we can avoid populating all our formulas with a predicate representing this subject matter by taking the set of people, or of automobiles, or of income groups as the universe of discourse. We thereby count "for all x" as having the significance of, for example, "for all people x". Specifying a domain can thus simplify translations. In interpreting quantified formulas, however, specifying a domain is a necessity. We can either say that the domain is the set of people, or income groups, or whatever, or enumerate the members: for example, {Tom, Dick, Harry}. The second component of an interpretation M is an interpretation function. This function, in effect, assigns meaning to the constants, predicates and sentence letters in the formulas we're interpreting. It assigns truth values to sentence letters, telling us whether the sentences they represent are true or false. It assigns elements of the domain to constants, telling us which objects they stand for. Finally, it assigns sets of n-tuples of objects to n-ary predicates.
Consider a unary or monadic predicate, 'R', which informally means red. The interpretation function assigns 'R' a set of 1-tuples. Intuitively, it tells us which objects satisfy 'R'; it tells us, in other words, which objects are red. The function assigns to a binary or dyadic predicate such as 'L', meaning loves, a set of ordered pairs. The function tells us, then, who loves whom. If φ(L) = {(Bob, Carol), (Carol, Ted), (Ted, Alice), (Alice, Bob)}, then Bob loves Carol, Carol loves Ted, Ted loves Alice, and Alice loves Bob. This table summarizes how the interpretation function works.

Symbol             Interpretation
sentence letter    truth value
constant           object in the domain
n-ary predicate    set of n-tuples of objects in the domain

The definition of an interpretation only performs part of the task that we need to accomplish. We want to be able to produce interpretations that make various formulas true and others false. To do this, we need to know how to evaluate the truth value of a formula on an interpretation. It's easy to judge the truth value of a sentence letter on an interpretation; just see what value the interpretation function assigns to it. The interpretation M = (D, φ) makes a sentence letter P true just in case φ assigns truth to P:

(1) [P]M = T ⟺ φ(P) = T.

Other atomic formulas, consisting of an n-ary predicate followed by n constants, are also easy to evaluate. The sentence Bob loves Carol is true if and only if Bob loves Carol. The sentence is true, in other words, just in case the interpretation we assign to the predicate loves includes the pair (Bob, Carol). In general, then, an atomic formula of the form Ra1 . . . an will be true on an interpretation M just in case the set the interpretation function assigns to R includes the n-tuple consisting of the objects that a1, . . . , an stand for:

(2) [Ra1 . . . an]M = T ⟺ (φ(a1), . . . , φ(an)) ∈ φ(R).
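To make the clauses concrete, here is a minimal sketch in Python (our own illustration; the variable names and encoding are assumptions, not the book's notation) of an interpretation (D, φ) and the truth clause for atomic formulas:

```python
# An interpretation M = (D, phi): phi maps constants to objects in D and
# n-ary predicates to sets of n-tuples of objects in D.
D = {"Bob", "Carol", "Ted", "Alice"}
phi = {
    "b": "Bob",
    "c": "Carol",
    "L": {("Bob", "Carol"), ("Carol", "Ted"),
          ("Ted", "Alice"), ("Alice", "Bob")},
}

def atomic_true(pred, constants):
    """[R a1 ... an]M = T  iff  (phi(a1), ..., phi(an)) is in phi(R)."""
    return tuple(phi[c] for c in constants) in phi[pred]

print(atomic_true("L", ["b", "c"]))  # 'Lbc': Bob loves Carol -> True
print(atomic_true("L", ["c", "b"]))  # 'Lcb': Carol loves Bob -> False
```

The dictionary plays the role of φ and the set of pairs plays the role of φ(L); checking tuple membership is exactly clause (2).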


Assessing the truth values of formulas with sentential connectives as main connectives is also easy, provided that we know the truth values of the components. We can proceed exactly as in sentential logic:

• [¬A]M = T ⟺ [A]M = F

13. Ga
14. Ga → Fa
15. ∃xFx
16. ∀xFx
17. ∃xGx
18. ∀xGx
19. ∀x(Fx → Gx)

20. ∀x((Fx ∨ Gx) → Gx)

Let D = {a}, φ(a) = a, φ(F) = {(a)}, and φ(R) = {(a, a)}. What is the truth value of these formulas on this interpretation?

21. Raa
22. ∀x(Fx → Rxx)
23. ∀xRxx
24. ∀xRax
25. ∀x(Rax & Rxa)
26. ∀x∀y(Rxy → Ryx)
27. ∀x(Fx → ∃y(Fy & Rxy))
28. ∀x∀y∀z((Rxy & Ryz) → Rxz)
29. ∀x∀y(Rxy → ∃z(Rxz & Rzy))
30. ∃x(Fx & Rxx) ↔ ∀x(Fx & Rxx)

Let D = {a, b}, φ(a) = a, φ(b) = b, and φ(F) = {(b)}. What is the truth value of these formulas on this interpretation?

31. Fa
32. Fb
33. ∃xFx
34. ∀xFx
35. ∃x¬Fx
36. ∀x¬Fx
37. ∃xFx & ∃x¬Fx
39. ∃x(Fx → ∀yFy)
40. ∀x(Fx → ∀yFy)
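Answers to exercises like these can be checked with a small recursive evaluator over a finite domain. The following Python sketch is our own illustration (the tuple encoding of formulas is an assumption, not the book's notation); it handles ¬, &, →, ∀ and ∃:

```python
def holds(f, D, phi, env=None):
    """Evaluate formula f on interpretation (D, phi) under variable
    assignment env. Formulas are nested tuples, e.g.
    ('all', 'x', ('atom', 'F', ('x',)))  encodes  AxFx."""
    env = env or {}
    op = f[0]
    if op == 'atom':                       # predicate applied to terms
        _, pred, terms = f
        args = tuple(env[t] if t in env else phi[t] for t in terms)
        return args in phi[pred]
    if op == 'not':
        return not holds(f[1], D, phi, env)
    if op == 'and':
        return holds(f[1], D, phi, env) and holds(f[2], D, phi, env)
    if op == 'imp':                        # material conditional
        return (not holds(f[1], D, phi, env)) or holds(f[2], D, phi, env)
    if op == 'all':                        # universal quantifier
        return all(holds(f[2], D, phi, {**env, f[1]: d}) for d in D)
    if op == 'some':                       # existential quantifier
        return any(holds(f[2], D, phi, {**env, f[1]: d}) for d in D)
    raise ValueError(f"unknown operator: {op}")

# Problems 33 and 34 above: D = {a, b}, phi(F) = {(b)}.
D, phi = {'a', 'b'}, {'F': {('b',)}}
print(holds(('some', 'x', ('atom', 'F', ('x',))), D, phi))  # ExFx -> True
print(holds(('all', 'x', ('atom', 'F', ('x',))), D, phi))   # AxFx -> False
```

Each quantifier clause simply tries every element of the domain as a value for the bound variable, which is exactly what the informal "for all elements of D" and "for some element of D" readings prescribe.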

Let D = {a, b}, φ(a) = a, φ(b) = b, and φ(R) = {(a, a), (b, a)}. What is the truth value of these formulas on this interpretation?

41. Rab
42. Rba
43. Rbb
44. ∃xRax
45. ∃xRbx
46. ∃xRxa
47. ∃xRxb
48. ∀xRxx
49. ∀x∀y(Rxy → Ryx)
50. ∀x∃yRxy

Let D = {a, b}, φ(a) = a, φ(b) = b, φ(F) = {(b)}, and φ(R) = {(b, b), (b, a)}. What is the truth value of these formulas on this interpretation?

51. ∃x(Fx & Rxx)
52. ∃x(Fx & Rxa)
53. ∀x(Fx → Rxx)
54. ∀x(Fx → Rax)
55. ∀x∀y(Rxy → Ryx)
56. ∀x∀y(Rxy → Rxx)
57. ∀x∀y(Rxy → Ryy)
58. ∀x∀y∀z((Rxy & Ryz) → Rxz)
59. ∀x∀y∀z((Rxy & Ryz) → ¬Rxz)
60. ∀x∀y((Fx & Fy) → (Rxy & Ryx))

Let D = {a, b}, φ(a) = a, φ(b) = b, φ(R) = {(a, a), (b, a)}, φ(S) = {(b, b), (a, b)}, and φ(F) = {(a)}. What is the truth value of these formulas on this interpretation?

61. ∀x(Fx → Rxx)
62. ∀x∃yRxy
63. ∀x∃ySxy
64. ∃x∀yRxy
65. ∃x∀ySxy
66. ∃x∀ySyx
67. ∃x∀yRyx
68. ∀x∀y(Rxy ∨ Sxy)
69. ∀x∀y(Rxy → ¬Sxy)
70. ∀x∀y(Rxy → (Fx ∨ Fy))

Let D = {a, b}, φ(a) = a, φ(b) = b, φ(S) = ∅, and φ(R) = {(a, a), (b, b)}. What is the truth value of these formulas on this interpretation?

71. ∃xRxx
72. ∃xSxx
73. ∀xRxa
74. ∃xRxa
75. ∀x(Rxb → Sxb)
76. ∀x∀y(Rxy → Sxy)
77. ∀x∀y(Sxy → Rxy)
78. ∀x∀y((Rxy & Sxy) → Syx)
79. ∀x∀y((Rxy & ¬Sxy) → Ryx)
80. ∀x∀y((Rxy & Ryx) → ∃z(Rxz ∨ Sxz))

Let D = {a, b, c}, φ(a) = a, φ(b) = b, φ(F) = {(b), (c)}, and φ(R) = {(a, c), (b, c), (c, c)}. What is the truth value of these formulas on this interpretation?

81. ∃xRxx
82. ∀x(Fx → ∃yRxy)
83. ∃x(Fx & ∀yRxy)
84. ∃x(Fx & ∀yRyx)
85. ∀x(Rxx → Fx)
86. ∀x∃yRxy
87. ∃x∀yRyx
88. (Fa ∨ Fb) → (Raa ∨ Rab)
89. ∀y(∃xRxy → ∃zRyz)
90. ∀y(∃zRyz → ∃xRxy)


Let D = {0, 1, 2, . . . , 10}, with '<' and '≤' having their usual interpretations. What truth values do these formulas have on this interpretation?

99. ∀x∀y(x < y → ∃z(x < z & z < y))
100. ∀x∀y(x ≤ y → ∃z(x ≤ z & z ≤ y))
101. ∀x∃y x < y
102. ∀x∃y x ≤ y

114. ∀x∀y(x < y → ∃z(x < z & z < y))
115. ∀x∀y(x ≤ y → ∃z(x ≤ z & z ≤ y))
116. ∀x∃y x < y
117. ∀x∃y x ≤ y

Cb

Conclusion
∃xFx
∃xGax
∃xGxb
∃xHcx
∃xHex
∃xHxx
∃y∃xFxy
∃z(∀xFx → Gz)

In each case, the premise is an instance of the conclusion. The rule of existential exploitation allows us to move from an existentially quantified formula to an instance of it.² It is almost exactly the reverse, then, of the existential introduction rule. But it does impose a restriction: the instance must involve a constant new to the proof. The rule says that we may drop an existential quantifier serving as a main connective in a formula, and substitute for the quantified variable a constant that hasn't appeared earlier in the proof. The constant must have appeared nowhere in the deduction, not even in a Show line. (Actually, no harm would result from allowing us to use, for ∃E, constants that appear earlier only on already bracketed lines. But, to minimize confusion, we'll always use completely new constants.)

∃E (Existential Exploitation)
n. ∃vA
n + p. A[e/v]

∃E, n

Here e must be new to the proof. Suppose that we have the information that someone in our department is selling trade secrets to a competitor. We don't know who this person is, or, perhaps, who these people are, but we do want to reason from what we know in order to find out. We know that at least one person has been selling secrets; our reasoning and our communication will proceed much more readily if we give this person some name (John Doe, say, or just the mole) so that we can refer to him or her in various contexts. We can't simply say, "Someone has been selling our trade secrets. Someone must have joined the department around the middle of 1981, because that's when secrets began to leak." Nothing here indicates that the two "someones" are the same. To tie these assertions to the same individual, we must have a way of referring to that person. Introducing a name accomplishes this. It's critical that the name we choose be new. If Sarah Freeland is the head of the department, and we decide to call the seller of trade secrets Sarah Freeland, then utter confusion will result. The system of rules in this chapter is sound in the sense that the rules never lead us astray; they never allow us to prove a formula that isn't valid, or permit us to establish the validity of an invalid argument form. Nevertheless, the existential exploitation rule seems unsound; it justifies the inference from ∃vA to A[c/v], so long as c is new to the proof. Our demand for a new constant prevents the rule from doing any harm. Most of our rules have been truth-preserving: the truth of the rules' premises guarantees the truth of their conclusions. Existential exploitation, however, is not truth-preserving. We should hardly be able to argue, "Some philosophers have been Nazis. Therefore, Aristotle was a Nazi." This form of inference is fallacious. But introducing a new name avoids such problems by containing our use of the name within a portion of the proof. We could not use existential exploitation to establish the validity of the above argument. On the left is the attempted proof within quantification theory; on the right is the English equivalent.

1. ∃x(Px & Nx)    A         Some philosophers have been Nazis.
2. Pa             A         Aristotle was a philosopher.
3. Show Na                  Show that Aristotle was a Nazi.
4. Pb & Nb        ∃E, 1     b was a philosopher and a Nazi.

² The American philosopher W. V. Quine first formulated existential exploitation in this way in 1950, in his Methods of Logic (Cambridge: Harvard University Press, 1950, 1982).

It seems clear that we will never be able to reach 'Na'. Because the constant 'a' appears in both lines 2 and 3, we can't use existential exploitation to obtain the instance 'Pa&Na' that we would need to reach the conclusion. Although ∃E is not truth-preserving, therefore, it never gets us into trouble. The rule is conservative in that any formula without the new constant that follows from the conclusion of the rule also follows from the rule's premise. Mathematicians very often introduce existential assertions and names for the objects asserted to exist in one breath. Consider these examples from a calculus text: Let f be continuous at a. For every number e > 0 we can choose a number d > 0 so that |f(x) − f(a)| < e for all x ∈ A with |x − a| < d. . . . Since U is open, there is an open rectangle B with f(a) ∈ B ⊂ U.³ Or, consider this, from a text on Lebesgue integration: Let {sₙ} converge to s. Take e = 1 in the definition of convergence; then there exists an integer N such that |sₙ − s| < 1 for all n ≥ N, i.e. s − 1 < sₙ < s + 1 for all n ≥ N.⁴ These passages combine the introduction of an existential assertion with its exploitation in one step. They name an object asserted to exist in the very act of making that assertion: "there exists an integer N," "there is an open rectangle B" and "we can choose a number d." Note that, in every instance, the names introduced in this way haven't appeared before; we have no independent information about N, B or d. It would be very different (and outrageous) if these passages were to say instead, "there exists an integer 43" or "we can choose a number π." The third rule for quantifiers is universal exploitation. If we know that something is true about every object, then we can conclude that it is true for each particular object that we consider. If God loves everyone, then God loves me, you, and the Earl of Roxburgh. If Jane likes everyone she meets, then she likes you, if she's met you; she likes me, if she's met me; and so on. The rule of universal exploitation says that, from a universally quantified formula, we may infer any of its instances.

∀E (Universal Exploitation)
n. ∀vA
n + p. A[c/v]    ∀E, n

Here c is any constant.

³ Angus E. Taylor, Calculus (Englewood Cliffs: Prentice-Hall, 1959), p. 71.
⁴ Alan J. Weir, Lebesgue Integration and Measure (Cambridge: Cambridge University Press, 1973), p. 108.


This rule does not require us to use a new constant. In fact, it will generally be silly to use a new constant in applying ∀E. There is no point to introducing a new name unless no constants appear in the proof at all up to this line. If we have constants 'a' and 'b' appearing earlier, and a formula '∀xFx', we can infer 'Fa', or 'Fb', or both. If we also have a formula '∀xGxx', we can infer 'Gaa' or 'Gbb'. And, if we have '∀x∀yHxy', then we can obtain '∀yHay' or '∀yHby' and, in another step, any of 'Haa', 'Hab', 'Hba' and 'Hbb'. We could also, of course, infer similar formulas with other constants. Unless we are forced to introduce those constants in other ways, however, using them to exploit a universal formula will serve no purpose. To see how these rules work, let's demonstrate the validity of a simple argument: Something's upsetting John; whatever upsets John upsets Edna; so something's upsetting Edna.

1. ∃xFxj    A
2. ∀y(Fyj → Fye)    A
3. Show ∃zFze
4. Faj    ∃E, 1
5. Faj → Fae    ∀E, 2
6. Fae    →E, 5, 4
7. ∃zFze    ∃I, 6
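What the deduction establishes can also be spot-checked semantically. The sketch below (a brute-force countermodel search, not part of the book's deduction system; the predicate letter 'F' and the constants 'j' and 'e' follow the argument above) looks for a small interpretation that makes both premises true and the conclusion false:

```python
from itertools import product

def valid(domain_sizes=(1, 2)):
    # Search for a countermodel to:
    #   premise 1: ExFxj   premise 2: Ay(Fyj -> Fye)   conclusion: EzFze
    for n in domain_sizes:
        D = range(n)
        pairs = list(product(D, D))
        # try every binary relation F and every interpretation of j and e
        for bits in product([False, True], repeat=len(pairs)):
            F = {p for p, b in zip(pairs, bits) if b}
            for j, e in product(D, D):
                p1 = any((x, j) in F for x in D)                      # ExFxj
                p2 = all((y, j) not in F or (y, e) in F for y in D)   # Ay(Fyj -> Fye)
                c = any((z, e) in F for z in D)                       # EzFze
                if p1 and p2 and not c:
                    return False  # countermodel found
    return True

print(valid())  # True: no countermodel in domains of size 1 or 2
```

Finding no countermodel in small domains is of course only evidence, not proof; the deduction above is what settles the matter.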

Problems

Use deduction to show that these arguments are valid.

1. God created everything. So God created Pittsburgh.

2. God created everything. So God created Himself.

3. God created everything. So God created something.

4. God created everything. So something created God.

5. Nothing coherent ever baffles me. This course baffles me, so this course is incoherent.

6. All writers who express nationalism in their writing are trying to achieve political aims. Some American writers express nationalism in what they write. Thus, some writers trying to achieve political aims are Americans.

7. Everyone who understands the nature of the radical Islamic movement recognizes that it threatens the existing Arab regimes of the Middle East. Some people in the State Department understand the nature of radical Islam, so some State Department personnel realize that the movement threatens existing Arab regimes.

8. Some plants that are widely cultivated in this area are not able to survive on rainfall alone. All plants native to this area can survive on rainfall alone. It follows that some plants that are not native to this area are widely cultivated here.

9. Some utility companies are predicting brownouts in their service regions during this summer. No utilities that can easily and affordably purchase power from other utilities are predicting brownouts for this summer. Hence, some utility companies cannot easily and affordably purchase power from other utilities.

10. Some computer programs used for processing natural language are written in PROLOG. Nothing written in PROLOG relies heavily on the notion of a list. Consequently, not all computer programs used for processing natural language rely heavily on lists.

Quantified Natural Deduction


11. Each person who came to the party was observed by the company president. Some people who came to the party became obviously drunk. Thus, the company president observed some obviously drunk people.

12. Nothing written by committee is easy to write or easy to read. Some documents written by committee are nevertheless extremely insightful. So some extremely insightful documents are not easy to read.

13. Some of the cleverest people I know are clearly insane. Any of the cleverest people I know could prove that this argument is valid. Hence, some people who could prove this argument valid are clearly insane.

14. There are cities in the Sunbelt that are experiencing rapid growth but that are not ranked as very desirable places to live. Every city that experiences rapid growth has to raise taxes. Therefore some cities that will have to raise taxes are not ranked as very desirable places to live.

15. Some analysts insist that we are in the middle of a historic bull market, but others say that the market will soon collapse. Nobody who is expecting the market to collapse is recommending anything but utility stocks. None who believe that M1 controls the direction of the economy contend that we are in the midst of a historic bull market. Thus, some analysts are recommending only utility stocks, but some don't believe that M1 controls the economy's direction.

16. The Longhorns can beat everyone who can beat everyone the Longhorns can. Therefore the Longhorns can beat themselves.

17. One prosecutor can convict another only if he or she can convict everyone that prosecutor can. There are prosecutors who can convict each other. So there are prosecutors who can convict themselves.

18. The unrestricted axiom of abstraction states that there is a set of all and only those objects satisfying any open sentence. Bertrand Russell proved the axiom inconsistent by using the open sentence . . . is not a member of itself. Show, following Russell, that '∃x∀y(y ∈ x ↔ y ∉ y)' is contradictory, by proving its negation.

19. Say that an archetypal pig is something such that, if anything at all is a pig, it is. Show that there is an archetypal pig.

20. Say that something is truly ugly just in case, if it is beautiful, then anything is. Show that some things are truly ugly.

Use deduction to establish the validity of these argument forms.

21. ∃xFx; ∴ ∀xGx → ∃x(Fx&Gx)

22. ∃x(Fx&Gx); ∀x(Gx → ¬Hx); ∴ ∃x(¬Hx&Fx)

23. ∃yFyy; ∃x∀zGxz; ∴ ∃x∃y(Gyx&Fxx)

24. ∃x∃yFxy; ∀x∀y(Fxy ↔ (Gx&¬Gy)); ∴ ∃xGx&∃x¬Gx

25. ∃xGx&∃x¬Gx; ∀x∀y(Fxy ↔ (Gx&¬Gy)); ∴ ∃x∃yFxy

26. ∀x(Fx → Gx); ∴ ∃x¬Fx ∨ ∃xGx

27. ∀x∀y(Fxy ↔ (Gx&¬Gy)); ∃x∃y(Fxy&Fyx); ∴ ∃x(Gx&¬Gx)

28. ∃x∃y(Fx&Gyx); ∀x∀y(Gxy → (Hx&Jyx)); ∴ ∃x∃y((Fx&Hy)&Jxy)

29. ∀x∀y∀z((Fxy&Fxz) → Fyz); ∃x∃y(Fxy&¬Fyx); ∴ ¬∀xFxx

30. *∀z(¬Hz ↔ ∀x(Fx&Gz)); ∀x∃y(Gy&Fx); ∴ ¬∀xHx

5.2 Universal Proof

Introducing a universal formula, in this system, requires a new method of proof. We already have three proof techniques: direct proof, indirect proof, and conditional proof. Quantificational logic adds universal proof.

Universal Proof

n. Show ∀vA
n + 1. Show A[c/v]

(where c is a constant foreign to the proof)

To prove a universal conclusion, in other words, prove an instance of it. The instance must result from substituting a constant new to the proof for the quantified variable. Since no information will appear anywhere earlier in the proof regarding the new constant, it seems to stand for no object in particular. It represents, as it were, an arbitrarily chosen object. Because the proof puts no constraints on it, absolutely any object could play this role. Consequently, though we prove something about c, we have shown how to prove it about anything. And this justifies our drawing a universal conclusion. This too corresponds closely to mathematical practice. Consider this example of a theorem and proof from a standard calculus text: If a function has a derivative which is zero at each point of an interval, the function is constant on that interval. . . . [Proof:] Suppose f′(x) = 0 for each x on an interval. . . .⁵

Or this, from a well-known high school geometry textbook: Theorem 9-2. In a plane, two lines are parallel if they are both perpendicular to the same line. Proof. Given that L1 ⊥ L at P and L2 ⊥ L at Q. It is given that L1 and L2 are coplanar. We need to show that they do not intersect. Suppose that L1 intersects L2 at a point R. Then there are two perpendiculars from R to L. By Theorem 6-4, this is impossible. Therefore L1 ∥ L2.⁶ These are universal proofs. The former establishes a result about all functions by proving something about an arbitrary function f. The latter derives a conclusion about any two lines from reasoning about two arbitrarily selected lines, L1 and L2. (Within this universal proof there is an indirect proof. The proof also combines an existential exploitation with another step: ". . . at a point R.")

⁵ Michael Spivak, Calculus on Manifolds (Menlo Park: W. A. Benjamin, Inc., 1965), pp. 13 and 12.
⁶ Edwin E. Moise and Floyd L. Downs, Jr., Geometry (Menlo Park: Addison-Wesley, 1967), p. 230.


It might seem that this form of proof allows us to prove very silly arguments valid. It lets us derive a universal formula from one of its instances. So, can't we show that, if Fred loves the Go-Gos, everybody loves the Go-Gos? Fortunately, no, because of the new-constant requirement.

1. Lfg    A    (Fred loves the Go-Gos.)
2. Show ∀xLxg    (Show everybody loves the Go-Gos.)
3. Show Lag    (Show a loves the Go-Gos.)

We can't go from the information that Fred loves the Go-Gos to the conclusion that some arbitrarily selected a does. To take an example of a universal proof, consider this inference pattern: All F are G; everything is F; so everything is G. To establish its validity, we construct a universal proof:

1. ∀x(Fx → Gx)    A
2. ∀xFx    A
3. Show ∀xGx
4. Show Ga
5. Fa    ∀E, 2
6. Fa → Ga    ∀E, 1
7. Ga    →E, 6, 5

To show that everything is G, we show that some arbitrarily chosen object a is G.
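The point behind the Go-Gos example above can also be put semantically: a single instance does not guarantee the universal claim. Here is a minimal countermodel sketch (the two-element domain and the names in it are illustrative assumptions, not anything the book's system requires):

```python
# Two-element domain: Fred and the Go-Gos themselves.
domain = ["fred", "gogos"]
loves = {("fred", "gogos")}      # Lfg: only Fred loves the Go-Gos
g = "gogos"

premise = ("fred", g) in loves                      # Lfg is true here
conclusion = all((x, g) in loves for x in domain)   # AxLxg is false here

print(premise, conclusion)  # True False
```

Since the premise can be true while the conclusion is false, no correct proof system should let us derive '∀xLxg' from 'Lfg'; the new-constant requirement is what blocks it.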

Problems

Using deduction, show that these arguments are valid.

1. Anything you think you can achieve, you can achieve. You should try to achieve everything you can achieve. So, anything you think you can achieve, you should try to achieve.

2. Every team that finishes in last place declares its next year a rebuilding year. So, no teams from Chicago will declare next year a rebuilding year only if no Chicago team finishes in last place.

3. No Ivy League colleges have tuitions of under $8,000 a year. Every state-affiliated college has a tuition under $8,000 per year. A college is private if and only if it is not state-affiliated. It follows that every Ivy League college is private.

4. No mammals but bats can fly. Every commonly-kept house pet is a mammal, but none are bats. So nothing that can fly is a commonly-kept house pet.

5. Nothing stupid is difficult. Everything you can do is stupid; anything that isn't difficult, I can do better than you. So anything you can do, I can do better.

6. Anybody who is reflective despises every demagogue. Anybody who is a demagogue despises everyone. Thus, since there are demagogues, every reflective person despises someone who, in turn, despises him or her.

7. There are no good books that do not require their readers to think. Every book that has inspired acts of terror has been inflammatory. No inflammatory books require their readers to think. Therefore all books that have inspired acts of terror are no good.

8. Anyone with some brains can do logic. Nobody who has no brains is fit to program computers. No one who reads this book can do logic. So no one who reads this book is fit to program computers.


9. A person is humble if and only if he or she doesn't admire him- or herself. It follows that nobody who admires all humble people is humble.

10. All Frenchmen are afraid of Socialists, and Socialists fear only Communists. Thus, every French Socialist is a Communist.

11. All horses are animals. So all heads of horses are heads of animals.

12. A psychiatrist can help all those who cannot help themselves. So a psychiatrist can help someone who can help himself.

13. All Don Juans love all women. All who have a conscience treat all whom they love well. If some Don Juans have consciences, therefore, all women are treated well.

14. An Olympic athlete could outrun everyone on our team. Since none on our team can outrun themselves, no one on our team is an Olympic athlete.

15. A person is famous if and only if everyone has heard of him or her. So all famous people have heard of each other.

16. If nobody comes forward and confesses, then someone will be punished. So someone will be punished if he doesn't confess.

17. Mary was Jesus's parent. Mary's parents were born with the taint of original sin. But Jesus was not tainted by original sin. Therefore, it's not true that everyone who has a parent born with the taint of original sin is also so tainted.

18. Popeye and Olive Oyl like each other, since Popeye likes everyone who likes Olive Oyl, and Olive Oyl likes everyone.

19. I like everyone who likes everyone I like. So there are people that I like.

20. The government chooses to do x rather than y just in case it doesn't choose to do y over x. A person has veto power just in case the government can choose to do x over y only if that person doesn't prefer y to x. A person is a dictator just in case the government chooses to do x rather than y if he prefers x to y. Consequently, everyone with veto power is a dictator.

These are the syllogistic patterns that Aristotle considered valid, together with their medieval names. Show that each is valid in quantification theory. (Some require extra assumptions; they are listed in parentheses.)

21. Barbara: Every M is L; Every S is M; ∴ Every S is L.

22. Celarent: No M is L; Every S is M; ∴ No S is L.

23. Darii: Every M is L; Some S is M; ∴ Some S is L.

24. Ferio: No M is L; Some S is M; ∴ Some S is not L.

25. Cesare: No L is M; Every S is M; ∴ No S is L.

26. Camestres: Every L is M; No S is M; ∴ No S is L.

27. Festino: No L is M; Some S is M; ∴ Some S is not L.

28. Baroco: Every L is M; Some S is not M; ∴ Some S is not L.


29. Darapti: Every M is L; Every M is S; (There are Ms;) ∴ Some S is L.

30. Felapton: No M is L; Every M is S; (There are Ms;) ∴ Some S is not L.

31. Disamis: Some M is L; Every M is S; ∴ Some S is L.

32. Datisi: Every M is L; Some M is S; ∴ Some S is L.

33. Bocardo: Some M is not L; Every M is S; ∴ Some S is not L.

34. Ferison: No M is L; Some M is S; ∴ Some S is not L.
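It may help to see why a form like Darapti carries its parenthesized assumption. The sketch below checks syllogistic forms by exhaustive search over small interpretations; it is a semantic check offered as an illustration, not a deduction, and the tuple encoding of the forms is my own invention:

```python
from itertools import product

def holds(form, S, M, L):
    # Evaluate one syllogistic sentence over extensions S, M, L (sets).
    quant, subj, pred = form
    ext = {"S": S, "M": M, "L": L}
    a, b = ext[subj], ext[pred]
    if quant == "every":
        return a <= b
    if quant == "no":
        return not (a & b)
    if quant == "some":
        return bool(a & b)
    if quant == "some-not":
        return bool(a - b)

def syllogism_valid(premises, conclusion, size=3):
    # Try every way of sorting `size` objects into S, M, L; look for a countermodel.
    for bits in product(range(8), repeat=size):
        S = {i for i, b in enumerate(bits) if b & 1}
        M = {i for i, b in enumerate(bits) if b & 2}
        L = {i for i, b in enumerate(bits) if b & 4}
        if all(holds(p, S, M, L) for p in premises) and not holds(conclusion, S, M, L):
            return False
    return True

# Barbara stands on its own:
print(syllogism_valid([("every", "M", "L"), ("every", "S", "M")],
                      ("every", "S", "L")))                       # True
# Darapti fails without the assumption that there are Ms...
print(syllogism_valid([("every", "M", "L"), ("every", "M", "S")],
                      ("some", "S", "L")))                        # False
# ...and succeeds once "There are Ms" is added:
print(syllogism_valid([("every", "M", "L"), ("every", "M", "S"),
                       ("some", "M", "M")], ("some", "S", "L")))  # True
```

The empty-M countermodel is exactly why Darapti, Felapton, and the subaltern moods below need their extra existential premises.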

Medieval logicians added other syllogistic patterns to those Aristotle explicitly held valid. Show that these "subaltern moods" are valid, at least with the added assumptions in parentheses.

35. Barbari: Every M is L; Every S is M; (There are Ss;) ∴ Some S is L.

36. Celaront: No M is L; Every S is M; (There are Ss;) ∴ Some S is not L.

37. Cesaro: No L is M; Every S is M; (There are Ss;) ∴ Some S is not L.

38. Camestros: Every L is M; No S is M; (There are Ss;) ∴ Some S is not L.

Theophrastus, who succeeded Aristotle as head of the Lyceum, also added additional syllogistic principles. Show that these too are valid, with the added assumptions in parentheses.

39. Baralipton: Every M is L; Every S is M; (There are Ss;) ∴ Some L is S.

40. Celantes: No M is L; Every S is M; ∴ No L is S.

41. Dabitis: Every M is L; Some S is M; ∴ Some L is S.

42. Fapesmo: Every M is L; No S is M; (There are Ms;) ∴ Some L is not S.

43. Frisesomorum: Some M is L; No S is M; ∴ Some L is not S.

These arguments both have the form of argumenta a recto ad obliquum, in the terminology of Joachim Junge, who discussed such arguments in his 1638 textbook. They played an important role in leading logicians beyond Aristotelian logic to a comprehensive theory of relations. Show that each is valid.

44. Knowledge is a conceiving; ∴ The object of knowledge is an object of conception. (Aristotle, Topics)

45. A circle is a figure; ∴ Whoever draws a circle draws a figure. (Junge)

These principles concern the distribution of quantifiers over sentential connectives. Quine has referred to them as rules of passage. Show that each is valid. (Throughout, A is any formula not containing x.)

46. ∀x(Fx&A) ↔ (∀xFx&A)

47. ∃x(Fx&A) ↔ (∃xFx&A)

48. ∀x(Fx ∨ A) ↔ (∀xFx ∨ A)

49. ∃x(Fx ∨ A) ↔ (∃xFx ∨ A)

50. ∀x(Fx → A) ↔ (∃xFx → A)

51. ∀x(A → Fx) ↔ (A → ∀xFx)

52. ∃x(Fx → A) ↔ (∀xFx → A)

53. ∃x(A → Fx) ↔ (A → ∃xFx)
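Each rule of passage can be spot-checked semantically before attempting a deduction. The sketch below verifies number 50 over a small domain by trying every extension for 'F' and both truth values for 'A' (an exhaustive finite check, offered only as an illustration; a proof in the deduction system is still required):

```python
from itertools import product

def check_passage_50(size=3):
    # Ax(Fx -> A) <-> (ExFx -> A), for every F-extension and every value of A
    D = range(size)
    for bits in product([False, True], repeat=size + 1):
        *fbits, A = bits
        F = {x for x, b in zip(D, fbits) if b}
        lhs = all((x not in F) or A for x in D)   # Ax(Fx -> A)
        rhs = (not any(x in F for x in D)) or A   # ExFx -> A
        if lhs != rhs:
            return False
    return True

print(check_passage_50())  # True
```

Notice why the antecedent flips the quantifier: when A is false, both sides simply assert that F's extension is empty.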

Chapter 4 asserted that the order of existential quantifiers within a string of such quantifiers makes no difference; similarly for universal quantifiers. Illustrate this by showing valid both these principles:

54. ∃x∃yFxy ↔ ∃y∃xFxy

55. ∀x∀yFxy ↔ ∀y∀xFxy

Although, in general, we can't switch existential and universal quantifiers in order to reach an equivalent formula, we can do this in special circumstances. Without using any rules of passage above, show that each of these switches is legitimate.

56. ∃x∀yFxy; ∴ ∀y∃xFxy

57. *∀x∃y(Fx&Gy); ∴ ∃y∀x(Fx&Gy)

58. *∀x∃y(Fx ∨ Gy); ∴ ∃y∀x(Fx ∨ Gy)

59. *∀x∃y(Fx → Gy); ∴ ∃y∀x(Fx → Gy)

60. *∀x∃y(Gy → Fx); ∴ ∃y∀x(Gy → Fx)

Use deduction to establish the validity of these argument forms.

61. ∃xFx → ∃xGx; ∴ ∃x(Fx → Gx)

62. ∃xFx ∨ ∃xGx; ∴ ∃x(Fx ∨ Gx)

63. ∃x(Fx ∨ Gx); ∴ ∃xFx ∨ ∃xGx

64. ∃xFx → ∀y(Gy → Hy); ∃xJx → ∃xGx; ∴ ∃x(Fx&Jx) → ∃zHz

65. ∃xFx ∨ ∃xGx; ∀x(Fx → Gx); ∴ ∃xGx

66. ∀x((Fx&Gx) → Hx); Ga&∀xFx; ∴ Fa&Ha

67. ∀x(∃yFyx → ∀zFxz); ∴ ∀y∀x(Fyx → Fxy)

68. ∀x(Fx → ∀y(Gy → Hxy)); ∃x(Fx&∃y¬Hxy); ∴ ∃x¬Gx

69. ¬∃x(Fx&Gx); ∴ ∀x(Fx → ¬Gx)

70. ∀x(Fx → ¬Gx); ∴ ¬∃x(Fx&Gx)

5.3 Derived Rules for Quantifiers

The system resulting from adding these three rules and universal proof to our natural deduction system for sentential logic already has all the power it needs. It can demonstrate the validity of any valid argument form in quantification theory. Nevertheless, adding some further rules can increase its efficiency and naturalness considerably. This section will present some derived rules which, while theoretically superfluous, make the proof system more pleasant to work with. First, we can apply quantifier rules several times in a single step. Suppose, for example, that we want to take instances of '∀x∀yFxy'. The universal exploitation rule requires us to move first to the instance '∀yFay', say, and then to its instance 'Faa'. But any constant can substitute for both 'x' and 'y' here, since both are universally quantified. So it's easy to perform the operation in one step. We can move directly from '∀x∀yFxy' to an instance of an instance of it, 'Faa'. Similarly, we could move to 'Fab', 'Fba', 'Fbb', 'Fac', etc. We can record that we've exploited a series of n universal quantifiers at once by writing not '∀E' but '∀Eⁿ'. In this case, then, the application would look like:

m. ∀x∀yFxy
m + p. Faa    ∀E², m
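The effect of ∀Eⁿ can be pictured as generating all substitution instances at once. A rough sketch (treating formulas as plain strings, which works only when variable and constant names are single, distinct characters; the function name is my own):

```python
from itertools import product

def universal_instances(formula, variables, constants):
    # Substitute every combination of constants for the universally
    # quantified variables -- the effect of one application of AE^n.
    results = []
    for combo in product(constants, repeat=len(variables)):
        inst = formula
        for var, con in zip(variables, combo):
            inst = inst.replace(var, con)
        results.append(inst)
    return results

print(universal_instances("Fxy", ["x", "y"], ["a", "b"]))
# ['Faa', 'Fab', 'Fba', 'Fbb']
```

With two variables and two constants there are 2² = 4 instances; in general ∀Eⁿ over k constants yields kⁿ of them, which is why one usually takes only the instances a proof actually needs.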

We can readily do the same for series of existential quantifiers, provided that we replace each quantified variable with a new constant. So we may move immediately from '∃x∃y∃z(Fxy&Fyz&Fzx)' to 'Fab&Fbc&Fca', citing the rule we've applied as ∃E³. Finally, we can also compress a sequence of universal proofs into a single proof in a similar way. We can prove a formula with an initial string of n universal quantifiers by proving an instance of an instance . . . of an instance of the formula, using n different new constants to replace the n quantified variables. For example, we can prove '∀x∀y(Fxy → ¬Fyx)' by proving 'Fab → ¬Fba', where neither 'a' nor 'b' has appeared earlier in the proof. Although these combinations of rules or proofs are very convenient, it's not a good idea to combine exploitations of universal and existential quantifiers. While ∀E² and ∃E⁴ are fairly easy to follow, something like ∀E²∃E³∀E would be extremely difficult to apply or understand. Second, variables have no independent meanings. The formulas '∀xFx' and '∀zFz' function in logically similar ways; so do '∃yGy' and '∃wGw'. Indeed, it's easy to show that these pairs are equivalent. Here are the proofs in one direction:

1. ∀xFx    A
2. Show ∀zFz
3. Show Fa
4. Fa    ∀E, 1

1. ∃yGy    A
2. Show ∃wGw
3. Ga    ∃E, 1
4. ∃wGw    ∃I, 3

The proofs of the other directions follow exactly the same pattern. Thus, we can substitute one variable for another throughout a formula. The only restriction we must observe is that we should not introduce into a formula a variable that is already there; otherwise we could go from the legitimate formula '∀x∀yFxy' to the very different nonformula '∀x∀xFxx'. The derived rule, then, is this.


VR (Variable Rewrite)

n. A
m. A[v/u]    VR, n
(where v is foreign to A)

Third, it's extremely useful to have a direct way of dealing with negations of quantified formulas. Our rules allow us to attack formulas with quantifiers as main connectives in one step. But, if a quantifier is preceded by a negation sign, the proof strategy becomes much more complicated. Luckily, negations of quantified formulas are equivalent to formulas with quantifiers as main connectives. Two rules, called quantifier negation rules, relate quantified formulas to their negations. In the process, these rules relate the universal and existential quantifiers. In fact, they show how to define each quantifier in terms of the other.

QN (Quantifier Negation)

n. ¬∃vA    QN, m
m. ∀v¬A    QN, n

QN (Quantifier Negation)

n. ¬∀vA    QN, m
m. ∃v¬A    QN, n

Both versions of quantifier negation are invertible. That is, the premise and conclusion are equivalent, so they can be used in either direction. We can infer '∃x¬Fx' from '¬∀xFx', and vice versa. Similarly, just by adding a negation sign, we can see that '∃xFx' is equivalent to '¬∀x¬Fx', and that '∀xFx' is equivalent to '¬∃x¬Fx'. So we can define the quantifiers in terms of each other. Deriving even simple applications of these rules from our basic quantifier rules shows how much work they can save. These two proofs are necessary to show, for example, that '∃x¬Fx' and '¬∀xFx' are equivalent.

1. ∃x¬Fx    A
2. Show ¬∀xFx
3. ∀xFx    AIP
4. ¬Fa    ∃E, 1
5. Fa    ∀E, 3

1. ¬∀xFx    A
2. Show ∃x¬Fx
3. ¬∃x¬Fx    AIP
4. Show ∀xFx
5. Show Fa
6. ¬Fa    AIP
7. ∃x¬Fx    ∃I, 6
8. ¬∃x¬Fx    R, 3
9. ¬∀xFx    R, 1

Deriving the other equivalence is similar. Because QN takes the form of an equivalence, the replacement principle allows us to apply it to portions of formulas as well as entire formulas. Each of the following is thus a legitimate application of QN.

¬∃xFxx    ∀x¬Fxx
¬∀x∀yGxy    ∃x¬∀yGxy
∀x∀y¬∀z(Fxz&Fyz)    ∀x∀y∃z¬(Fxz&Fyz)
∃x¬∃yGyx    ∃x∀y¬Gyx

This rule, too, can be applied several times in a single step: we can abbreviate

¬∀x∀y∀z((Rxy&Ryz) → Rxz)
∃x¬∀y∀z((Rxy&Ryz) → Rxz)    QN
∃x∃y¬∀z((Rxy&Ryz) → Rxz)    QN
∃x∃y∃z¬((Rxy&Ryz) → Rxz)    QN

to

¬∀x∀y∀z((Rxy&Ryz) → Rxz)
∃x∃y∃z¬((Rxy&Ryz) → Rxz)    QN³

Problems

Establish the validity of these argument forms by means of deduction.

1. ∀x(Fx → ∀y(Gy → Hxy)); ∀x(Dx → ∀y(Hxy → Cy)); ∴ ∃x(Fx&Dx) → ∀y(Gy → Cy)

2. ∀x((Fx ∨ Hx) → (Gx&Kx)); ¬∀x(Kx&Gx); ∴ ∃x¬Hx

3. ∃x(Fx&∀y(Gy → Hxy)); ∴ ∃x(Fx&(Ga → Hxa))

4. ∀x∀y(Gxy → (Fy → Hx)); ∀zGaz; ∴ ∃xFx → ∃xHx

5. ∀x(∃yFxy → ∃y¬Gy); ∃x∃yFxy; ∀x(Gx ↔ ¬Hx); ∴ ∃xHx

6. ∀x(Mx → Hx); ∃x∃y((Fx&Mx)&(Gy&Jyx)); ∃xHx → ∀y∀z(¬Hy → ¬Jyz); ∴ ∃x(Gx&Hx)

7. ∀x(∃yFxy → ∀yFyx); ∃x∃yFxy; ∴ ∀x∀yFxy

8. ∀x(Fx ↔ Gx); ∴ ∀xFx ↔ ∀xGx

9. ∀x(Fx ↔ Gx); ∴ ∃xFx ↔ ∃xGx

10. ∃x(Fx → Gx); ∴ ∀xFx → ∃xGx

11. ∃x(Fx&∀y(Gy → Hy)); ∀x(Fx → (¬Lx → ¬∃z(Kz&Hz))); ∴ ∃x(Kx&Gx) → ∃xLx

12. ∃x∀y(∃zFyz → Fyx); ∀x∃yFxy; ∴ ∃x∀yFyx

13. ∀x(Kx → (∃yLxy → ∃zLzx)); ∀x(∃zLzx → Lxx); ¬∃xLxx; ∴ ∀x(Kx → ∀y¬Lxy)

14. ¬∀x(Hx ∨ Kx); ∀x((Fx ∨ ¬Kx) → Gxx); ∴ ∃xGxx

15. ∀x(Fxx → Hx); ∃xHx → ¬∃yGy; ∴ ∀x(Gx → ¬∃zFzz)

16. ∀x(Fx → (Gx ∨ Hx)); ∀x((Jx&Fx) → ¬Gx); ∀x(¬Fx → ¬Jx); ∴ ∀x(Jx → Hx)

17. ¬∃x(Hxa&¬Gxb); ∀x¬(Fxc&Fbx); ∀x(Gdx → Fxe); ∴ ¬(Hea&Fec)

18. ∀x(∃y(Ay&Bxy) → Cx); ∃y(Dy&∃x((Fx&Gx)&Byx)); ∀x(Gx → Ax); ∴ ∃x(Cx&Dx)

19. ∀x∀y∀z((Fxy&Fyz) → Fxz); ¬∃xFxx; ∴ ∀x∀y(Fxy → ¬Fyx)

20. ∀x∀y∀z((Fxy&Fyz) → ¬Fxz); ∴ ∀x¬Fxx

21. ∀x∀y∀z((Fxy&Fyz) → Fxz); ∀x∀y(Fxy → Fyx); ∃x∃yFxy; ∴ ∃xFxx

22. ∀x∀y∀z((Fxy&Fxz) → Fyz); ∀x∀y(Fxy → Fyx); ∀xFxx; ∴ ∀x∀y∀z((Fxy&Fyz) → Fxz)

23. ∀x∀y∀z((Fxy&Fyz) → Fxz); ∀x∀y(Fxy → Fyx); ∀xFxx; ∴ ∀x∀y∀z((Fxy&Fxz) → Fyz)

24. ∀x∀y∀z((Fxy&Fxz) → Fyz); ∴ ∀x∀y∀z((Fxy&Fxz) → Fzy)

25. ∃x∀y¬Fxy; ∴ ∃x∀y∀z(Fxz → Fzy)

26. ∀x(Fx ↔ ∀yGy); ∴ ∀xFx ∨ ∀x¬Fx

27. Fa → (∃xGx → Gb); ∀x(Gx → Hx); ∀x(¬Jx → ¬Hx); ∴ ¬Jb → (¬Fa ∨ ∀x¬Gx)

28. ∀x(Dx → Fx); ∴ Da → (∀y(Fy → Gy) → Ga)

29. ∃xFx → ∀y((Fy ∨ Gy) → Hy); ∃xHx; ¬∀z¬Fz; ∴ ∃x(Fx&Hx)

30. ∃x(Fx&∀y(Ty → Gy)); ∀x(Fx → (∃y(Ay&Gy) → Bxx)); ∃z(Az&Tz); ∴ ∃xBxx

31. ∀x¬Fxc → ∃xGxb; ∴ ∃x(¬Fxc → Gxb)

32. ∀xFx; ∴ ¬∃xGx ↔ ¬(∃x(Fx&Gx)&∀y(Gy → Fy))

33. ∃x(Px&¬Mx) → ∀y(Py → Ly); ∃x(Px&Nx); ∀x(Px → ¬Lx); ∴ ∃x(Nx&Mx)

34. ∀x(Fx → ¬∃y(Gy&Hxy)); ∀x(Fx → ∃y(Fy&Hxy)); ∴ ∀x∀y(Hxy → Hyx) → ∀x¬(Fx&Gx)

35. ¬∀x∀y(Fxy → Fyx); ∴ ¬∀x∀y(Fxy ↔ Fyx)

36. ∀x¬Fxx; ∴ ¬∃x∀y(Fyx ↔ ∃z∀w((Fwz → Fwy)&¬Fzy))

37. ∀x∀y((Ax&By) → Cxy); ∃y(Fy&∀z(Hz → Cyz)); ∀x∀y∀z((Cxy&Cyz) → Cxz); ∀x(Fx → Bx); ∴ ∀z∀y((Az&Hy) → Czy)

Chapter 6

Identity and Function Symbols

This chapter will extend our system of quantificational logic to include a new binary predicate and a new way of forming singular terms. The new predicate, identity, represents the familiar notion of equality in mathematics and an equally familiar concept in natural language. English most often expresses this concept using the word is. New singular terms will be formed with the help of function symbols. Function symbols occur frequently in mathematics; '+', '×', '−', and '÷', not to mention '√', all represent functions. English expressions such as mother of, diameter of, length of, kinetic energy of, hometown of and grade of also represent functions. We've discussed functions at several points in earlier chapters; we will now develop ways of representing them explicitly in our formal language. Quantification theory as it stands can already represent identity by using a dyadic predicate and adopting some special axioms concerning it. The theory can furthermore represent functions and definite descriptions by using polyadic predicates and some accompanying assumptions. But adding identity and function symbols permits quantification theory to formalize very naturally a wide variety of English arguments and mathematical theories. In certain subtle ways, it also adds to quantificational logic's power.

6.1 Identity

To symbolize identity, we'll use the symbol '=', which mathematicians generally use for the same concept. (For negations of identity formulas, we'll use the symbol '≠' to abbreviate '¬ . . . = . . .'. Thus, we'll generally write ¬a = b as a ≠ b.) Obviously, having such a predicate in our logic allows us to work with mathematical formulas with a minimum of translation or alteration. It also allows us to render many English sentences very simply and naturally:

1. (a) Tully is Cicero.

(b) Austin is the capital of Texas.

(c) Ronald Reagan is the President of the United States.

(d) The wealthiest town in the United States is West Hartford, Connecticut.

(e) Baltimore was Babe Ruth's birthplace.

In each of these sentences, the verb to be expresses identity. This, of course, isn't that verb's only job. In Dan's car is red, Phyllis is angry and Peter is coming to town tomorrow it expresses predication. Aristotle was perhaps the first person to recognize the distinction between these two


roles of to be. Philosophers today sometimes refer to this as the contrast between the is of identity and the is of predication. A primary use of the identity symbol in logic is to formalize the is of identity. Adding the identity symbol to our language requires only one small revision in our formation rules. We must add the rule that, where c and d are constants, c = d is a formula. (In fact, it's an atomic formula.) The required addition to our semantic principles is similarly trivial. A formula of the form c = d is true on an interpretation just in case the interpretation function assigns the same object to both c and d. Identity allows us to translate a wide and surprising variety of English constructions. First, we can easily express numerical determiners. In quantificational logic, the existential quantifier means, in essence, at least one. But how can we express, for instance, at least two? We can use two existential quantifiers, but this, in itself, doesn't suffice; '∃x∃y(Fx&Fy)' doesn't rule out the possibility that 'x' and 'y' stand for the same object. To say that there are at least two Fs, we need to say that there is an object x and there is another object y, both of which are Fs. We can say that x and y are different by writing 'x ≠ y'. So the appropriate symbolization becomes '∃x∃y(Fx&Fy&x ≠ y)'. To say that there are at least three Fs, we need to write three existential quantifiers and three negated identities to guarantee that all the objects are distinct. The correct translation is correspondingly '∃x∃y∃z(Fx&Fy&Fz&x ≠ y&y ≠ z&x ≠ z)'. We can apply the same strategy for any number of objects.

There are at least . . . Fs    Translation

One: ∃xFx
Two: ∃x∃y(Fx&Fy&x ≠ y)
Three: ∃x∃y∃z(Fx&Fy&Fz&x ≠ y&y ≠ z&x ≠ z)
Four: ∃x∃y∃z∃w(Fx&Fy&Fz&Fw&x ≠ y&x ≠ z&x ≠ w&y ≠ z&y ≠ w&z ≠ w)
n: ∃x₁ . . . ∃xₙ(Fx₁& . . . &Fxₙ&x₁ ≠ x₂&x₁ ≠ x₃& . . . &x₁ ≠ xₙ&x₂ ≠ x₃& . . . &x₂ ≠ xₙ& . . . &xₙ₋₁ ≠ xₙ)

We can thus symbolize numerical determiners of the form at least n. Notice, however, that the complexity of the resulting formulas increases very quickly. We can also translate sentences containing determiners of the form at most n. Suppose, for example, that we want to say that there is at most one omnipotent being. This assertion does not imply that there is an omnipotent being; it merely states that there is no more than one. The assertion is the negation of There are at least two omnipotent beings, so we might try to negate a formula of the kind we've just seen, obtaining '¬∃x∃y(Ox&Oy&x ≠ y)'. This is equivalent to '∀x∀y((Ox&Oy) → x = y)', which says that, for any x and y, if x and y are both omnipotent, then x and y are identical. It says, that is, that all omnipotent beings are the same. But that means that there is at most one omnipotent being. Similarly, if we want to translate There are at most two superpowers, we can treat it as the negation of There are at least three superpowers, obtaining '¬∃x∃y∃z(Sx&Sy&Sz&x ≠ y&x ≠ z&y ≠ z)'. But this is equivalent to '∀x∀y∀z((Sx&Sy&Sz) → (x = y ∨ x = z ∨ y = z))', which says that, if x, y and z are all superpowers, two of them must be identical. This strategy works for any value of n.


There are at most . . . Fs    Translation

One: ∀x∀y((Fx&Fy) → x = y)
Two: ∀x∀y∀z((Fx&Fy&Fz) → (x = y ∨ x = z ∨ y = z))
Three: ∀x∀y∀z∀w((Fx&Fy&Fz&Fw) → (x = y ∨ x = z ∨ x = w ∨ y = z ∨ y = w ∨ z = w))
n: ∀x₁ . . . ∀xₙ((Fx₁& . . . &Fxₙ) → (x₁ = x₂ ∨ x₁ = x₃ ∨ . . . ∨ x₁ = xₙ ∨ x₂ = x₃ ∨ . . . ∨ x₂ = xₙ ∨ . . . ∨ xₙ₋₁ = xₙ))

Finally, we can translate numerical determiners of the form exactly n. Since There are exactly n Fs is equivalent to a conjunction of There are at least n Fs and There are at most n Fs, we can devise such translations simply by conjoining the translations given in the above tables. But there is an easier way. If we want to say that there is exactly one god, we can say that there is a god, and anything that is a god is identical with it. If we want to say that there are exactly two great American writers, we can say that there are at least two and, moreover, that any great American writer must be identical with one or the other. This suggests the strategy:

There are exactly . . . Fs    Translation

Zero: ∀x¬Fx
One: ∃x∀y(Fx&(Fy → y = x))
Two: ∃x∃y∀z(Fx&Fy&x ≠ y&(Fz → (z = x ∨ z = y)))
Three: ∃x∃y∃z∀w(Fx&Fy&Fz&x ≠ y&x ≠ z&y ≠ z&(Fw → (w = x ∨ w = y ∨ w = z)))
n: ∃x₁ . . . ∃xₙ∀y(Fx₁& . . . &Fxₙ&x₁ ≠ x₂&x₁ ≠ x₃& . . . &x₁ ≠ xₙ&x₂ ≠ x₃& . . . &x₂ ≠ xₙ& . . . &xₙ₋₁ ≠ xₙ&(Fy → (y = x₁ ∨ . . . ∨ y = xₙ)))
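The three tables can be checked against plain counting: over a finite domain, the at least n, at most n, and exactly n translations should agree with the size of F's extension. A sketch of that check (the helper names are my own, and the quantifier clauses are evaluated set-theoretically rather than by substitution):

```python
from itertools import combinations, product

def at_least(F, D, n):
    # ExEy...(F... & pairwise distinct): some n distinct objects are all F
    return any(all(x in F for x in xs) for xs in combinations(D, n))

def at_most(F, D, n):
    # the negation of "there are at least n+1 Fs"
    return not at_least(F, D, n + 1)

def exactly(F, D, n):
    return at_least(F, D, n) and at_most(F, D, n)

D = range(5)
ok = all(
    at_least(F, D, n) == (len(F) >= n)
    and at_most(F, D, n) == (len(F) <= n)
    and exactly(F, D, n) == (len(F) == n)
    for bits in product([False, True], repeat=len(D))
    for F in [{x for x, b in zip(D, bits) if b}]
    for n in range(4)
)
print(ok)  # True
```

The check mirrors the text's observation that exactly n is the conjunction of at least n and at most n, and that at most n is the negation of at least n + 1.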

Second, identity allows us to symbolize sentences containing the words other, another and else. Typically, these words are anaphoric in much the way pronouns are. That is, they carry with them an implicit reference to something introduced by prior discourse. If we say John admires himself. But he hates everybody else, we mean that John hates everybody other than John. A good symbolization of the latter sentence, then, would be '∀x(x ≠ j → Hjx)', which says that everyone who isn't John is hated by John. Of course, the antecedent discourse may have introduced some other referent. If the preceding sentence were John loves Susan, we could take everybody else as meaning "everybody other than Susan" or "everybody other than John and Susan." An appropriate symbolization would be '∀x(x ≠ s → Hjx)' or '∀x((x ≠ s&x ≠ j) → Hjx)'. Unfortunately for the translator, natural languages contain referential ambiguities, and they determine the range of possible interpretations in very complex and still poorly understood ways. In many cases, only one interpretation is possible. If we say A passenger shouted, "This is a hijacking!" and another waved a pistol overhead, we clearly mean that a passenger other than the one who shouted did the pistol-waving. Introducing the obvious predicates, then, we could translate this sentence as '∃x∃y∃z(Px&Sx&Py&y ≠ x&Gz&Wyz)' with all quantifiers in front, or, more naturally, as '∃x((Px&Sx)&∃y(Py&y ≠ x&∃z(Gz&Wyz)))'. Third, identity allows us to translate superlatives such as fastest and most interesting into formulas containing predicates that represent comparatives such as faster and more interesting. Suppose that 'F' symbolizes is faster than. To express This is the fastest car, we can say that it's faster than every other car. That is, we can say that this car is faster than every car not identical with it. So an appropriate translation would be '∀x((Cx&x ≠ a) → Fax)', where 'a' stands for this car. (We might also want to add a subformula 'Ca', to say that this is a car.) If we want to say that this is

Logic, Sets and Functions

126

the slowest car, we can say that every other car is faster than it: '∀x((Cx & x ≠ a) → Fxa)'. Notice that, if this is the fastest car, then it follows that no car is faster than it. But these two statements aren't equivalent. Suppose there are several cars that tie for the title of fastest car in the universe. No car will be faster than any of them, but none will be the fastest car; each will be one of the fastest, but not the fastest.

Superlatives thus translate readily into formulas containing dyadic predicates representing correlated comparatives. One must be very cautious in translating, however, to avoid confusing the direction of the comparison and mixing up more and less:

a is more F than b    Fab
a is the most F       ∀x(x ≠ a → Fax)
a is the least F      ∀x(x ≠ a → Fxa)
a is less G than b    Gab
a is the most G       ∀x(x ≠ a → Gxa)
a is the least G      ∀x(x ≠ a → Gax)

The order of the variable and constant in these formulas makes all the difference between more and less, most and least. In our examples above, we had to add a predicate to these schemas; we wanted to say, not that this was the fastest thing, but that this was the fastest car.

Finally, identity permits us to symbolize many sentences with the word only. Recall that we can translate sentences of the form Only F G as '∀x(¬Fx → ¬Gx)' or, equivalently, '∀x(Gx → Fx)'; as equivalent, that is, to No non-F G or All G F. Identity allows us to translate as well sentences involving only in combination with singular terms. Consider, for example, Only Elmo got drunk. We could paraphrase this as Only those identical with Elmo got drunk. So, applying the strategy we devised for only-general-term combinations, this should be equivalent to All who got drunk were identical with Elmo, or, in symbolic terms, '∀x(Dx → x = e)'. Equivalently, we could say that nobody but Elmo got drunk: '∀x(x ≠ e → ¬Dx)' or '¬∃x(Dx & x ≠ e)' or '∀x(¬Dx ∨ x = e)'.
These formulas seem surprising, for they do not imply that Elmo got drunk. Only Elmo got drunk is at least extremely misleading, if not false, if Elmo stayed sober. The translation of Only Elmo got drunk as 'De & ∀x(Dx → x = e)' ("Elmo, and only Elmo, got drunk") thus better harmonizes with certain linguistic intuitions.
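That '∀x(Dx → x = e)' does not imply 'De' can be confirmed by a brute-force search for a countermodel over small domains. A hedged Python sketch; the domains and the search bound are arbitrary choices for illustration:

```python
# Searching for an interpretation that makes ∀x(Dx → x = e) true while
# 'De' is false.  Finding one confirms that "Only Elmo got drunk," so
# symbolized, does not imply that Elmo got drunk.

from itertools import product

def countermodel(max_size=3):
    for size in range(1, max_size + 1):
        domain = range(size)
        for e in domain:                          # referent of 'e'
            for D in product([False, True], repeat=size):  # extension of 'D'
                only_elmo = all((not D[x]) or x == e for x in domain)
                if only_elmo and not D[e]:
                    return {'domain': list(domain), 'e': e, 'D': D}
    return None

print(countermodel())  # a one-element domain where nobody got drunk
```

The very first interpretation tried, a one-element domain in which nothing satisfies 'D', already makes the premise vacuously true and 'De' false.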

Problems

Symbolize these sentences in Q with identity. If a sentence is ambiguous, explain why.

1. April is the cruelest month. (T. S. Eliot)
2. Twice no one dies. (Thomas Hardy)
3. Philosophy is the highest music. (Plato)
4. To work for the common good is the greatest creed. (Albert Schweitzer)
5. This poem is the reader and the reader this poem. (Ishmael Reed)
6. Liberty is always dangerous, but it is the safest thing we have. (Harry Emerson Fosdick)
7. Action is the last resource of those who know not how to dream. (Oscar Wilde)
8. The seed ye sow, another reaps; the wealth ye find, another keeps; the robes ye weave, another wears; the arms ye forge, another bears. (Percy Bysshe Shelley)
9. The most precious thing a parent can give a child is a lifetime of happy memories. (Frank Tyger)


10. That action is best, which procures the greatest happiness for the greatest number. (Francis Hutcheson)
11. The worst-tempered people I've ever met were people who knew they were wrong. (Wilson Mizner)
12. The most valuable executive is one who is training somebody to be a better man than he is. (Robert Ingersoll)
13. I hold that man in the right who is most closely in league with the future. (Henrik Ibsen)
14. An executive organization, like a chain, is no stronger than its weakest link. (Robert Patterson)
15. In cases of difficulty and when hopes are small, the boldest counsels are the safest. (Livy)
16. Behold, the people is one, and they have all one language; . . . and now nothing will be restrained from them, that they have imagined to do. (Genesis 11:6)
17. Once upon a time there were Three Bears, who lived together in a house of their own, in a wood. (Robert Southey)
18. * He prayeth best, who loveth best all things both great and small. (Samuel Taylor Coleridge)

Sentences 19, 20, 23 and 24 are cited by James McCawley; each involves only in combination with singular terms.

19. Only Lyndon pities himself.
20. Only Lyndon pities Lyndon.
21. Lyndon pities only Lyndon.
22. Lyndon pities only himself.
23. * Only Lyndon pities only Lyndon.
24. * Only Lyndon pities only himself.
25. * Only Lyndon pities only those who pity only Lyndon.

6.2 Deduction Rules for Identity

The two principles of self-identity and the indiscernibility of identicals underlie natural deduction rules for identity. The rule for identity introduction is extremely simple: whenever you wish, in the course of a proof, you may record c = c, for any constant c.

Identity Introduction (=I)

n. c = c        =I


This rule allows a very quick proof of the principle that everything is self-identical:

1. Show ∀x x=x
2.   Show a=a
3.     a=a        =I

The rule of identity exploitation states that, for any constants c and d, c = d justifies substituting c for d, or d for c, in any formula free at that point in the proof.

Identity Exploitation (=E)

n. c = d
m. A
p. A[c//d] (or A[d//c])        =E, n, m

Here A[c//d] is any result of substituting c for some or all occurrences of d throughout A. If A is 'Fdd', for instance, A[c//d] could be 'Fcd', 'Fdc', or 'Fcc'. This rule allows us to prove that identity is symmetric: that the order of the terms in an identity statement, in other words, makes no difference.
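As an aside, the set of results A[c//d] can be enumerated mechanically by letting each occurrence of d independently stay or be replaced. A Python sketch, treating formulas as plain symbol strings (an illustrative simplification, not the book's official syntax):

```python
# Enumerating A[c//d]: each occurrence of d may independently stay or be
# replaced by c, and at least one occurrence must be replaced.  Formulas
# are treated as plain symbol strings, an illustrative simplification.

from itertools import product

def substitutions(formula, c, d):
    """All results of substituting c for some or all occurrences of d."""
    positions = [i for i, sym in enumerate(formula) if sym == d]
    results = set()
    for choice in product([False, True], repeat=len(positions)):
        if not any(choice):
            continue                 # require at least one replacement
        chars = list(formula)
        for pos, replace in zip(positions, choice):
            if replace:
                chars[pos] = c
        results.add(''.join(chars))
    return results

print(sorted(substitutions('Fdd', 'c', 'd')))  # ['Fcc', 'Fcd', 'Fdc']
```

The output matches the three results listed in the text for A = 'Fdd'.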

1. Show ∀x∀y(x=y → y=x)
2.   Show ∀y(a=y → y=a)
3.     Show (a=b → b=a)
4.       a=b        ACP
5.       a=a        =I
6.       b=a        =E, 5, 4

We can reach line 5 here either by introducing 'a = a' by identity introduction or by substituting 'a' for 'b' in 'a = b'. Identity exploitation also allows us to prove that identity is transitive.

1. Show ∀x∀y∀z((x=y & y=z) → x=z)
2.   Show ∀y∀z((a=y & y=z) → a=z)
3.     Show ∀z((a=b & b=z) → a=z)
4.       Show ((a=b & b=c) → a=c)
5.         (a=b & b=c)        ACP
6.         a=b                &E, 5
7.         b=c                &E, 5
8.         a=c                =E, 6, 7

We begin by using a triple universal proof. We reach the crucial line, in which 'a = c' appears, by using identity exploitation. We can think of ourselves as substituting 'a' for 'b' in 'b = c', or as substituting 'c' for 'b' in 'a = b'. Either way, we end up with 'a = c'.

To take a final example, suppose we are faced with the argument, John is selfish. But everybody else is selfish too. So all people are selfish. Assuming that John is a person, we can let our universe of discourse consist just of people, and so translate this

Sj
∀x(x ≠ j → Sx)
∴ ∀xSx

To show that this argument form is valid, we can construct a universal proof, which contains an indirect proof.

1. Sj                      A
2. ∀x(x ≠ j → Sx)          A
3. Show ∀xSx
4.   Show Sa
5.     ¬Sa                 AIP
6.     (a ≠ j → Sa)        ∀E, 2
7.     ¬¬a = j             →E*, 6, 5
8.     a = j               ¬¬, 7
9.     Sa                  =E, 1, 8

We want to show that some arbitrary person, say Ann, is selfish. So we assume, for purposes of indirect proof, that she isn't selfish. Since everybody other than John is selfish, but Ann isn't, Ann must be John. But John is selfish as well; so if Ann is John, then, by identity exploitation, she's selfish. But this contradicts our assumption that she isn't selfish.
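The deduction establishes validity outright; as a complementary check, one can also search small interpretations for a countermodel by brute force. A Python sketch (the bound of four-element domains is an arbitrary illustrative choice):

```python
# A brute-force semantic companion to the deduction above: search for an
# interpretation making the premises true and the conclusion false.
#   Premises: Sj, ∀x(x ≠ j → Sx)    Conclusion: ∀xSx
# The bound of four-element domains is an arbitrary illustrative choice.

from itertools import product

def counterexample(max_size=4):
    for size in range(1, max_size + 1):
        domain = range(size)
        for j in domain:                                   # referent of 'j'
            for S in product([False, True], repeat=size):  # extension of 'S'
                premise1 = S[j]
                premise2 = all(S[x] for x in domain if x != j)
                conclusion = all(S[x] for x in domain)
                if premise1 and premise2 and not conclusion:
                    return (size, j, S)
    return None

print(counterexample())  # None: no countermodel found, as the proof predicts
```

Such a search can only confirm validity over the domains it tries; the deduction covers all interpretations at once.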

Problems

We can say that there is one and only one God, in symbolic terms, by writing the formula '∃x∀y(y = x ↔ Fy)'. Show that each of the following is a consequence of this formula.

1. ∃x(Fx & Gx) ↔ ∀x(Fx → Gx)
2. ∀xFx → ∃x(Fx & Gx)
3. ∃x(Fx & Gxx) ↔ ∃x∃y(Fx & Fy & Gxy)
4. ∃x(Fx & (Ga → Hx)) ↔ (Ga → ∃x(Fx & Hx))
5. ∃x(Fx & (Gx → Ha)) ↔ (∃x(Fx & Gx) → Ha)
6. ∃x(Fx & ¬Gx) ↔ ∀x(Fx → ¬Gx)
7. ∃x(Fx & ∀yGyx) ↔ ∀x∃y(Fy & Gxy)
8. ∀x∀y((Fx & Fy) → x = y)
9. ∀xFx → ∀x∀y x = y
10. ∃x∃y∀z(x ≠ y & (z = x ∨ z = y)) ↔ ∃x∀y(y = x ↔ ¬Fy)

6.3 Function Symbols

Function or operation symbols combine with constants, variables and other function symbols to form singular terms. They are extremely common in mathematics. The basic arithmetic operations of addition, subtraction, multiplication and division, for example, are all functions. Function symbols also translate many English possessives and of constructions:

the length of this line
John's birthday
Barbara's father
the truth value of the sentence
Jill's house
the mass of an electron
somebody's BMW
the trunk of my Camaro
nobody's honor
the President of the United States

Each of these expressions translates into quantification theory with the help of function symbols.



All functions take a certain number of arguments, i.e., inputs. A function taking just one input is a singulary function. Binary functions take two inputs; in general, n-ary functions take n inputs. Addition, multiplication, subtraction and division are all binary functions. Taking a square and taking a square root are both singulary functions.

Let's agree to call constants, variables, and function terms singular terms, or, more simply, terms. To form a function term, we must concatenate an n-ary function symbol with n constants, variables, or other terms. The string of n terms must be enclosed in parentheses and separated by commas. Where 'f', 'g' and 'h' are singulary, binary and ternary function symbols, respectively, we can form the function terms

f(a)              g(a, a)                    h(a, b, c)
f(x)              g(a, x)                    h(x, y, z)
f(f(b))           g(y, b)                    h(a, z, b)
f(f(y))           g(x, y)                    h(f(a), f(b), f(c))
f(g(a, a))        g(a, g(a, a))              h(g(x, x), x, x)
f(h(x, y, z))     g(f(x), h(b, f(y), z))     h(h(x, y, z), f(z), g(z, x))

Notice that function terms may appear inside other function terms. Function symbols of three or more places are usually written in front of their argument terms, with parentheses marking the boundaries. But singulary and binary function terms may be written in other ways. We may write binary function symbols between their argument terms, supplying parentheses only when necessary to avoid ambiguity. Thus mathematicians generally write '2 + 3' rather than '+(2, 3)', 'a ∪ b' rather than '∪(a, b)', etc. Indeed, we've done the same in sentential logic. Sentential connectives express truth functions, but we've written the binary connectives between the formulas they link together. Thus we've written the formulas on the left rather than the versions on the right:

P ∨ Q              ∨(P, Q)
(P → Q) & R        &(→(P, Q), R)
R ↔ (P & ¬R)       ↔(R, &(P, ¬R))

In fact, since we know which connectives are singulary and which are binary, we don't need the parentheses in the expressions on the right. So we could express the formulas more simply:

P ∨ Q              ∨(P, Q)              ∨PQ
(P → Q) & R        &(→(P, Q), R)        &→PQR
R ↔ (P & ¬R)       ↔(R, &(P, ¬R))       ↔R&P¬R
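The parenthesis-free forms above can be read mechanically in a single left-to-right pass, since each connective's arity is fixed. A Python sketch of such a reader; the ASCII symbols '-', '&', 'v', '>', '=' are stand-ins for the book's connectives, an illustrative assumption:

```python
# Reading parenthesis-free (Polish) prefix notation: because every
# connective's arity is fixed, a single left-to-right pass suffices.
# ASCII stand-ins for the connectives: '-' negation, '&' conjunction,
# 'v' disjunction, '>' conditional, '=' biconditional.  Sentence letters
# are any other single characters.

ARITY = {'-': 1, '&': 2, 'v': 2, '>': 2, '=': 2}

def evaluate(formula, valuation):
    """Evaluate a prefix-notation formula under a truth-value assignment."""
    def parse(i):
        sym = formula[i]
        if sym not in ARITY:                 # a sentence letter
            return valuation[sym], i + 1
        if ARITY[sym] == 1:                  # negation
            val, j = parse(i + 1)
            return (not val), j
        left, j = parse(i + 1)               # binary connective
        right, k = parse(j)
        if sym == '&':
            return (left and right), k
        if sym == 'v':
            return (left or right), k
        if sym == '>':
            return ((not left) or right), k
        return (left == right), k            # biconditional
    value, end = parse(0)
    assert end == len(formula), "leftover symbols after the formula"
    return value

# '&>PQR' is the prefix form of (P -> Q) & R:
print(evaluate('&>PQR', {'P': True, 'Q': False, 'R': True}))  # False
```

Each connective consumes exactly as many subformulas as its arity, which is why no parentheses are ever needed.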

The third column shows the form in which computers generally read logical formulas and other instructions. A group of Polish logicians developed and wrote in this notation earlier in this century; for this reason, it's called Polish notation.

To incorporate function terms into our language, we need to change its formation rules slightly. Officially, we'll maintain the standard notation for function terms, though we'll allow ourselves to use forms in more frequent use when that proves convenient. Lower-case letters from the middle of the alphabet ('f', 'g', 'h', 'i', etc., with or without subscripts) will serve as our function symbols. We'll say that a function term is open if it contains some occurrences of variables, and closed if it doesn't. A closed term is either a constant or a closed function term. We may amend our formation rules to read:

1. An n-ary predicate followed by n closed terms is a formula.

2. If t and t' are closed terms, then t = t' is a formula.

3. If A and B are formulas, then so are ¬A, (A&B), (A∨B), (A→B), and (A↔B).


4. If A is a formula with a closed term t and v is a variable not in A, then ∃vA[v/t] and ∀vA[v/t] are formulas.

5. Every formula may be constructed by means of finitely many applications of rules (1)-(4).

Semantically, function symbols stand for functions on the domain; closed function terms stand for individual elements of the domain. Function terms thus differ sharply from formulas. Formulas are either true or false; closed function terms stand for objects. Connectives may join together formulas, but not function terms: 'f(a) → g(b, c)', which, in English, would correspond to something like If Ann's mother, then Bill and Carlota's anniversary, makes no sense. Similarly, function terms may flank the identity sign, but formulas may not. A formula such as 'f(a) = g(b, c)' could represent an English sentence such as Ann's birthday is Bill and Carlota's anniversary, but '∃xFx = Ga' is incoherent. The closest English rendering of it would be something like There is a fish is Al is crazy.

That function symbols stand for functions on the domain has a very important consequence. Functions are relations between objects. But two things distinguish functions from other relations. First, given some input to the function, we always obtain an output. Any object stands in the appropriate relation to some object. Second, given an input, we always obtain a unique output. Each object stands in the appropriate relation to only one object. Together these requirements imply that, given any input, we must obtain exactly one output. We may refer to the first requirement as the existence requirement; the second, as the uniqueness requirement. Because function symbols stand for functions, we may use a function symbol to represent an English expression only when these requirements, intuitively speaking, are satisfied.

Mathematical expressions are easiest to evaluate in light of these requirements. Consider addition. Any two numbers have exactly one sum, so the requirements are satisfied. Or, consider multiplication. Any two numbers have a unique product, so multiplication is a function, and we may represent it with a function symbol. Subtraction and division are slightly more complex. If we include negative numbers, then subtraction is a function: any two numbers, taken in a given order, have a unique difference. Without negative numbers, however, the existence requirement fails; no positive number is 3 - 5. Division is almost a function, but not quite, even if we include fractions, since, for any n, n/0 is undefined. It is a function only on the nonzero numbers.

Satisfying these requirements is important, because, armed with a function symbol, we can prove existence and uniqueness. The use of a function symbol thus presupposes them. To see this, we need to know how to handle function terms within quantification theory. We need no new rules. Recall our definition:

Definition 6.1 A formula A[t/v] is an instance of a formula ∃vA or ∀vA iff it results from dropping the main connective of that formula and substituting a closed term t for every occurrence of the variable v throughout A.

We can form instances, then, by dropping quantifiers and substituting closed function terms as well as constants for variables. Both 'Fa' and 'Ff(a)' count as instances of '∃xFx'. Since ∀E and ∃I speak of terms in general, both rules allow us to work with any instance of a formula. Universal exploitation permits us to deduce from a universal formula any of its instances, while existential introduction allows us to deduce an existential from any of its instances.

We cannot use function terms in applying our existential exploitation rule or our universal proof method, however. Existential exploitation requires that we use a constant completely new to the proof in taking an instance. Universal proof similarly allows us to prove a universal formula by proving an instance with a new constant. Whenever our rules have required a new constant, then, they will continue to do so: ∃E and universal proof both require a new constant.


It's easy to see that we can prove the existence and uniqueness of values of functions, given any set of inputs in the domain. The existence requirement states that a function must yield a value for any input. We can symbolize this as '∀x∃y y = f(x)'. This is valid, as the following proof shows:

1. Show ∀x∃y y=f(x)
2.   Show ∃y y=f(a)
3.     f(a)=f(a)        =I
4.     ∃y y=f(a)        ∃I, 3

Notice that, in applying ∃I in the proof, we substituted a variable for a closed function term. The instance of the relevant quantified formula resulted from substituting a closed function term rather than a constant for the quantified variable.

Uniqueness is almost as easy to prove valid. The uniqueness requirement states that for any input there is a unique output to the function. We can symbolize this by construing it as stating that there is at most one output to the function, given any input: '∀x∀y∀z((y = f(x) & z = f(x)) → y = z)'.

1. Show ∀x∀y∀z((y=f(x) & z=f(x)) → y=z)
2.   Show ∀y∀z((y=f(a) & z=f(a)) → y=z)
3.     Show ∀z((b=f(a) & z=f(a)) → b=z)
4.       Show ((b=f(a) & c=f(a)) → b=c)
5.         (b=f(a) & c=f(a))        ACP
6.         b=f(a)                   &E, 5
7.         c=f(a)                   &E, 5
8.         b=c                      =E, 6, 7

To see why observing these requirements in translation from natural language is so important, consider these arguments.

2. Your mother is your parent. Your father is your parent.
∴ Your mother is your father.

Although we can reasonably translate mother and father as function symbols, we can't so translate parent; among humans, parents come in pairs. Parent thus violates the uniqueness requirement. Argument (2) is certainly invalid. But it would appear valid if we were to render parent as a function symbol. All that we need to establish the validity of the argument form is the principle of the indiscernibility of identicals, expressed in the rule =E:

1. m(a)=g(a)          A
2. f(a)=g(a)          A
3. Show m(a)=f(a)
4.   m(a)=f(a)        =E, 1, 2

To see the trouble that arises if we ignore the existence requirement, consider this terrible argument, surely belied by the baby boom:

3. No babies have children.
∴ There are no babies.

We shouldn't translate child of as a function symbol, since it fails to meet either requirement. Many people have no children, while others have more than one. If we nevertheless symbolize the argument using a function symbol, the resulting argument form will be valid:

1. ∀x(Bx → ¬∃y y=g(x))        A
2. Show ¬∃xBx
3.   ∃xBx                     AIP
4.   Ba                       ∃E, 3
5.   (Ba → ¬∃y y=g(a))        ∀E, 1
6.   ¬∃y y=g(a)               →E, 5, 4
7.   ∀y ¬y=g(a)               QN, 6
8.   ¬g(a)=g(a)               ∀E, 7
9.   g(a)=g(a)                =I

Using function symbols in translation is legitimate, strictly speaking, only if the existence and uniqueness requirements are satisfied. This is why many mathematicians demand a proof of existence and uniqueness before they admit a function or operation symbol into a mathematical theory.

Nevertheless, we can relax these requirements somewhat when translating natural language arguments. Many function symbols satisfy existence and uniqueness on only a portion of the domain. If we use 'f' to symbolize the Social Security number of, for example, we must restrict its application to constants denoting people. The domain will contain both people and numbers, and we don't want to commit ourselves to talking about the Social Security numbers of numbers. This is an informal restraint; we could impose it formally by restricting our logical rules when we use function symbols. Alternatively, we might let any intuitively silly function term, such as the Social Security number of 17, name some specially designated "dummy" object.
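When the domain is finite and a candidate relation is given as a set of pairs, the existence and uniqueness requirements can be checked mechanically. A Python sketch; the people and the toy 'mother'/'parent' relations below are invented for illustration:

```python
# Checking the existence and uniqueness requirements over a finite domain.
# A relation is represented as a set of (input, output) pairs; only when
# both checks pass may the relation be written with a function symbol.
# The people and the toy relations below are invented for illustration.

def existence(domain, relation):
    """Every object bears the relation to at least one object."""
    return all(any((x, y) in relation for y in domain) for x in domain)

def uniqueness(domain, relation):
    """No object bears the relation to more than one object."""
    return all(
        sum(1 for y in domain if (x, y) in relation) <= 1
        for x in domain
    )

people = {'ann', 'bill', 'carlota'}
# toy 'mother of': each person paired with exactly one person
mother = {('ann', 'carlota'), ('bill', 'carlota'), ('carlota', 'carlota')}
# toy 'parent of': ann has two parents, so uniqueness fails
parent = {('ann', 'bill'), ('ann', 'carlota'),
          ('bill', 'bill'), ('carlota', 'bill')}

print(existence(people, mother), uniqueness(people, mother))  # True True
print(existence(people, parent), uniqueness(people, parent))  # True False
```

The 'parent' relation passes existence but fails uniqueness, mirroring the mother/father argument above.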

Problems

Symbolize these sentences in Q with identity and function symbols.

1. A man's judgment and his conscience is the same thing. (Thomas Hobbes)
2. Our birth is but a sleep and a forgetting. (William Wordsworth)
3. The painter's brush consumes his dreams. (William Butler Yeats)
4. If the mind of the teacher is not in love with the mind of the student, he is simply practicing rape, and deserves at best our pity. (Adrienne Rich)
5. Nobody loves me but my mother, and she could be jivin' too. (B. B. King)
6. A fool despises his father's instruction. (Proverbs 15:5)
7. The friend I can trust is the one who will let me have my death. The rest are actors who want me to stay and further the plot. (Adrienne Rich)
8. If you see in any situation only what everybody else can see, you can be said to be so much a representative of your culture that you are a victim of it. (S. I. Hayakawa)
9. In a hierarchy every employee tends to rise to his level of incompetence. (Laurence J. Peter)
10. Common sense holds its tongue. (Proverbs 10:19)
11. Not all those who know their minds know their hearts as well. (La Rochefoucauld)
12. All that we send into the lives of others comes back into our own. (Edwin Markham)
13. The cleverly expressed opposite of any generally accepted idea is worth a fortune to somebody. (F. Scott Fitzgerald)


1 4 . A clever man is wise and conceals everything, but the stupid parade their folly. (Proverbs 1 3 : 16) 15. The man with no inner life is the slave of his surroundings. (Henri Frederic Amiel) Use natural deduction to verify the validity of these arguments. 1 6 . Nobody's perfect. Consequently, nobody's mother is perfect. 1 7. Tanya and Tatiana are the same person. So Tanya's Social Security number is the same as Tatiana's. 18. Everybody is his or her own best friend. Therefore, each person's best friend is their best friend's best friend. 19. Suppose that the name of the name of a thing is always just the thing's name itself. Then it follows that anything that names anything is a name of itself. 20. Suppose that anything that names anything is a name of itself. It follows that the name of the name of a thing is always just the thing's name itself. 2 1 . For every man there is a priest who is his intermediary with God. Every priest is a man. No one is intermediary between God and God. Therefore, God is neither man nor priest. 22. Augustus is Superman's father. Augustus knows everybody. Somebody can beat up Superman if and only if he or she can do anything. All failure results from a lack of self-knowledge. Therefore, Superman's father can beat him up. 23. No matter who you are, the being with the ultimate power over your actions is God. One person has responsibility for another's actions just in case that person has ultimate power over the other's acts. A person can be blamed, from a moral point of view, only if he or she has responsibility for his or her own actions. Thus nobody but God can be morally blamed. 24. Nothing is demonstrable unless the contrary implies a contradiction. Nothing that is distinctly conceivable implies a contradiction. Whatever we conceive as existent, we can also conceive as nonexistent. There is no being, therefore, whose nonexistence implies a contradiction. Conse­ quently there is no being whose existence is demonstrable. 
(David Hume) [Note: this is really two arguments: (i) Nothing that is distinctly conceivable implies a contradiction. Whatever we conceive as existent, we can also conceive as nonexistent. :. There is no being whose nonexistence implies a contradiction. (ii) There is no being whose nonexistence implies a con­ tradiction. Nothing is demonstrable unless the contrary implies a contradiction. Consequently there is no being whose existence is demonstrable. To give these a fair hearing, assume that the nonexistence of a thing is the contrary of its existence.] Using natural deduction, show that these argument forms are valid. 25. VxFx; :. Vx(Gxa -+ Fh(x, a) ). 26.

26. ∀x∃y∃z x = f(y, z); ∴ ∀x∀yGf(x, y) → ∀xGx.

27. ∀x∀y x = y; ∴ ∀x x = f(x).

28. ∀x x = f(a); ∴ ∀x a = f(x).

29. ∀x g(f(x)) = x; ∴ ∀x∀y(f(x) = y → g(y) = x).

30. ∀x∀y(f(x) = y → g(y) = x); ∴ ∀x g(f(x)) = x.

31. ∀xFf(x)x; ∴ ∀xFf(f(x))f(x).

32. ∃x(x ≠ a & Gx); ∴ ∃xGx & (Ga → ∃x∃y(Gx & Gy & x ≠ y)).

33. ∀x x = f(x); ∴ ∀x f(x) = f(f(f(x))).

34. ∃x∃y(x ≠ y & ∀z(z = x ∨ z = y)); ∀x∀y(f(x) = f(y) → x = y); ∴ ∀x∃y x = f(y).

35. * Suppose there are functions f and g such that (a) ∀x f(x, 0) = x, (b) ∀x f(x, g(x)) = 0, and (c) ∀x∀y f(x, y) = f(y, x). Show that ∃x x = g(x) and that ∃x f(x, x) = x.

36. * Suppose that there is a binary function L satisfying the axioms (a) ∀x∀y y = xLy and (b) ∀x∀y xLy = yLx. Show that the domain of such a function must include at most one object: ∀x∀y x = y.

Leibniz developed a calculus concerning the combination of concepts, using a single function for combination and a single predicate "is in" or "is contained by", defined in terms of the combination function. Using Leibniz's symbol '∘' for combination and '<' for "is in", we can present Leibniz's theory in one definition and two axioms.

Definition. ∀x∀y(x < y ↔ ∃z x∘z = y)

Axiom 1. ∀x∀y x∘y = y∘x

Axiom 2. ∀x x∘x = x

Leibniz proceeds to prove the following propositions. Show that they follow from these axioms, together with the associativity principle:

Axiom 3. ∀x∀y∀z x∘(y∘z) = (x∘y)∘z

37. * ∀x x < x

38. * ∀x∀y∀z(x < y → z∘x < z∘y)

39. * ∀x∀y(x∘y = x → y < x)

40. * ∀x∀y(x < y → y∘x = y)

41. * ∀x∀y∀z((x < y & y < z) → x < z)

42. * ∀x∀y∀z(x∘y < z → y < z)

43. * ∀x∀y∀z(y < z → y < z∘x)

44. * ∀x∀y((x < y & y < x) → x = y)

45. * ∀x∀y∀z((x < y & z < y) → x∘z < y)

46. * ∀x∀y∀z∀w((x < y & z < w) → x∘z < y∘w)

Chapter 7

Sets

Sets are collections or groups of objects. As natural as this idea might seem to us today, the study of sets is relatively recent. A German mathematician, Georg Cantor (1845-1918), wrote a series of papers, beginning in 1874, that began the branch of mathematics known as set theory. Later work by Ernst Zermelo (1871-1953), Abraham Fraenkel (1891-1965), John von Neumann (1903-1957), Bertrand Russell (1872-1970), Alfred North Whitehead (1861-1947), and others demonstrated that set theory suffices for the construction of the entire body of classical mathematics. That is, every mathematical statement can be translated into a statement of set theory; furthermore, every mathematical theorem, under this translation, becomes a theorem of set theory. The study of sets is thus the study of mathematics in an extremely general form.

Sets have practical as well as theoretical importance. Virtually every branch of mathematics uses sets extensively. Sets similarly form part of the foundation of computer science, linguistics, advanced areas of logic, and other disciplines that emphasize rigor. It is no exaggeration to say that all these realms of inquiry presuppose set theory.

In this chapter we'll present a fairly simple theory of sets. It constitutes a fragment of a much more powerful theory, Zermelo-Fraenkel set theory with urelements (ZFU), which is the set theory most commonly used by mathematicians and computer scientists (though not mathematical logicians). The language of our theory is simple. It includes all the logical symbols of QL, the language of quantificational logic with identity. It contains three nonlogical symbols: a binary predicate meaning "is a member of"; a singulary predicate meaning "is a set"; and a primitive symbol, similar to a function symbol, that is intuitively understood to pick out collections of objects.
Later we'll introduce other predicates and function symbols, but all will be defined with the symbols already mentioned. The predicate "is a set" and the function symbol defining collections, therefore, allow us to define any other idea used in our version of set theory or in the whole of mathematics.

7.1 Extensionality

The theory we'll present in this chapter is a system of axiomatic set theory. The rules of inference of this theory will include those of our quantified natural deduction system. Unlike a system of natural deduction, however, our theory will have axioms, which convert to new rules that we will be using. They serve as starting points for the theory. All other theorems are proved by starting with the axioms and using the rules. Our theory will divide the world into two kinds of entity: sets and nonsets. We want to be able to talk about collections of grapefruit, or birds, or real numbers, or cities, or other sets. Because we want our theory to be as general as possible, we won't worry about what, specifically, the nonsets are. We'll refer to them as urelements, the basic objects that


we will count as given. Our theory will have little to say about these objects. It will take them as unanalyzable and, in particular, as not being decomposable into elements in the way sets are. We will introduce the singulary predicate S to mean "is a set". The formula Sa thus says that a is a set; ¬Sa says that a is not a set, that is, that a is an urelement. The most basic concept of set theory is membership. The binary predicate ∈, read "is a member of", "is an element of" or "belongs to", is the only other undefined predicate in the language ZL. We'll write this symbol between its arguments. Thus, a ∈ b means that a belongs to, or is an element of, b. To abbreviate ¬a ∈ b, we'll write a ∉ b. Sets, in general, decompose into elements; urelements don't. So, whenever we know that something is an urelement, we can conclude that it has no members. Equivalently, if something has members, it must be a set. So we'll allow ourselves to apply the rules:

Sets (S)

    ¬Sx
    ──────────      S
    ∀y y ∉ x

    y ∈ x
    ──────────      S
    Sx

Sets, as we've said, are collections of things. The order in which these things appear is irrelevant; so is any repetition. The only thing that makes a difference, in defining a set, is what objects are members of the set. This is the content of the first axiom of set theory, the axiom of extensionality.

Axiom 7.1 (Extensionality) Sets with the same members are identical. Symbolically:

Axiom of Extensionality    ∀x∀y((Sx & Sy) → (∀z(z ∈ x ↔ z ∈ y) → x = y))

This axiom gives us an identity criterion for sets. We have written our axiom of extensionality so that it rules out identifying urelements by means of their elements. If we had not, that would have had disastrous consequences for our theory of urelements: if extensionality held of urelements, there would be only one urelement! That is, we have claimed only that sets have their identity defined by their members. How do we determine whether set a is the same as set b? We look at their members. This might seem like a trivial matter. But notice that the same collection may be described in very different ways. We might describe the collection containing just the two cities Pittsburgh and Philadelphia, for example, as the set of the two largest cities in Pennsylvania; as the set of Pennsylvania cities with metropolitan area populations of over two million; as the set of cities with National League pennant-winning teams in 1909 and 1915; as the set of cities with baseball teams finishing first in the National League East between 1974 and 1979; and so on. No matter which description we use, the set described is the same. Extensionality yields a useful rule of inference, which we'll call extensionality exploitation. This rule allows us to replace an expression of the form ∀z(z ∈ a ↔ z ∈ b) with one of the form a = b, provided that a and b are sets.


Extensionality Exploitation (Ext)

    Sa
    Sb
    ∀z(z ∈ a ↔ z ∈ b)
    ─────────────────      Ext
    a = b

Of course, we can perform the converse inference, from a = b to ∀z(z ∈ a ↔ z ∈ b), solely by using identity exploitation. Membership is the most important relation between collections. But there are other very frequently used notions that can be defined in terms of membership.
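The point about multiple descriptions picking out a single set can be checked mechanically. Here is a minimal Python sketch (the city facts are stipulated purely for illustration): two different membership conditions that happen to select the same members yield, by extensionality, one and the same set.

```python
# Two different "descriptions" (membership conditions) over a small domain.
# The city facts here are stipulated purely for illustration.
domain = {"Pittsburgh", "Philadelphia", "Erie", "Allentown"}
two_largest = {"Pittsburgh", "Philadelphia"}       # stipulated description 1
pennant_winners = {"Pittsburgh", "Philadelphia"}   # stipulated description 2

set_a = {x for x in domain if x in two_largest}
set_b = {x for x in domain if x in pennant_winners}

# Extensionality: sets with the same members are identical.
same_members = all((x in set_a) == (x in set_b) for x in domain)
print(set_a == set_b, same_members)
```

Python's `==` on sets compares exactly by membership, so it behaves like the Ext rule above.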

Definition 7.1 Something is a subset of another iff it is a set and every member of it is also a member of the other. Symbolically, ∀x∀y(x ⊆ y ↔ (Sx & ∀z(z ∈ x → z ∈ y))).

11. * Suppose that b ∉ a and that {{a}, a} = {{b}, b}. Then a = b.

12. * For any four objects x, y, z and w, {x, y} = {z, w} only if either (x = z & y = w) or (x = w & y = z).

13. * Two sets are identical just in case they belong to the same sets.

14. * Prove that {a} = {y : y = a}.

15. * Show that, for any sets x and y, {x, y} = {z : (z = x ∨ z = y)}.

7.4

The Null Set

Sets are groups or collections of things. We determine their identity by looking at the things in them, their members. We know how to tell whether sets are the same or different; given some urelements or sets, we know how to form sets by constructing pairs or subsets whose elements satisfy open sentences. Given the abstraction axiom, we can prove that there is a set without any members. Since we want to construct a set with no elements, we need to specify a condition on membership in this set that nothing could fulfill. One such condition is expressed by the open formula 'z ≠ z'. We can require, in other words, that only objects that are not identical to themselves can belong to the set. Since there are no such things, nothing will be in the set.

Theorem 7.10 There is a set without members. ∃x(Sx & ∀y y ∉ x)

Proof: By the axiom of abstraction, there is a set containing just those elements of the universe of discourse that are not self-identical; namely, the set {x : x ≠ x}. Since everything is self-identical, nothing belongs to {x : x ≠ x}. So there is a set with no elements.

We'll call the set without members '∅'.

Definition 7.7 (The Null Set) ∅ = {x : x ≠ x}

Given our definition of ∅ and our rules for abstraction, we can show:

Theorem 7.11 S∅ & ∀y y ∉ ∅.

In proofs, this is the fact about the null set that will be most useful. That is, rather than adopt an official definition of the null set, we will usually use the symbol '∅' in steps of the form:

    S∅ & ∀y y ∉ ∅


So far we know that there is a set without members. Our use of the constant '∅' for the null set, however, needs further explanation; we need to know that there is only one set without members. We need to know, in other words, that the null set is unique. Luckily, we can prove that the null set is unique.

Theorem 7.12 ∅ is the only empty set. ∀x((Sx & ∀y y ∉ x) → x = ∅)

The proof uses the axiom of extensionality:

1.  Show ∀x((Sx & ∀y y ∉ x) → x = ∅)
2.    Show (Sa & ∀y y ∉ a) → a = ∅
3.      Sa & ∀y y ∉ a                         ACP
4.      Show ∀z(z ∈ a ↔ z ∈ ∅)
5.        Show b ∈ a ↔ b ∈ ∅
6.          Show b ∈ a → b ∈ ∅
7.            b ∈ a                           ACP
8.            ∀y y ∉ a                        &E, 3
9.            b ∉ a                           ∀E, 8
10.           ¬b ∈ a                          ∉, 9
11.           b ∈ ∅                           !, 7, 10
12.         Show b ∈ ∅ → b ∈ a
13.           b ∈ ∅                           ACP
14.           b ∉ ∅                           ∅
15.           ¬b ∈ ∅                          ∉, 14
16.           b ∈ a                           !, 13, 15
17.         b ∈ a ↔ b ∈ ∅                     ↔I, 6, 12
18.     Sa                                    &E, 3
19.     S∅                                    Th 7.10
20.     a = ∅                                 Ext, 4, 18, 19

In English: To show that any set without members must be identical to ∅, assume that a is a set without elements. Everything in a is in ∅, since nothing is in a; similarly, everything in ∅ is in a. So a and ∅ have the same members (namely, none); by the axiom of extensionality, therefore, they are identical.

Another very useful theorem about sets, which we add without proof (the proof is left as an exercise), is the following:

Corollary 7.1 ∀x((Sx & x ≠ ∅) → ∃y y ∈ x)

We have shown, then, that there is one and only one set without any elements. The null set, introduced by George Boole, has proved very useful in mathematics, much as the development of arithmetic with zero made possible tremendous advances in numerical computation.
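The uniqueness argument can be mirrored computationally. A small Python sketch: building "the" empty set from two different unsatisfiable conditions yields sets with the same (no) members, hence one and the same set.

```python
domain = {0, 1, 2, "a", "b"}

# The null set via an unsatisfiable condition: {x : x != x}.
# Nothing is distinct from itself, so nothing gets in.
null_set = {x for x in domain if x != x}

# A different unsatisfiable condition gives a set with the same
# (no) members; by extensionality there is only one empty set.
also_null = {x for x in domain if x in set()}

print(null_set == also_null == set())
```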


Problems

Prove these theorems, where a and b are sets.

1. The null set is a subset of every set. (!)
2. The null set is a superset only of itself.
3. {x ∈ a : x ≠ x} = {y ∈ b : y ≠ y}.
4. ∅ is not identical to the unit set of any set.
5. For any two sets x and y, if x is a subset of the unit set of y, then either x is the unit set of y, or x is the null set. In symbols, ∀x∀y(x ⊆ {y} ↔ (x = {y} ∨ x = ∅)).
6. There are no unicorns. So, {x ∈ a : x is a unicorn} = ∅.
7. No squares are round. So, {z ∈ a : z is a round square} = ∅.

7.5 Binary Unions, Intersections and Complements

You may be familiar from high school mathematics with the union of two or more sets. Graphically, the union of two circles x and y, construed as sets of points, is the area shaded in this diagram:

[Figure: two overlapping circles x and y, with the entire region covered by either circle shaded.]

The union of two sets is the set of all the things in one or the other. The union of several sets is similarly the set of things in at least one of these sets. We'll write the union of x and y as x ∪ y. We can define this set informally in the following way:

Definition 7.8 (Binary union) x ∪ y = {z : z ∈ x ∨ z ∈ y}

It then follows from the Union axiom that x ∪ y is a set. We can also prove the following fact about y ∪ z:

Theorem 7.13 ∀y∀z∀u(u ∈ y ∪ z ↔ (u ∈ y ∨ u ∈ z))

Proof:

1.  Show ∀y∀z∀u(u ∈ (y ∪ z) ↔ (u ∈ y ∨ u ∈ z))
2.    Show ∀z∀u(u ∈ (a ∪ z) ↔ (u ∈ a ∨ u ∈ z))
3.      Show ∀u(u ∈ (a ∪ b) ↔ (u ∈ a ∨ u ∈ b))
4.        Show c ∈ (a ∪ b) ↔ (c ∈ a ∨ c ∈ b)
5.          Show c ∈ (a ∪ b) → (c ∈ a ∨ c ∈ b)
6.            c ∈ (a ∪ b)                     ACP
7.            c ∈ a ∨ c ∈ b                   ∪ (Def. of Bin. Union), 6
8.          Show (c ∈ a ∨ c ∈ b) → c ∈ (a ∪ b)
9.            c ∈ a ∨ c ∈ b                   ACP
10.           c ∈ (a ∪ b)                     ∪ (Def. of Bin. Union), 9
11.         c ∈ (a ∪ b) ↔ (c ∈ a ∨ c ∈ b)     ↔I, 5, 8

Theorem 7.13 justifies us in adding the following simplified inference rule:

Union (∪)

    x ∈ y ∪ z
    ──────────────      ∪
    x ∈ y ∨ x ∈ z

    x ∈ y ∨ x ∈ z
    ──────────────      ∪
    x ∈ y ∪ z

Theorem 7.14 ∀x∀y S(x ∪ y)

Proof: obvious using the definition of union and Theorem 7.3. We can further generalize the binary function symbol ∪.

Definition 7.9 The union of y1...yi is the set containing all the members of y1 through yi. In symbols, y1 ∪ ... ∪ yi = {z : z ∈ y1 ∨ ... ∨ z ∈ yi}.

The corresponding rule of inference is:

Generalized Union (∪n)

    x ∈ y1 ∪ ... ∪ yi
    ─────────────────────      ∪n
    x ∈ y1 ∨ ... ∨ x ∈ yi
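The membership condition in Theorem 7.13 translates directly into executable form; a quick Python illustration, checking the biconditional pointwise over a few values:

```python
a, b = {1, 2}, {2, 3}

# Binary union, x ∪ y = {z : z ∈ x or z ∈ y}:
union = a | b

# Theorem 7.13: u ∈ a ∪ b  iff  u ∈ a or u ∈ b.
thm_7_13 = all((u in union) == (u in a or u in b) for u in range(5))
print(union, thm_7_13)
```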

We will also define the familiar binary function of intersection. Suppose that we were to diagram two sets, x and y, as ellipses:

[Figure: two overlapping ellipses x and y, with the region common to both shaded.]

The intersection of the ellipses is the area shaded above. It consists of the points that are in both regions. Analogously, the intersection of two sets is the set of things that are members of both sets. The shaded area thus represents the intersection of x and y, which we'll write as x ∩ y. If x is the set of red things, and y the set of houses, then x ∩ y is the set of things in both x and y: the set of red houses. Similarly, if Z is the set of integers, and R+ the set of positive real numbers, then Z ∩ R+ is the set of positive integers. We will define the two-place function, intersection, in the following way:

Definition 7.10 The intersection of x and y is the set of all members common to x and y. In symbols, ∀x∀y x ∩ y = {z : z ∈ x & z ∈ y}

The intersection of two sets x and y is just the set of all members of x that are also in y: {z : z ∈ x & z ∈ y}. The abstraction principle justifies the construction of a binary intersection as a set. We will prove that binary intersections are sets using Theorem 7.3. We will do this in several steps. The first will be to show that there is a set of all those objects in both x and y. We can then use abstraction to conclude that the set of all those things in x and y is a set, and then we use the definition of binary intersection to conclude that x ∩ y is a set.

Theorem 7.15 ∀x∀y S(x ∩ y)

Proof: Use Theorem 7.3 and the definition of binary intersection.

Theorem 7.16 ∀x∀y∀u(u ∈ x ∩ y ↔ (u ∈ x & u ∈ y))

Proof:

1.  Show ∀x∀y∀u(u ∈ (x ∩ y) ↔ (u ∈ x & u ∈ y))
2.    Show ∀y∀u(u ∈ (a ∩ y) ↔ (u ∈ a & u ∈ y))
3.      Show ∀u(u ∈ (a ∩ b) ↔ (u ∈ a & u ∈ b))
4.        Show c ∈ (a ∩ b) ↔ (c ∈ a & c ∈ b)
5.          Show c ∈ (a ∩ b) → (c ∈ a & c ∈ b)
6.            c ∈ (a ∩ b)                     ACP
7.            c ∈ a & c ∈ b                   ∩ (Binary Intersection), 6
8.          Show (c ∈ a & c ∈ b) → c ∈ (a ∩ b)
9.            c ∈ a & c ∈ b                   ACP
10.           c ∈ (a ∩ b)                     ∩ (Binary Intersection), 9
11.         c ∈ (a ∩ b) ↔ (c ∈ a & c ∈ b)     ↔I, 5, 8

Using Theorem 7.16, we can derive the following rule of inference:

Binary Intersection (∩)

    z ∈ x ∩ y
    ──────────────      ∩
    z ∈ x & z ∈ y

    z ∈ x & z ∈ y
    ──────────────      ∩
    z ∈ x ∩ y

The intersection of two sets contains those elements the two sets have in common. It's natural to generalize this idea to more than two sets. The intersection of any number of sets is the set of elements all these sets have in common.

Definition 7.11 The intersection of y1...yi is the set of all members of each of y1...yi. In symbols, y1 ∩ ... ∩ yi = {z ∈ y1 : z ∈ y2 & ... & z ∈ yi}

Again we have the equivalent rule of inference:

Generalized Intersection (∩n)

    x ∈ y1 ∩ ... ∩ yi
    ─────────────────────      ∩n
    x ∈ y1 & ... & x ∈ yi
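As with union, the membership condition of Theorem 7.16 can be spot-checked in Python:

```python
a, b = {1, 2, 3}, {2, 3, 4}

# Binary intersection, x ∩ y = {z : z ∈ x and z ∈ y}:
inter = a & b

# Theorem 7.16: u ∈ a ∩ b  iff  u ∈ a and u ∈ b.
thm_7_16 = all((u in inter) == (u in a and u in b) for u in range(6))
print(inter, thm_7_16)
```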

Sometimes it's useful to speak of the objects belonging to one set but not to another. We might, for example, want to talk about the cities with populations over 500,000 that have no National Football League franchises, or the states that have no fewer than 25 electoral votes. In such situations we can speak of the difference between two sets. The difference between sets y and z (also called the complement of z in, or relative to, y) is the set of things that are in y but not in z. The usual symbol for this difference is 'y − z'.


Definition 7.12 The relative complement of z in y is the set of members of y which are not also members of z. In symbols, y − z = {x : x ∈ y & x ∉ z}

Using the definition of complement and Theorem 7.3, we can derive:

Theorem 7.17 ∀x∀y S(x − y)

From the definition of relative complement and abstraction, we also derive the following rule of inference:

Def. of Complement (−)

    x ∈ y − z
    ──────────────      Def −
    x ∈ y & x ∉ z

    x ∈ y & x ∉ z
    ──────────────      Def −
    x ∈ y − z
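The Def − rule corresponds exactly to Python's set difference; a short check of the membership condition in both directions:

```python
y, z = {1, 2, 3, 4}, {3, 4, 5}

# Relative complement, y − z = {x : x ∈ y and x ∉ z}:
diff = y - z

# The Def − rule, checked pointwise: x ∈ y − z  iff  x ∈ y and x ∉ z.
rule = all((x in diff) == (x in y and x not in z) for x in range(7))
print(diff, rule)
```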

We have now shown that there are many ways of constructing new sets from old ones. At this point it seems appropriate to introduce some properties of sets and relations among sets used throughout mathematics. Two sets that have no members in common are said to be disjoint.

Definition 7.13 x and y are disjoint iff their intersection is empty. Or, ∀x∀y(x and y are disjoint ↔ x ∩ y = ∅)

Disjointness (djt)

    x and y are disjoint
    ────────────────────      djt
    x ∩ y = ∅

    x ∩ y = ∅
    ────────────────────      djt
    x and y are disjoint

The set R+, for example, is clearly disjoint from the set of all negative reals, R−. The set of red houses is disjoint from the set of tricycles. And the set of symphonies and the set of ducks are disjoint. This notion, too, can be generalized to any number of sets. The obvious generalization, however, turns out to be not very fruitful. A more useful concept is this. We'll say that any set of sets is pairwise disjoint iff any two sets in the set are disjoint in the sense we've already defined.

Definition 7.14 A set x is pairwise disjoint iff any two members of x are disjoint. Or, ∀x(x is pairwise disjoint ↔ ∀y∀z((y ∈ x & z ∈ x & y ≠ z) → y and z are disjoint))

Pairwise Disjointness (PD)

    x is pairwise disjoint
    ──────────────────────────────────────────────────────      PD
    ∀y∀z((y ∈ x & z ∈ x & y ≠ z) → y and z are disjoint)

    ∀y∀z((y ∈ x & z ∈ x & y ≠ z) → y and z are disjoint)
    ──────────────────────────────────────────────────────      PD
    x is pairwise disjoint

To take an example, consider a set of several clubs, construing each club as a set of its members. The set is pairwise disjoint if any two of the clubs are disjoint. This will hold just in case nobody belongs to more than one club in the set. In such a circumstance, the intersection of the set would be empty. It's easy to see that any pairwise disjoint set of sets has an empty intersection, if it contains at least two sets.
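The club example can be sketched in Python (the membership lists are made up for illustration); `pairwise_disjoint` checks every pair of distinct members, exactly as Definition 7.14 requires:

```python
from itertools import combinations

def disjoint(x, y):
    # Definition 7.13: x and y are disjoint iff x ∩ y = ∅.
    return x & y == set()

def pairwise_disjoint(family):
    # Definition 7.14: any two distinct members of the family are disjoint.
    return all(disjoint(x, y) for x, y in combinations(family, 2))

# Clubs construed as sets of their members (membership lists are made up).
clubs = [{"ann", "bo"}, {"cy", "dee"}, {"ed"}]
overlapping = [{"ann", "bo"}, {"bo", "cy"}]
print(pairwise_disjoint(clubs), pairwise_disjoint(overlapping))
```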

Problems

Prove the following theorems. Assume throughout that a, b, c and d are all sets.

1. a ∪ ∅ = a. (!)
2. a ∪ a = a. (!)
3. a ∪ b = b ∪ a. (!)
4. a ∪ (b ∪ c) = (a ∪ b) ∪ c. (!)
5. Suppose that a and b are both subsets of c. Then a ∪ b ⊆ c. (!)
6. a ⊆ a ∪ b.
7. Suppose that a ⊆ b. Then, for any set x, a ∪ x ⊆ b ∪ x. (!)
8. Suppose that a ⊆ c, and b ⊆ d. Then a ∪ b ⊆ c ∪ d.
9. ∀x(x ⊆ a → x = a − (a − x)).
10. Suppose a ≠ ∅. Then ¬∃x x = a − x.
11. The null set and any set are disjoint. (!)
12. a − b ⊆ a.
13. a ∩ a = a. (!)
14. a ∩ b = b ∩ a. (!)
15. a ∩ (b ∩ c) = (a ∩ b) ∩ c. (!)
16. Suppose that a ⊆ c. Then a − b = a ∩ (c − b).
17. Suppose that a is a subset of both b and c. Then a ⊆ b ∩ c.
18. (a ⊆ c & a and b are disjoint) ↔ a ⊆ (c − b).
19. a − b = a − (a ∩ b).
20. a − ∅ = a. (!)
21. a − a = ∅. (!)

22. Suppose that a

5.    Show e ∈ {c} × b → e ∈ a × b
6.      e ∈ {c} × b                           ACP
7.      ∃u∃v(e = (u, v) & u ∈ {c} & v ∈ b)    ×, 6
8.      e = (f, g) & f ∈ {c} & g ∈ b          ∃E
9.      f = c                                 Unit, 8
10.     e = (c, g)                            =E, 8, 9
11.     (c, g) ∈ a × b                        ×*, 3, 8
12.     e ∈ a × b                             =E, 10, 11
13.   {c} × b ⊆ a × b                         ⊆, 4, Th. ??
14.   {c} × b ∈ P(a × b)                      P, 13

[3.]
1.  a × b = a × c
2.  a ≠ ∅
3.  Sa & Sb & Sc
4.  Show b = c
5.    ∃y y ∈ a                                Cor. 7.1, 2, 3
6.    e ∈ a                                   ∃E, 5
7.    Show ∀x(x ∈ b ↔ x ∈ c)
8.      Show d ∈ b ↔ d ∈ c
9.        Show →
10.         d ∈ b
11.         (e, d) ∈ a × b                    ×*, 6, 10
12.         (e, d) ∈ a × c                    =E, 1, 11
13.         e ∈ a & d ∈ c                     ×, 12
14.         d ∈ c                             &E, 13
15.       Show ←                              Similar
16.   b = c                                   Ext, 3, 7, 7
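The theorem just proved (a × b = a × c with a ≠ ∅ forces b = c) can be spot-checked computationally; a minimal Python sketch of the Cartesian product as a set of ordered pairs:

```python
from itertools import product

def cart(x, y):
    # x × y as the set of all ordered pairs (u, v) with u ∈ x, v ∈ y.
    return set(product(x, y))

a, b, c = {1, 2}, {"p", "q"}, {"p", "q"}

# With a ≠ ∅, a × b = a × c holds exactly when b = c;
# here we just spot-check one instance of the theorem.
instance = (cart(a, b) == cart(a, c)) == (b == c)
print(cart({0}, {0, 1}), instance)
```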

Problems

What is A × B, given these sets as A and B?

      A             B
1.    {0}           {1}
2.    {0}           {0, 1}
3.    {0, 1}        {1}
4.    {0, 1}        {0, 1}
5.    {1}           {0, 1, 2}
6.    {0, 1, 2}     {1}
7.    {0, 1}        ∅
8.    {0, 1, 2}     {1, 2}
9.    {0, 1, 2}     ∅
10.   {0}           {0, 1, 2}

Prove the following theorems.

11. For any x, (x, x) = {{x}}.
12. ∀x∀y((Sx & Sy) → (x × y = ∅ → (x = ∅ ∨ y = ∅))).
13. ∀x(x × ∅ = ∅). (!)
14. ∀x∀y((x × y = y × x & x ≠ ∅ & y ≠ ∅) → x = y). (!)
15. ∀x∀y∀z((x × y = x × z & x ≠ ∅) → y = z).
16. ∀x∀y∀z((Sx & Sy & Sz) → ((x ≠ ∅ & x × y ⊆ x × z) → y ⊆ z)).

R is reflexive on A iff ∀x(x ∈ A → Rxx).

On the set of people, "is", "is exactly as tall as", "is at least as tall as", and "has the same birthday as" are reflexive. Specifying a set A is important because a relation may be reflexive on some sets but not on others. Consider, for example, the relation "analyzes" on the set of people. Presumably, some people analyze themselves, while others don't, so the relation isn't reflexive on that set. But, if all psychoanalysts analyze themselves, then that relation is reflexive on the set of psychoanalysts. Similarly, "is the absolute value of" is not reflexive on the set of integers (|−4| ≠ −4) but it is reflexive on the set of positive integers. Irreflexive relations never hold between an object and itself. No number, for example, is greater than, or less than, itself. So < and > (as well as ≠) are irreflexive.

Definition 8.11 R is irreflexive on A iff ∀x(x ∈ A → ¬Rxx).

On the set of people, "is taller than", "is shorter than", "is a child of", "is a parent of", and "isn't the same person as" are all irreflexive. Obviously, not all relations are either reflexive or irreflexive. Reflexive relations always hold between an object and itself, while irreflexive relations never do. Many relations, however, sometimes hold between an object and itself and sometimes don't. "Respect", for example, is neither reflexive nor irreflexive: some people respect themselves, and some don't. The relation of being a square of, among numbers, is similarly neither reflexive nor irreflexive: 0² = 0 and 1² = 1, but it's not true in general that n² = n. A relation is symmetric if, whenever it holds in one direction, it also holds in the other. Identity is clearly symmetric: if x = y, then y = x. Other relations on numbers that are symmetric are ≠, "have a common divisor", "have the same prime factors" and "have the same absolute value".

Definition 8.12 R is symmetric on A iff ∀x∀y(x, y ∈ A → (Rxy → Ryx)).


Among people, "is related to", "is a sibling of", "is a friend of", "has the same birthday as" and "is exactly as tall as" are symmetric. An asymmetric relation, in contrast, holds in at most one direction between any two objects. If it holds in one direction, then it doesn't hold in the other. Both < and >, clearly, are asymmetric relations.

Definition 8.13 R is asymmetric on A iff ∀x∀y(x, y ∈ A → (Rxy → ¬Ryx)).

Among people, "is taller than", "is shorter than", "is a child of", and "is a parent of" are all asymmetric. Antisymmetric relations hold in both directions between objects only if those objects are the same. To put it differently, they never hold in both directions between two distinct objects. Among numbers ≤ and ≥ are antisymmetric: there are numbers n and m such that n ≤ m and m ≤ n, but only when n = m.

Definition 8.14 R is antisymmetric on A iff ∀x∀y(x, y ∈ A → ((Rxy & Ryx) → x = y)).

Again, not all relations are either symmetric, asymmetric or antisymmetric. "Love", for example, isn't symmetric: there are cases of unrequited love. But it's not asymmetric, since some love is not unrequited. "Love" isn't antisymmetric either, for there are cases of mutual love that aren't just cases of self-love. A relation is transitive if, whenever it holds between x and y and also between y and z, it holds between x and z. Among numbers, <, ≤, "divides" and = are all transitive.

Definition 8.15 R is transitive on A iff ∀x∀y∀z(x, y, z ∈ A → ((Rxy & Ryz) → Rxz)).

Among people, "is at least as tall as", "is exactly as tall as", and "has the same birthday as" are transitive. Whenever an intransitive relation holds between x and y and y and z, it does not hold between x and z. "Is one greater than" is intransitive: if n = m + 1, and m = k + 1, then certainly n ≠ k + 1.

Definition 8.16 R is intransitive on A iff ∀x∀y∀z(x, y, z ∈ A → ((Rxy & Ryz) → ¬Rxz)).

Among people, "is a year older than", "is ten pounds heavier than", and "is four inches taller than" are all intransitive. Many relations are neither transitive nor intransitive. "Have a common divisor", for example, doesn't always hold between x and z if it does between x and y and y and z. 4 and 12 have a common divisor, for example, and so do 12 and 3. Yet 4 and 3 have no common divisor. But it does hold between x and z sometimes: 4 and 12 have a common divisor, as do 12 and 16. And in this case 4 and 16 also have a divisor in common. A relation is connected on a set if it holds, in one direction or another, between any two distinct objects in the set. Among numbers, <, >, ≤, ≥ and ≠ are all connected. If n and m are distinct numbers, then one must be larger than the other; either n > m or m > n. Similarly, if n and m are distinct, then n ≠ m.

Definition 8.17 R is connected on A iff ∀x∀y(x, y ∈ A → (x ≠ y → (Rxy ∨ Ryx))).
This is sometimes called the trichotomy property, because it says that, for any x and y, one of three things must hold: Rxy, Ryx or x = y. A relation is strongly connected on a set if it holds, in one direction or another, between any two objects in the set. This is the same as connectedness, except that the objects no longer have to be distinct. The definition drops the x ≠ y clause:

Definition 8.18 R is strongly connected on A iff ∀x∀y(x, y ∈ A → (Rxy ∨ Ryx)).

This can be called the dichotomy property, because it says that, for any x and y, either Rxy or Ryx. Although ≠, <, >, ≤, and ≥ are all connected, only ≤ and ≥ are strongly connected. No matter what n and m are, it must be the case that either n ≤ m or m ≤ n. But it isn't necessarily true that n < m or m < n; n and m might be equal. Notice that strongly connected relations are reflexive. For any x and y in A, Rxy or Ryx. If we choose the same object for both x and y (say b), then Rbb or Rbb. So strongly connected relations must hold between each object and itself. An example:

1.  R is irreflexive on A
2.  Show (A × A) − R is reflexive on A
3.    ∀x(x ∈ A → ¬Rxx)                        Irrefl., 1
4.    Show ∀x(x ∈ A → (x, x) ∈ (A × A) − R)
5.      Show b ∈ A → (b, b) ∈ (A × A) − R
6.        b ∈ A                               ACP
7.        ¬Rbb                                ∀E, 3, 6
8.        (b, b) ∈ A × A                      ×, 6
9.        (b, b) ∉ R                          NI, 7
10.       (b, b) ∈ (A × A) − R                Def −, 8, 9
11.   (A × A) − R is reflexive on A           Refl., 4
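For finite relations, all the properties of this section become directly checkable. A Python sketch, treating a relation on A as a set of ordered pairs:

```python
def reflexive(R, A):
    return all((x, x) in R for x in A)

def irreflexive(R, A):
    return all((x, x) not in R for x in A)

def symmetric(R, A):
    return all((y, x) in R for (x, y) in R)

def transitive(R, A):
    return all((x, w) in R for (x, y) in R for (z, w) in R if y == z)

A = {1, 2, 3}
LE = {(x, y) for x in A for y in A if x <= y}   # ≤ on A
LT = {(x, y) for x in A for y in A if x < y}    # < on A
print(reflexive(LE, A), transitive(LE, A), irreflexive(LT, A))
```

As the text says, ≤ is reflexive and transitive but not symmetric, while < is irreflexive.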

Problems

Which of the properties this section has discussed apply to these relations?

1. ∈ (membership)
2. ⊆ (subset)
3. ⊂ (proper subset)
4. are disjoint
5. < (on N)
6. > (on Z)
7. ≤ (on N)
8. ≥ (on Z)
9. = (on N)
10. ≠ (on Z)

11. divides (on N)
12. is a factor of (on N)
13. is a prime factor of (on N)
14. is a multiple of (on N)
15. is the power set of
16. is the intersection of
17. is the union of
18. is the complement of
19. is the absolute value of (on Z)
20. is the square of (on N)
21. is a square root of (on N)
22. is the square of (on Z)
23. is a square root of (on Z)
24. is the square of (on the reals)
25. is a square root of (on the reals)
26. is a square root of (on the complex numbers)
27. implies (on the set of formulas)
28. is equivalent to (on the set of formulas)
29. are mutually satisfiable (on the set of formulas)
30. contradicts (on the set of formulas)

Which properties do these relations have on the set of people?

31. is the father of
32. is a parent of
33. is a child of
34. is a daughter of
35. is a sibling of
36. is a brother of
37. is a cousin of
38. is a descendant of
39. is related to


Give an example of a relation on a set that is:

40. both reflexive and irreflexive
41. symmetric and antisymmetric
42. reflexive, symmetric and transitive
43. reflexive, antisymmetric and transitive
44. irreflexive, asymmetric and transitive
45. reflexive and symmetric, but not transitive
46. reflexive and antisymmetric, but not transitive
47. irreflexive, symmetric, but not transitive
48. symmetric and intransitive
49. antisymmetric and intransitive
50. asymmetric and intransitive

Prove these theorems. Let R and S be relations on sets A and B, respectively.

51. If R is intransitive on A, then R is irreflexive on A.
52. If R is symmetric and transitive on A, then R is reflexive on F(R).
53. If R is strongly connected on A, then R is reflexive on A.
54. If R is strongly connected on A, then R is dense on A.
55. If R is asymmetric on A, then R is antisymmetric on A.
56. If R is asymmetric on A, then R is irreflexive on A.
57. If R is transitive and irreflexive on A, then R is asymmetric on A.
58. If R and S are reflexive on A and B, respectively, then R ∪ S is reflexive on A ∪ B.
59. If R is irreflexive on A and T

Pairs of relations like < and >, or ≤ and ≥, or "parent" and "child" go together naturally. If n < m, then m > n. Similarly, if Hank is Alice's child, then Alice is Hank's parent. In short, whenever one relation holds in one direction, the other relation holds in the other. Such relations are converses.

Definition 8.24 The converse of R, R⁻¹, is the collection of all ordered pairs (y, x) such that (x, y) ∈ R. Symbolically,

R⁻¹ = {(y, x) : (x, y) ∈ R}

Conversion (Conv)

    (x, y) ∈ R
    ──────────────      Conv
    (y, x) ∈ R⁻¹

    (y, x) ∈ R⁻¹
    ──────────────      Conv
    (x, y) ∈ R

Every ordered pair in a relation is in its converse, but with the constituents switched. That is, all first constituents become second constituents, and vice versa. Thus, if (John, Bill) ∈ "kick", then (Bill, John) ∈ "kick"⁻¹, i.e., "be kicked".

Theorem 8.8 For all R, 1. V(R) = R(R⁻¹), 2. R(R) = V(R⁻¹), and 3. if R

1.  R and S are reflexive on A                Assumption
2.  Show R ∘ S is reflexive on A
3.    Show ∀x(x ∈ A → (x, x) ∈ R ∘ S)
4.      Show b ∈ A → (b, b) ∈ R ∘ S
5.        b ∈ A                               ACP
6.        Rbb                                 Refl, 1, 5
7.        Sbb                                 Refl, 1, 5
8.        ∃x(Sbx & Rxb)                       &I, 6, 7, ∃I
9.        (b, b) ∈ R ∘ S                      Comp, 8

1.  R and S are strongly connected on A       Assumption
2.  Show R ∘ S is strongly connected on A
3.    Show ∀x∀y(x, y ∈ A → (R ∘ Sxy ∨ R ∘ Syx))
4.      Show b, c ∈ A → (R ∘ Sbc ∨ R ∘ Scb)
5.        b, c ∈ A                            ACP
6.        Sbb                                 St. Conn, 1, 5, SL
7.        Scc                                 St. Conn, 1, 5, SL
8.        Rbc ∨ Rcb                           St. Conn, 1, 5
9.        Case 1: Rbc
10.         R ∘ Sbc                           &I, 6, 9, ∃I, Comp
11.         R ∘ Sbc ∨ R ∘ Scb                 ∨I, 10
12.       Case 2: Rcb
13.         R ∘ Scb                           &I, 7, 12, ∃I, Comp
14.         R ∘ Sbc ∨ R ∘ Scb                 ∨I, 13
15.       R ∘ Sbc ∨ R ∘ Scb                   ∨E, 8, 11, 14
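Converse and composition are easy to compute for finite relations. A Python sketch; `compose` follows the book's Comp convention, under which (x, z) ∈ R ∘ S iff for some y, (x, y) ∈ S and (y, z) ∈ R:

```python
def converse(R):
    # R⁻¹ = {(y, x) : (x, y) ∈ R}
    return {(y, x) for (x, y) in R}

def compose(R, S):
    # (x, z) ∈ R ∘ S iff for some y, (x, y) ∈ S and (y, z) ∈ R.
    return {(x, z) for (x, y) in S for (w, z) in R if y == w}

LT = {(1, 2), (1, 3), (2, 3)}   # < on {1, 2, 3}
print(converse(LT))                    # > on {1, 2, 3}
print(converse(converse(LT)) == LT)    # theorem 37: (R⁻¹)⁻¹ = R
print(compose(LT, LT))                 # {(1, 3)}: "less than by at least 2"
```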

Problems

What is the converse of each of these relations?

1. <
2. ≤
3. ≥
4. >
5. =
6. ≠
7. ∈
8. ⊆
9. ⊂
10. divides
11. is a multiple of
12. is absolute value of (on Z)
13. is the square of (on N)
14. is the square of (on Z)
15. is taller than
16. is at least as tall as
17. is exactly as tall as
18. has the same birthday as
19. is parent of
20. is child of
21. is grandparent of
22. is uncle of
23. is cousin of
24. is related to
25. hates

Prove the following theorems. Assume that R, S, T and V are binary relations on the set A.

26. The converse of R is reflexive on A if R is.

27. The converse of R is irreflexive on A if R is.

(!)

28. The converse of R is symmetric on A if R is.

29. The converse of R is asymmetric on A if R is.

30. The converse of R is antisymmetric on A if R is.

3 1 . The converse of R is transitive on A if R is.

32. The converse of R is connected on A if R is.

33. The converse of R is strongly connected on A if R is.

34. The converse of R is intransitive on A if R is.

35. R ∘ S is reflexive on A if both R and S are.

36. R ∘ S is strongly connected on A if both R and S are.

37. The converse of the converse of R is R.

38. (R ∪ S)⁻¹ = R⁻¹ ∪ S⁻¹.

39. (R ∩ S)⁻¹ = R⁻¹ ∩ S⁻¹.

40. (R − S)⁻¹ = R⁻¹ − S⁻¹.

41. R ∘ ∅ = ∅.

42. 𝒟(S ∘ R) ⊆ 𝒟(R).

43. (S ∘ R)⁻¹ = R⁻¹ ∘ S⁻¹.

44. T ∘ (S ∘ R) = (T ∘ S) ∘ R.

45. (R …


What is the composition of each of these pairs of relations?

74. ∈, ⊂
75. ∈, ∈
76. ⊂, ⊂
77. ⊂, ∈
78. parent, brother
79. brother, parent
80. child, parent
81. child, child
82. brother, child
83. child, brother
84. mother, spouse
85. spouse, mother
86. spouse, brother
87. brother, spouse
88. spouse, spouse

8.7 Restrictions and Images

The < relation is a strict linear ordering of the real numbers. Nevertheless, we've often spoken in this chapter of the properties of < on the natural numbers or the integers. These properties, as we've observed, can differ; < is dense on the real line but not on the integers. Nevertheless, < on the integers is simply a subset of < on the reals. To obtain < on Z from < on the real numbers, in fact, we have only to eliminate all the ordered pairs containing any nonintegral reals. In short, we restrict both the domain and the range of the relation to the set of integers. We can, and often do, restrict other kinds of relations. The "parent" and "child" examples in this chapter have dealt, implicitly or explicitly, with the domain of people. But procreation far outreaches humanity. The relation makes perfectly good sense when applied to cats, dogs, and rabbits. The parent-child relation among humans, furthermore, is a subset of the broader parent-child relation linking any animals biologically connected in the appropriate way. Formally, we can say that the relation among humans is the restriction of the broader relation to humanity. In general, we restrict a relation to a set by omitting all ordered pairs in the relation whose first constituents fall outside the set. The restriction of S to a set A is written "S|A".

Definition 8.26 S|A

= {(x, y) : x ∈ A & (x, y) ∈ S}.


(Restriction)    S|Axy ↔ Sxy & x ∈ A
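On finite relations, Definition 8.26 is a single comprehension. A brief Python sketch (the restrict name and the sample data are illustrative assumptions):

```python
def restrict(S, A):
    """Definition 8.26: S|A keeps only pairs whose first constituent is in A."""
    return {(x, y) for (x, y) in S if x in A}

parent = {("Alice", "Hank"), ("Rex", "Fido")}   # people and dogs together
humans = {"Alice", "Hank"}
assert restrict(parent, humans) == {("Alice", "Hank")}
```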

The restriction of the parent-child relation to the set of humans has the effect we want, since people never have other kinds of animals as children. In the case of …

Definition A function f: A → B is bijective (or a one-to-one correspondence) iff f is both injective and surjective (onto B).

Bijective functions are called one-to-one correspondences because they establish a correspondence between members of A and members of B . Each element of A is mapped into one and only one object in B; every object in B is a value of the function for one and only one object in A.

The doubling function on the reals is bijective; it correlates each number n with one and only one number, 2n. Each number n is also the value of the function for exactly one input, namely, n/2. This is a special case, since the set being mapped and the set being mapped into are identical. A bijection f: A → A is called a permutation. Equivalently, a function f: A → B is a permutation iff it is bijective and A = B. The doubling function is a permutation of the real numbers.
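For finite domains, injectivity, surjectivity, and bijectivity can be checked by brute force. A Python sketch (the function names are illustrative assumptions, not from the text):

```python
def is_injective(f, A):
    # no two distinct arguments share a value
    return len({f(x) for x in A}) == len(set(A))

def is_surjective(f, A, B):
    # every member of B is a value of the function
    return {f(x) for x in A} == set(B)

def is_bijective(f, A, B):
    return is_injective(f, A) and is_surjective(f, A, B)

A = range(5)
double = lambda x: 2 * x
assert is_injective(double, A)
assert not is_surjective(double, A, range(10))      # odd numbers are missed
assert is_bijective(lambda x: 4 - x, A, A)          # a permutation of A
```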

The following are useful derivable rules.

Inj*
f: A → B is injective
c, d ∈ A
c ≠ d
――――――――――――――
f(c) ≠ f(d)

Surj*
f: A → B is surjective
c ∈ B
――――――――――――――
∃x(f(x) = c & x ∈ A)

Func Comp
f: A → B
g: B → C
x ∈ A
――――――――――――――
g ∘ f(x) = g(f(x))

Biject Conv
f: A → B is bijective
――――――――――――――
f⁻¹: B → A is bijective

Inj Conv
f: A → B is injective
c ∈ A
――――――――――――――
f⁻¹(f(c)) = c & f⁻¹: R(f) → A is bijective

Proof of Biject Conv:

1. f: A → B is bijective
2. Show f⁻¹: B → A is bijective
3. f: A → B is injective
4. f⁻¹: R(f) → A is bijective            Func Conv
5. f: A → B is surjective
6. B ⊆ R(f)
7. B = R(f)                              Def. func, Th1, 6
8. f⁻¹: B → A is bijective               =E, 4, 7

Problems

Say whether these functions f: A → B are injective, surjective, bijective, or none of the above, for the indicated values of A and B.

      f(x)                     A          B
 1.   x                        N          N
 2.   x + 1                    N          N
 3.   x − 1                    Z          Z
 4.   |x|                      Z          Z
 5.   |x|                      Z          N
 6.   x²                       N          N
 7.   x²                       Z          N
 8.   x²                       R          R
 9.   √x                       N          R    (assuming √x ≥ 0)
10.   √x                       R⁺         R    (assuming √x ≥ 0)
11.   2x                       N          N
12.   2x                       Z          Z
13.   2x                       R          R
14.   xˣ                       Z⁺         N
15.   xˣ                       Z⁺         Z
16.   √x                       N          R    (assuming even roots ≥ 0)
17.   x² + 2x + 1              N          N
18.   x² + 2x + 1              R          R
19.   x!                       N          N
20.   ln(x)                    Z⁺         R
21.   x/2                      R          R
22.   −x                       Z          Z
23.   −x                       N          Z
24.   1/x                      R − {0}    R
25.   sin x                    R          R
26.   x, rounded to the        R          Z
      nearest integer

Prove the following theorems.

27. If f: A → B is surjective, then 𝒟(f⁻¹) = B.

28. If f: A → B is injective, then f⁻¹ is a function from R(f) to A.

29. Let f: A → B. Then f⁻¹ is a function from B to A iff f is bijective.

30. If f: A → B is a bijection, then f⁻¹: B → A is also a bijection.

31. If f: A → B and g: C → D are injective functions and f ∩ g: A ∩ C → B ∩ D, then f ∩ g: A ∩ C → B ∩ D is also injective.


32. Let R be a strict linear order on A. Let f: A → A be such that ∀x∀y(x, y ∈ A → (Rxy → Rf(x)f(y))). Then f is injective.

33. Say that a function f: R → R is strictly increasing iff ∀x∀y(x < y → f(x) < f(y)). Then all strictly increasing functions on the reals are injective.

34. Where < is a strict linear order on A, let f: A → A be such that ∀x∀y(x, y ∈ A → (x < y → f(x) < f(y))). Then ∀x∀y(x, y ∈ A → (f(x) < f(y) → x < y)).

35. Let f: A → A, f⁻¹: A → A be bijections. Suppose that, for some x ∈ A, f(x) ≠ f⁻¹(x). Then, for some x ∈ A, f(f(x)) ≠ x.

36. If f: A → A and g: A → A are both injective, then f ∘ g is injective.

*12. Let f: A → B, g: B → P(A), and g(x) = {y ∈ A : f(y) = x}. Then, if f is bijective, g is an injection.

*13. If f: A → B and g: C → D are injective functions with disjoint domains and disjoint ranges, then f ∪ g: A ∪ C → B ∪ D is also an injective function.

***14. Let f: P(A) → P(A) be such that, for any x and y, x ⊆ y → …

…

Weak Induction on N
  Show ∀xA
    Show A[0/x]
    Assume A[n/x]
    Show A[n + 1/x]

To take a simple example of such a proof, consider the successive sums of the natural numbers:

n:     0   1   2   3   4    5    6    7    8    ...   n
sum:   0   1   3   6   10   15   21   28   36   ...   n(n + 1)/2

The numbers 1, 3, 6, 10, 15, etc., Maurolycus called triangular numbers. We can state the theorem that this table suggests, then, as either

  For any n, the sum of the first n + 1 natural numbers (i.e., of the first n positive integers) is n(n + 1)/2

or as

  For any n, the nth triangular number is n(n + 1)/2.

In modern notation, we can also express the theorem as

  Σᵢ₌₀ⁿ i = n(n + 1)/2

To construct a proof by mathematical induction, we can begin by showing that the theorem holds of 0.

Basis: let n = 0. The sum of the first natural number, 0, is just 0; furthermore, 0(0 + 1)/2 = 0. So the sum is 0(0 + 1)/2.

Inductive step: assume for conditional proof that the theorem holds for n. We must show that the theorem holds of n + 1, i.e., that

  Σᵢ₌₀ⁿ⁺¹ i = (n + 1)((n + 1) + 1)/2

That is, we have to show that the (n + 1)st triangular number is (n + 1)(n + 2)/2. Now the (n + 1)st triangular number is just the nth triangular number plus n + 1:

  Σᵢ₌₀ⁿ⁺¹ i = Σᵢ₌₀ⁿ i + (n + 1).

But, by the assumption that the theorem holds for n (called the inductive hypothesis), we can conclude that the (n + 1)st triangular number is n(n + 1)/2 + (n + 1). By algebra, this is (n² + n + 2n + 2)/2, i.e., (n² + 3n + 2)/2. But this is just (n + 1)(n + 2)/2, which is what we need to show.

For another example, consider this theorem of Maurolycus: "Every integer plus the preceding integer equals the collateral odd number." In other words, the nth odd number is equal to n + (n − 1). It's easy to see that this holds of the first several odd numbers:

n:     0   1   2   3   4   5   6    7    8    ...   n
odd:   0   1   3   5   7   9   11   13   15   ...   n + (n − 1)
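Both identities can be spot-checked computationally before being proved. A small Python check (assuming, as in the table, that O₁ = 1 is the first odd number):

```python
def triangular(n):
    """Sum of the first n + 1 natural numbers, 0 + 1 + ... + n."""
    return sum(range(n + 1))

for n in range(100):
    assert triangular(n) == n * (n + 1) // 2      # the triangular-number theorem

def odd(n):
    """The nth odd number: 1, 3, 5, ..."""
    return 2 * n - 1

for n in range(1, 100):
    assert odd(n) == n + (n - 1)                  # Maurolycus's theorem
```

Such a check is not a proof, of course; it only confirms the pattern the induction will establish for all n.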

Notice that each odd number is two greater than the preceding odd number. To prove the theorem, we can use mathematical induction. Say that Oₙ is the nth odd number.

Basis: we need to show that the first odd number is equal to n + (n − 1). The first odd number, O₁, is 1; furthermore, 1 + (1 − 1) = 1. So the theorem holds of the first odd number.

Inductive step: assume that the theorem holds of n: the nth odd number is equal to n + (n − 1). We need to show that the theorem holds of n + 1, so we need to prove that the (n + 1)st odd number is equal to (n + 1) + ((n + 1) − 1), i.e., 2n + 1. In general, Oₙ₊₁ = Oₙ + 2. By the inductive hypothesis, Oₙ = n + (n − 1), i.e., 2n − 1. Thus Oₙ₊₁ = 2n − 1 + 2, so Oₙ₊₁ = 2n + 1.

As another example, I will prove the so-called quotient theorem:

Quotient Theorem. Let x, y ∈ N, y ≤ x, x ≠ 0, y ≠ 0. Then there exist q, r ∈ N such that q ≠ 0, x = q · y + r, and r < y.

Proof. By strong induction. Suppose that the theorem holds for every number up to k; we show that it holds for k + 1. Let c ≤ k + 1, c ≠ 0. There are two cases: c ≤ k and c > k.

Case 1. c ≤ k, and c ≠ 0. By inductive hypothesis, there exist m and n such that m ≠ 0, k = m · c + n, and n < c. Then k + 1 = m · c + (n + 1). There are two subcases: n + 1 < c and n + 1 ≥ c.

Case 1a. n + 1 < c. Then m and n + 1 are the q and r we need for k + 1.

Case 1b. n + 1 ≥ c. Since n < c, we have that n + 1 = c. So k + 1 = m · c + c = (m + 1) · c + 0. So, let q = m + 1, and let r = 0.

Case 2. c > k. Since c ≤ k + 1, c = k + 1. So k + 1 = c · 1 + 0. Let q = 1 and r = 0.

This completes the proof.
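The inductive proof mirrors a simple computation: repeatedly subtract y from x, counting the subtractions. A Python sketch of that procedure (illustrative, not from the text):

```python
def quotient_remainder(x, y):
    """Find q, r with q != 0, x = q * y + r, and r < y, for 0 < y <= x."""
    q, r = 0, x
    while r >= y:
        q, r = q + 1, r - y   # each subtraction adds 1 to the quotient
    return q, r

# exhaustively confirm the theorem's conclusion for small x, y
for x in range(1, 50):
    for y in range(1, x + 1):
        q, r = quotient_remainder(x, y)
        assert q != 0 and x == q * y + r and r < y
```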

Counting Principles

If a set is finite, then its members can be numbered. There will be a unique natural number which corresponds to the size of the set. Using the theory of functions developed in the previous chapter, we can give a precise definition to the notion of two sets being equinumerous.

Definition 10.9 Sets A and B are equinumerous, A ≈ B, iff there is a function f such that f: A → B is a bijection.


A natural number n corresponds to the size (or "cardinality") of a set A iff A and n are equinumerous. Let |A| represent the size of A.

Definition 10.10 If A is finite, then |A| = n iff n is a natural number and A ≈ n.

As you might expect, no two distinct natural numbers are equinumerous. First, we need to prove the following lemma:

Lemma 10.1 If m + 1 ≈ n + 1, then m ≈ n.

Proof. Suppose that m + 1 ≈ n + 1. Then there is a bijection f: m + 1 → n + 1. Since f is a bijection, and n ∈ n + 1, there is a k ∈ m + 1 such that f(k) = n. There are two cases to consider: k = m, and k ≠ m. If k = m, then f↾m is a bijection from m to n, and so m ≈ n. (f↾m is still 1-to-1, and it must be onto n, since f was onto n + 1, and the only value which f↾m is missing does not belong to n.) Consequently, we can suppose that k ≠ m. Consider the following function g: g = (f − {(k, n)}) ∪ {(k, f(m))}. The function g is the result of removing from f the ordered pair which assigns the value n and replacing it with a pair assigning to k the value which f assigns to m. The function g↾m is a bijection from m to n, as can easily be verified. Thus, m ≈ n.

Theorem 10.12 If m and n are natural numbers and m ≈ n, then m = n.

Proof. By induction on n.

(i) Base case. Assume m ≈ 0. Then m must be empty, and so m = 0.

(ii) Inductive case. Assume, for all m, that if m ≈ k, then m = k. Show, for all m, that if m ≈ k + 1, then m = k + 1.

Assume that m ≈ k + 1, for m an arbitrary number. k + 1 is not empty, and so m is not empty. m ≠ 0, so there is a number p such that m = p + 1. Hence, p + 1 ≈ k + 1. By the lemma, it follows that p ≈ k. By the inductive hypothesis, it follows that p = k. Hence, p + 1 = k + 1. Consequently, m = k + 1.

On the basis of Theorem 10.12, we can show that every finite set has a unique size. Consequently, our function |A| is well-defined.

Corollary 10.9 If m is a natural number, then |m| = m.

The following are some of the fundamental principles for computing the sizes of complex sets.

Theorem 10.13 (The Sum Rule) If A and B are disjoint, finite sets, then |A ∪ B| = |A| + |B|.

Corollary 10.10 If A and B are finite sets, then |A ∪ B| = |A| + |B| − |A ∩ B|.

Corollary 10.11 If A = {A₀, ..., Aₙ} is a finite family of finite sets, and A is pairwise disjoint, then |⋃A| = Σᵢ₌₀ⁿ |Aᵢ|.

Theorem 10.14 (The Product Rule) If A and B are finite sets, then |A × B| = |A| · |B|.

220

Theorem 1 0 . 1 5 (The Exponent Rule) If A and B are finite sets, and C = { f := f is a function

from A into B } , then I C I

=

I B II A I .

Corollary 10.12 ( The Powerset Rule) If A is a finite set, then I P(A) I = 2I A I

.

Proofs. Left as exercises. Problems

Prove each of the following theorems using mathematical induction. 1. The sum of the first n odd numbers is n2 . 2. The sum of the first n even numbers (i�cluding .0) is n2

-

2 , •. ·�,;w,r.-3/The sum of the first n even numbers (excluding 0) is n

n.

+ n.

4. The sum of the squares of the first n positive integers is n(n + 1) (2n + 1 ) /6. 6. The sum of the squares of the first n even numbers is 2n(n + 1) (2n + 1)/3. 7. The sum of the cubes of the first n positive i ntegers is (n 2 8. The sum of the cubes of the first n even numbers is 2(n2

2 9 . The product of the first n powers of 2 is 2 0, and n :::: 0, (a + l) n - 1 is divisible by a.

17.

I:� 1 i(i!) = ( n + 1)! - l .

18. Suppose that f : N v'x(f(x) = xf( l ) ) .

--->

N is such that \/x\/y(f(x + y)

1 9 . Let a > 1 and k be natural numbers. Define f : N J (k) + ak + 1 . Then \/x( f (x) = (a x + l - 1 ) / ( a - 1 ) ) .

=

--->

f(x) + f(y)). Then /(0) N : ( 1 ) /(0)

=

=

1 ; (2) f ( k

0 and

+

1)

=

20. Let V b e an initial segment o f the positive integers, from 1 to n. Then there is no strict partial ordering n on V such that R.nl and, for all i such that 1 S: i < n, R.i (i + 1 ) . 2 1 . S how: for all n E N, n + 3


A(0) & ∀x[∀y(y < x → A(y)) → A(x)] → ∀xA(x)

Proof. Assume A(0) & ∀x[∀y(y < x → A(y)) → A(x)]
Show ∀xA(x)
  Show A(b)
    Assume ¬A(b)
    Let c = {x ∈ N : ¬A(x)}
    b ∈ c (by Abstr.)
    c ≠ ∅, and c has a least element d: ∀x(x ∈ c → ¬x < d)
    ¬A(d) (by Abstr.)
    Show ∀y(y < d → A(y))
      Assume e < d
      Show A(e)
        Assume ¬A(e)
        e ∈ c
        ¬e < d, contradicting e < d
    ∀y(y < d → A(y)) → A(d), by assumption
    A(d)
    ¬A(d), by Repetition. QED.
Induction

223

Strong Induction on N Show 'v'xA Show A [O/x] Assume 'v'y(y Show A [njx]

< n --. A.[yjx] )

This rule can be generalized to be used in proofs where the base case is some number, like 1 or 2, which is greater than 0:

Strong Induction on N , for n

2: m

Show 'v'x(x 2: m --. A) Show A [m/x] Assume n > m&Vy(m $ y < n --. A[y/x]) Show A[njx]

Notice that the inductive hypothesis is (n > m & ∀y(m ≤ y < n → A[y/x])), rather than the ∀y(m ≤ y < n → A[y/x]) that the above discussion suggests. This is simply to avoid duplication. If we were forced to take account of the case where n = m in the inductive step, we would have to repeat the proof of the basis case. Again, m represents a minimal value; if we want to prove the theorem for all positive integers, we should start with 1. If we want to prove a theorem for all positive integers greater than k, we should start with k + 1. To take an example of a proof by strong induction, let's prove the theorem that for every x ≥ 2, there is a prime y and a number z > 0 such that x = y · z.

Base case: If n = 2: 2 = 2 · 1, 2 is a prime, and 1 > 0.

Inductive hypothesis: suppose for every x such that 2 ≤ x < n, there is a prime y and a number z > 0 such that x = y · z.

Show: there is a prime y and a number z such that n = y · z. Either n is prime or not.

Case 1. If n is prime, then n = n · 1, n is prime, and 1 > 0. Let n be y and 1 be z.

Case 2. If n is not prime, then for some a and b greater than 1, n = a · b, and a and b are less than n (show this by induction). By the inductive hypothesis, a = m · q, for some prime m and q > 0. By associativity of multiplication, n = m · (q · b). Since q > 0 and b > 1, (q · b) > 1. Let m be y and let (q · b) be z.
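The construction in Case 2 is effectively an algorithm: split off the smallest divisor greater than 1, which is always prime. A Python sketch of the single step the proof relies on (the helper name is illustrative):

```python
def prime_times_rest(n):
    """Return (y, z) with y prime, z > 0, and n = y * z, for n >= 2."""
    d = 2
    while n % d != 0:     # the least divisor > 1 of n is necessarily prime
        d += 1
    return d, n // d

for n in range(2, 200):
    y, z = prime_times_rest(n)
    assert n == y * z and z > 0
    assert all(y % k != 0 for k in range(2, y))   # y is prime
```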


Problems

Leonardo of Pisa (c. 1170-1250), an Italian mathematician who was also called Fibonacci, devised a sequence of numbers, (1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, ...), that have many interesting properties. Corresponding to this sequence is a function f: N → N, where f(0) is the first Fibonacci number, f(1) is the second, etc. This function can be defined recursively as follows: f(0) = 1; f(1) = 1; f(k + 2) = f(k + 1) + f(k). For problems 1 and 2, prove the corresponding theorems about the Fibonacci numbers.
1. Σᵢ₌₀ⁿ f(i) = f(n + 2) − 1.

2. Σᵢ₌₀ⁿ f(i)² = f(n) · f(n + 1).
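The recursive definition translates directly into Python, and both identities can be spot-checked for small n before being proved by induction:

```python
def f(n):
    """The Fibonacci function of the text: f(0) = 1, f(1) = 1."""
    if n < 2:
        return 1
    return f(n - 1) + f(n - 2)

for n in range(12):
    assert sum(f(i) for i in range(n + 1)) == f(n + 2) - 1          # problem 1
    assert sum(f(i) ** 2 for i in range(n + 1)) == f(n) * f(n + 1)  # problem 2
```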
3. Let the function b be defined recursively as follows: b(0) = 1; b(1) = 1; and b(k + 2) = 2 · b(k + 1) + b(k). Prove that for every n, b(n) is odd.

10.4 Induction on Sets other than the Natural Numbers
Induction relies essentially on an ordering of the objects in a particular domain. The natural numbers, of course, come with this ordering built in. But any set can serve as a ground for inductive proof if it's ordered in a similar way. The ordering must allow for an answer to the question, What is the next member of the set?, or, at least, the question, What are the next members of the set? This requires a well-ordering of the set or of some partition of it. The set, in other words, must be partially well-ordered. A strict linear ordering doesn't suffice. The reals, for instance, are strictly linearly ordered by <. But there is no answer to the question, What is the next real after 1?, if we interpret next in terms of the < ordering. Given any real greater than 1, we can find another between it and 1, because < is dense on the real line. The less-than relation, then, doesn't order the reals in the way that induction requires. It doesn't well-order the real line. Whether the reals can be well-ordered at all is somewhat controversial; the axioms of set theory presented in Chapter 7 don't decide the question. In any case, to perform an inductive proof, we need to work with at least a partial well-ordering. That a set can be partially well-ordered isn't enough; we need an actual partial well-ordering to do the proof.

Strong Induction on Sets other than N
  Let R be any partial well-ordering of set B (and R ⊆ B × B)
  Show ∀x(x ∈ B → A(x))
    Assume c ∈ B
    Show ∀y(Ryc → A(y)) → A(c)
      Assume ∀y(Ryc → A(y))
      Show A(c)

Suppose, for example, that we are dealing with formulas of sentential logic. We can order these formulas by associating, with each formula, a number, say, the number of connective occurrences


in the formula, or the number of binary connective occurrences, or the number of occurrences of sentence letters, etc. This divides the set of formulas into various groups. If we choose the function mapping each formula into the number of connective occurrences in it, the groups are: those with no connective occurrences; those with one connective occurrence; those with two connective occurrences; etc. Alternatively, suppose we are dealing with finite mathematical trees. We can associate, with each tree, a number specifying the number of nodes on the longest branch of the tree. This also orders the set of finite trees into classes: those whose longest branch has only one node; those whose longest branches have two nodes; etc. As these examples suggest, any function from a set into N partitions the set. It splits the class into the set of those things mapped into 0; those mapped into 1; those mapped into 2; etc. Where f is a function from a set A into N, the partition is the set of the inverse images of unit sets under f. The partition, that is, is the set of all nonempty f⁻¹[{n}] for n ∈ N. We can view these inverse images as ordered as follows: ∀x∀y(∃n∃m(n, m ∈ N & x = f⁻¹[{n}] & y = f⁻¹[{m}]) → (x < y ↔ n < m)). This is a well-ordering of the partition. The ordering on the set A itself is also easy to construct from the function: ∀x∀y(x, y ∈ A → (x < y ↔ f(x) < f(y))). This is not a well-ordering but a partial well-ordering. To meet the well-ordering requirement, then, we normally construct a function, which we'll call the index function, mapping each element of the set in question into the natural numbers. This function partitions the set and introduces on it the appropriate ordering.
Strong Induction on Sets (Using Index Function)
  Let f: B → N
  Show ∀x(x ∈ B → A(x))
    Assume c ∈ B
    Show f(c) = 0 → A(c)
      Assume f(c) = 0
      Show A(c)
    Assume f(c) = n, n > 0
    Show ∀y(f(y) < n → A(y)) → A(c)
      Assume ∀y(f(y) < n → A(y))
      Show A(c)

The second requirement pertains to the structure of the set forming the ground of the inductive proof. If this set lacks the right sort of structure, then even a function such as the above will do little good. Suppose, for example, that we want to prove something about all U.S. citizens. We construct a function mapping citizens into numbers by associating, let's say, each citizen with the number of characters in his or her full name. This function maps James Earl Carter into 15, Sandra Day O'Connor into 17, and Willard van Orman Quine, John Cougar Mellenkamp and Nicholas Michael Asher all into 20. Now, perhaps we want to prove that no citizen can eat just one potato chip. The basis case is easy: we need to show that no citizens with names of 0 characters can eat just one chip. Assuming that every citizen has a name with at least one character, there are no such citizens, so the basis case is trivial. The inductive step, however, is impossible, and not because of the oddity of what we want to prove. Let's say we try strong induction. Then we assume that all


citizens with names of less than n characters are unable to eat just one potato chip. We must show that those with names of n characters can't eat just one. To accomplish this, however, we must be able to establish some relationship between the potato-chip-eating abilities of those with names of n characters and the same abilities of those with shorter names. And, in all probability, no such relationship exists. Bluntly, the length of a person's name has nothing to do with how many potato chips that person can eat at one sitting. The ordering we introduce by mapping the members of the set into N, therefore, must correspond in some way to structure in that set relevant to what we want to prove. This is an extremely vague requirement. But we can make it somewhat more precise by considering our examples of inductive proofs on N. To prove our theorem about triangular numbers, we needed to be able to relate the nth triangular number to the (n − 1)st such number. In other words, we needed to be able to express a relationship between each triangular number and the one preceding it. We relied on the fact that the (n + 1)st triangular number is just the nth triangular number plus n + 1:

  Σᵢ₌₀ⁿ⁺¹ i = Σᵢ₌₀ⁿ i + (n + 1)

To prove our theorem about odd numbers, we had to do the same: we needed to express the relationship between each odd number and the odd number preceding it (Oₙ₊₁ = Oₙ + 2). In general, in order to use weak induction, we must establish a relationship between the value of some function for n and that function's value for n + 1. If we are dealing with a set other than N, we need to establish a link between objects the index function maps into n and those it maps into n + 1. The success of weak induction depends on a relationship between the objects in any equivalence class and those in the next equivalence class. The success of strong induction requires a relation between the members of any equivalence class and those in preceding equivalence classes. Consider, for instance, our proof that every positive integer is a product of primes. We relied on the fact that any composite number n can be factored into two smaller numbers m and k such that km = n. We can view positive integers not only as successors of other integers but as sums or products of other, smaller integers. The sum and product operations need not carry us from n to n + 1; typically, they carry us from k and m to an n larger than both k and m. When these operations are relevant to the proof, therefore, there is a dependence of facts about any number on facts about smaller numbers. A similar point holds for sets other than N. We can use strong induction to prove facts about members of A with an index function f just in case the relevant properties of A's members in a certain equivalence class under f depend on properties of members of preceding equivalence classes. Normally, as we have already said, inductive proof methods apply to problems involving recursively defined functions. The method of recursive definition, however, also applies to sets. We have already encountered an example of a recursive definition of a set, the recursive definition of the set of formulas.
Such definitions specify what a term means by specifying how to construct instances to which it applies. Thus, we've explained formula by saying that all sentence letters are formulas, all negations of formulas are formulas, etc. Any recursive definition of a set F, like the set of formulas, consists of three parts: basic clauses, which postulate that entities of certain kinds fall under F; inductive clauses, which postulate that if entities of certain kinds fall under F, then so do entities of other kinds; and an extremal clause, postulating that those are the only objects falling under F. Recall that our definition of formula, in sentential logic, had the form:

(1) Any sentence letter is a formula.

(2) If A is a formula, then ¬A is a formula.

227

Induction (3) If A and B are formulas, then (A&B), (A V B), (A --+ B), and (A

+-+

B) are formulas.

(4) Every formula can be constructed from a finite number of applications of (1)-(3).

Here, (1) is the basic clause, (2) and (3) are recursive clauses, and (4) is the extremal clause. A recursive definition without any recursive clauses is simply an explicit definition, which could be paraphrased into a universalized biconditional. The general framework for a recursive definition of a set can be stated thus:

Recursive Definitions of Sets: To define A, a subset of a given set U:

(1) Basic clause: specify minimal elements. ∀x ∈ U(φ(x) → x ∈ A)

(2) Inductive clauses: ∀x∀y ∈ A f(x, y) ∈ A & ∀x ∈ A g(x) ∈ A

(3) Extremal clause: A is the intersection of all the subsets of U satisfying conditions (1) and (2).

A very important special case of recursive definition occurs whenever A is freely generated from the set {x : φ(x)} by functions f and g. Set A is freely generated by this recursive definition just in case the set of basic elements is disjoint from the range of each of the functions, and each of these ranges is disjoint from all the others.
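A recursively defined set can be modeled by generating it in stages. The Python sketch below builds a toy fragment of the formula definition, using only sentence letters, negation, and conjunction (the stage function and the two-letter vocabulary are assumptions for illustration):

```python
letters = {"p", "q"}          # basic clause: sentence letters are formulas

def stage(k):
    """Formulas constructible in at most k applications of the clauses."""
    if k == 0:
        return set(letters)
    prev = stage(k - 1)
    negs = {"~" + a for a in prev}                              # clause (2)
    conjs = {"(" + a + "&" + b + ")" for a in prev for b in prev}  # clause (3)
    return prev | negs | conjs

assert "p" in stage(0)
assert "~p" in stage(1)
assert "(~p&q)" in stage(2)
assert stage(1) <= stage(2)   # stages are cumulative
```

The extremal clause corresponds to taking nothing but what some stage produces.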
Definition. A set A is freely generated from set {x : φ(x)} by functions f and g iff

(1) A is defined recursively in terms of φ, f and g, as above,

(2) the sets {x : φ(x)}, R(f), and R(g) are pairwise disjoint, and

(3) f and g are injective.
Let's put our definition of SL into this form. Let U be the set of n-ary sequences of symbols from the vocabulary V, i.e., U = ⋃{Vⁿ : n ∈ N}, where Vⁿ is the set of functions from n into V. We define L as a subset of U by the clauses:

(1) ∀x ∈ U(x: 1 → V & x(0) is a sentence letter → x ∈ L)

(2) ∀x₁ ∈ L Neg(x₁) ∈ L
    ∀x₁∀x₂ ∈ L Conj(x₁, x₂) ∈ L
    ∀x₁∀x₂ ∈ L Disj(x₁, x₂) ∈ L
    ∀x₁∀x₂ ∈ L Cond(x₁, x₂) ∈ L
    ∀x₁∀x₂ ∈ L Bicond(x₁, x₂) ∈ L

(3) L = ⋂{A : A ⊆ U and A satisfies clauses (1) and (2)}

… We can define a function approx: N → P(U) by recursion:

approx(0) = {x ∈ U : φ(x)}
approx(k + 1) = approx(k) ∪ {z ∈ U : ∃x∃y(x, y ∈ approx(k) & (z = f(x, y) ∨ z = g(x)))}

approx(0) consists of the set of minimal elements of A, i.e., elements of U which got into A by one of the basic clauses of the recursive definition. In the case of the language L of SL, approx(0) is the set of all 1-ary sequences [φ], where φ is a sentence letter (i.e., φ is p, q, r, ...). approx(k + 1) consists of the union of approx(k) with the set of elements which got into A by applying one of the inductive clauses of the definition to elements of approx(k). Thus, approx(k + 1) consists of all the elements of A which have been added to A by the time we've reached the (k + 1)st stage of the recursive process. In the case of SL, approx(k + 1) will include all sentences whose grammatical tree contains no branch longer than k + 1. Now, define the index function rnk: ⋃R(approx) → N as:

  rnk(x) = the least n ∈ N such that x ∈ approx(n)
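The index function rnk can be computed directly from the stages. A self-contained Python sketch with a toy basic clause ({0}) and a single inductive clause (g(x) = x + 2); the names approx and rnk are from the text, but the clauses themselves are assumptions for illustration:

```python
def approx(k):
    if k == 0:
        return {0}                        # minimal elements
    prev = approx(k - 1)
    return prev | {x + 2 for x in prev}   # apply the inductive clause

def rnk(x, limit=50):
    """The least n such that x is in approx(n)."""
    return min(n for n in range(limit) if x in approx(n))

assert rnk(0) == 0
assert rnk(4) == 2
assert rnk(10) == 5
```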

If x ∈ ⋃R(approx), then x ∈ approx(n), for some n. By the fact that …

… Inductive hypothesis: Every finite connected graph with fewer than n edges in which every vertex has an even degree has an Euler circuit.

By Lemma 10.3, we know that G has at least one circuit C. Let G\C be the subgraph which is obtained by removing from G all the edges of C and all the vertices on C of degree 2. Removing the edges of C removes an even number from the degree of each vertex of G. Consequently, all the vertices in G\C have even degree in G\C. G\C consists of a finite number of connected subgraphs, H₁, ..., Hₖ. Each of these connected subgraphs Hᵢ has fewer than n edges, so, by the inductive hypothesis, each of them has an Euler circuit, pᵢ. Since G was connected, each of these subgraphs has at least one vertex on the cycle C. For each Hᵢ, choose such a vertex hᵢ on C.

Therefore, G has an Euler circuit, which can be constructed in the following manner. Start with the initial vertex b₁ on C. Proceed along C until you come to the first vertex chosen as one of the hᵢ's. At this point, leave the cycle C and follow instead the Euler circuit pᵢ. Upon returning to hᵢ, proceed again along C until the next chosen vertex is reached. Eventually, all of the edges will be traversed exactly once, and one will return again to b₁.

Another important type of graph is a directed graph or digraph. If the edges on an ordinary graph can be thought of as two-way streets, all the edges on a digraph must be thought of as one-way streets. A digraph is a triple (A, B, f), where A is a set of edges, B is a set of vertices, and f is a


function which assigns an ordered pair (not a pair set) of vertices to each edge. A digraph is usually diagrammed by representing edges as one-way arrows. There are two kinds of paths through digraphs. An undirected path through a digraph consists of a sequence (a₁, ..., aₙ) of edges and a sequence (b₁, ..., bₙ₊₁) of vertices such that for every i ≤ n, either f(aᵢ) = (bᵢ, bᵢ₊₁) or f(aᵢ) = (bᵢ₊₁, bᵢ). A directed path through a digraph consists of a sequence (a₁, ..., aₙ) of edges and a sequence (b₁, ..., bₙ₊₁) of vertices such that for every i ≤ n, f(aᵢ) = (bᵢ, bᵢ₊₁). This means that the direction of a directed path must respect the direction of its edges. Let's say that a proper digraph is one with no loops and no parallel edges. There is a very simple way of defining proper digraphs in set theory. Each proper digraph whose vertices belong to set A can be thought of as simply a binary relation on A. Conversely, every binary relation on a set A corresponds to a proper digraph whose vertices belong to A. When there is a directed edge from vertex b to vertex b′ in digraph G, this corresponds to the presence of the ordered pair (b, b′) in the binary relation R(G) which corresponds to G. We can in fact define a proper digraph as an ordered pair (A, R), where A is a set of vertices and R is a binary relation on A, i.e., R ⊆ A × A. The next type of structure we will consider is that of a tree. A tree is simply a connected, acyclic graph. Since trees are acyclic, they have no loops or parallel edges.

Definition 10.18 A graph is a tree iff it is connected and acyclic.

Definition 10.19 x is a leaf of tree T iff x is a vertex of degree one in T.
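The identification of proper digraphs with binary relations makes path-checking a one-liner in Python (an illustrative sketch; the example digraph is an assumption):

```python
A = {1, 2, 3, 4}
R = {(1, 2), (2, 3), (3, 4)}    # the proper digraph 1 -> 2 -> 3 -> 4 on A

def is_directed_path(vertices, R):
    """A directed path must respect the direction of every edge."""
    return all((vertices[i], vertices[i + 1]) in R
               for i in range(len(vertices) - 1))

assert is_directed_path([1, 2, 3, 4], R)
assert not is_directed_path([4, 3, 2, 1], R)   # wrong way down one-way streets
```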

Theorem 10.19 A finite tree with at least one edge has at least two leaves.

Proof. Consider a longest simple path in the tree, with vertex sequence (b1, ..., bn). Since the path is acyclic, b1 ≠ bn, and since the path is a longest one, both b1 and bn must be leaves.

Theorem 10.20 For every n ≥ 2, a tree T with n vertices has exactly n - 1 edges.

Proof. By weak induction. (i) Base case. T has 2 vertices. If T contained more than one edge, it would have to contain either a loop or a pair of parallel edges, which is impossible since it is a tree. Since T is connected, its two vertices are joined by an edge, so T has exactly one edge. (ii) Inductive case. Assume that any tree with k vertices has k - 1 edges, and suppose that T has k + 1 vertices. By Theorem 10.19, T has a leaf, b0. Let T0 be the tree which results from removing b0 and the single edge connected to b0. T0 is a tree with k vertices. So, by the inductive hypothesis, T0 has k - 1 edges. Therefore, T has k edges.

Corresponding to directed graphs, there are also structures called directed trees. A directed tree is simply a connected digraph with no undirected paths which are cycles. If all of the arrows point away from a single vertex, the tree is called a rooted tree, and that vertex is called its root. The class of rooted trees can be defined recursively.
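Theorem 10.20 can be checked on small examples. The sketch below uses our own helper names, and union-find for cycle detection is our choice, not a method from the book; edges of an undirected graph are given as two-element frozensets.

```python
def is_connected(vertices, edges):
    """Breadth-first reachability from an arbitrary start vertex."""
    if not vertices:
        return True
    seen, frontier = set(), [next(iter(vertices))]
    while frontier:
        v = frontier.pop()
        if v in seen:
            continue
        seen.add(v)
        frontier.extend(w for e in edges if v in e for w in e if w != v)
    return seen == set(vertices)

def is_acyclic(vertices, edges):
    """Union-find: an edge joining two already-connected vertices closes a cycle."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v
    for e in edges:
        u, v = tuple(e)
        ru, rv = find(u), find(v)
        if ru == rv:
            return False
        parent[ru] = rv
    return True

def is_tree(vertices, edges):
    return is_connected(vertices, edges) and is_acyclic(vertices, edges)

V = {1, 2, 3, 4}
E = [frozenset({1, 2}), frozenset({1, 3}), frozenset({3, 4})]
print(is_tree(V, E), len(E) == len(V) - 1)   # True True
```

With 4 vertices the tree has exactly 3 edges, as the theorem predicts.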

Induction


Recursive Definition of Rooted Trees

Let A be a set of possible vertices.

(i) Basic clause. If b ∈ A, then ({b}, ∅) is a rooted tree, and b is its root. [This is a trivial tree, consisting of a single vertex and no edges.]

(ii) Recursive clause. If (A1, R1), ..., (Ak, Rk) are rooted trees with roots r1, ..., rk, the family {Ai : 1 ≤ i ≤ k} is pairwise disjoint, r ∈ A, and r is not an element of ∪{Ai : 1 ≤ i ≤ k}, then T is a rooted tree and r is its root, where the set of vertices of T is {r} ∪ ∪{Ai : 1 ≤ i ≤ k}, and the set of directed edges of T is ({r} × {r1, ..., rk}) ∪ ∪{Ri : 1 ≤ i ≤ k}.

A single vertex is a trivial rooted tree. If one starts with a finite collection of rooted trees, none of which have any vertices in common, and one adds a new vertex which is connected by outwardly directed edges to the roots of those trees, the resulting digraph will also be a rooted tree. If vertex b is connected by outwardly directed edges to vertices b1, ..., bn in tree T, then b is called the parent of b1, ..., bn, and b1, ..., bn are called the children of b.

The depth of a rooted tree can also be defined recursively:

Recursive Definition of the Depth of Rooted Trees

Let T = (A, R) be a rooted tree, with root r.

(i) Base clause. If A is a unit set (i.e., T is a trivial, one-vertex tree), then the depth of T is zero.

(ii) Inductive clause. Suppose T is not a trivial tree. Then r has children in T, say r1, ..., rn. Each of these ri's is the root of a subgraph Ti, which is itself a rooted tree. Thus, each of the Ti's has a depth: d1, ..., dn. The depth of T is 1 + Max{d1, ..., dn}.

A binary tree is a rooted tree all of whose parent vertices have at most two children. A regular binary tree is a binary tree all of whose parent vertices have exactly two children. The level number of a vertex in a rooted tree is the length of the unique directed path from the root to that vertex. (The level number of the root itself is 0.) A full binary tree is a regular binary tree all of whose leaves have the same level number. The depth of a rooted tree (defined recursively above) is also the maximum level number of any vertex in the tree.

Definition 10.20 (A, R, S) is an ordered rooted tree iff (A, R) is a rooted tree and S is a binary

relation on A which linearly orders the children of every vertex in (A, R).
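The recursive construction of rooted trees and the recursive definition of depth can be transcribed almost literally under the (A, R) representation. A sketch; all function names are ours:

```python
def trivial(b):
    """Basic clause: ({b}, empty relation) is a rooted tree with root b."""
    return ({b}, set())

def combine(r, subtrees):
    """Recursive clause: attach a new root r above the roots of disjoint
    subtrees; subtrees is a list of ((A_i, R_i), r_i) pairs."""
    A, R = {r}, set()
    for (Ai, Ri), ri in subtrees:
        A |= Ai
        R |= Ri
        R.add((r, ri))        # a directed edge from the new root to each old root
    return (A, R)

def depth(tree, root):
    """0 for a trivial tree, else 1 + the maximum depth of the root's subtrees."""
    A, R = tree
    children = [b for (a, b) in R if a == root]
    if not children:          # unit set of vertices below: trivial case
        return 0
    return 1 + max(depth(tree, c) for c in children)

t1 = trivial('a')
t2 = combine('b', [(t1, 'a')])                        # depth 1
t3 = combine('c', [(t2, 'b'), (trivial('d'), 'd')])   # depth 2
print(depth(t3, 'c'))                                 # 2
```

Because edges always point away from the root, passing the whole edge set to the recursive `depth` calls is safe: only the subtree below each child is ever reached.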

The final type of structure which we will introduce in this section is that of a list. A list is a kind of finite sequence. We have already defined n-ary sequences in Chapter 8. The n-ary sequence


(a1, ..., an) is defined in terms of nested ordered pairs: ((...((a1, a2), a3), ..., an-1), an). For many purposes, it is convenient to define n-ary sequences in a different way. Given the notion of a natural number, it is useful to think of an n-ary sequence or list as a function from n into some set. Suppose f is a function from some natural number n into A, i.e., f : n → A. Remember that n is defined as the set {0, 1, 2, ..., n - 1}. This set is f's domain, and it has n members. The length of a list f is always D(f). The first member of the list which f represents is f(0), the second member is f(1), and so on, until we reach the nth member, which is f(n - 1). In general, we can define the n-ary list [a0, a1, ..., an-1] as that function f such that f(0) = a0, f(1) = a1, ..., f(n - 1) = an-1. From this definition, we can derive the following rules:

List Def    If f : n → A, then f = [f(0), f(1), ..., f(n - 1)], where for each i ∈ N with i < n, f(i) is the (i+1)st member of the list.

It is also useful to have names for certain collections of these lists, analogous to the n-ary Cartesian products of Chapter 8.

Definitions.

A^n = {x : x : n → A}
A^<n = {x : ∃y(y ∈ N & y < n & x : y → A)}
A^<N = {x : ∃y(y ∈ N & x : y → A)}

A^n is the collection of all n-ary lists of elements of A. A^<n is the collection of all lists of elements of A which are of length less than n. A^<N is the collection of all finite lists of A, of any length. A^<N = ∪{A^n : n ∈ N}. The empty set is a list, sometimes called NIL.
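The list-as-function idea can be sketched concretely, using a Python dict for the function; the helper names below are ours:

```python
import itertools

def make_list(*elements):
    """[a0, ..., an-1] as the function f with domain {0, ..., n-1}, f(i) = ai."""
    return {i: a for i, a in enumerate(elements)}

f = make_list('x', 'y', 'z')
print(len(f))                 # 3: the length of a list is the size of its domain
print(f[0], f[2])             # x z: the first member f(0) and last member f(n-1)

NIL = make_list()             # the empty set is a list, called NIL
print(NIL == {})              # True

# A^<3: all lists of elements of A = {0, 1} with length less than 3
A = [0, 1]
A_lt3 = [make_list(*t)
         for n in range(3)
         for t in itertools.product(A, repeat=n)]
print(len(A_lt3))             # 7 = 1 + 2 + 4
```

Counting 1 list of length 0, 2 of length 1, and 4 of length 2 matches the definition of A^<n as a union over shorter lengths.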

Now, we must define certain basic operations which can be performed on lists. First, we must define the function which concatenates two lists of symbols into a new, longer list. If f is a list, let f(+n) be the result of shifting the values of f upwards n places, i.e., f(+n)(n) = f(0), f(+n)(n+1) = f(1), etc. In general,

∀x ∈ D(f) (f(+n)(n + x) = f(x))

Next, we need to define operations that add an element onto the end of a list, and that turn a single element into a list. The first operation shall be called 'Cons', and the second 'List'.


Definition.

∀x∀y(Cons(x, y) = x ∪ {(D(x), y)})
∀x(List(x) = Cons(NIL, x))
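Under the list-as-function representation (dicts with domain {0, ..., length-1}), the shift operation, Cons, and List come out as follows; this is a sketch, and the function names are ours:

```python
def shift(f, n):
    """f(+n): shift the values of f upwards n places."""
    return {n + x: f[x] for x in f}

def cons(x, y):
    """Cons(x, y) = x ∪ {(D(x), y)}: append y at the end of list x."""
    g = dict(x)
    g[len(x)] = y             # D(x), the domain of list x, is its length
    return g

def list1(x):
    """List(x) = Cons(NIL, x): the one-element list [x]."""
    return cons({}, x)

f = {0: 'a', 1: 'b'}
print(shift(f, 2))            # {2: 'a', 3: 'b'}
print(cons(f, 'c'))           # {0: 'a', 1: 'b', 2: 'c'}
print(list1('a'))             # {0: 'a'}
```

Concatenation of lists f and g is then f ∪ g(+n), where n is the length of f, which is exactly what the shift operation was introduced for.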

Using the function Cons, it is possible to define the set A^<N, the set of finite lists of elements of A, recursively. In fact, A^<N can be freely generated from the set {NIL} and A by the function Cons, which is injective.

Recursive Definition of A^<N:

(i) Basic clause. NIL ∈ A^<N.

[...] By Lemma 10.1, [...] D and E have unique phrase structure trees; thus C has the unique tree

C

I\

D

E

Case 5: C has a biconditional as main connective. Then, for some formulas D and E, C = (D ↔ E). Again D and E have unique phrase structure trees by the inductive hypothesis, and, hence, C has the unique tree

C

I\

D

E


In every case, then, C has a unique phrase structure tree. This completes the inductive step, and the entire proof.

In order to put the recursive definition of QL from Chapter 4 into the form required by this chapter, we need first to define the substitution function. We want the substitution function Sub to be a function from V^<N × V × V, that is, the set of ordered triples whose first constituent is a finite string of symbols and whose second and third constituents are symbols, into V^<N. Sub(A, v, c) shall be A[v/c], that is, the result of replacing every occurrence of 'c' in A with an occurrence of 'v'. Since Sub is a 3-ary function, it will be a subset of a 4-ary relation. We can define Sub recursively:

For all y, z ∈ V, Sub(NIL, z, y) = NIL.

For all x ∈ V^<N and all y, z ∈ V, Sub(Cons(x, y), z, y) = Cons(Sub(x, z, y), z).

For all x ∈ V^<N and all y, z, w ∈ V, y ≠ w → Sub(Cons(x, y), z, w) = Cons(Sub(x, z, w), y).
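The three recursive clauses for Sub can be transcribed almost directly. A sketch under the list-as-dict representation used above; the helper names are ours:

```python
NIL = {}

def cons(x, y):
    """Append symbol y at the end of list x."""
    g = dict(x)
    g[len(x)] = y
    return g

def sub(A, z, y):
    """Sub(A, z, y): replace every occurrence of symbol y in list A by z."""
    if A == NIL:
        return NIL                          # Sub(NIL, z, y) = NIL
    last = A[len(A) - 1]
    rest = {i: A[i] for i in range(len(A) - 1)}
    if last == y:
        # Sub(Cons(x, y), z, y) = Cons(Sub(x, z, y), z)
        return cons(sub(rest, z, y), z)
    # last symbol distinct from y: leave it unchanged
    return cons(sub(rest, z, y), last)

A = cons(cons(cons(NIL, 'P'), 'c'), 'c')    # the list [P, c, c]
print(sub(A, 'x', 'c'))                     # {0: 'P', 1: 'x', 2: 'x'}
```

The recursion peels off the last symbol of the list, exactly mirroring the Cons-based clauses in the definition.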

We can then verify the following facts:

∀y, z ∈ V, Sub(List(y), z, y) = List(z).

∀y, z, w ∈ V, y ≠ z → Sub(List(y), w, z) = List(y).

∀x, y ∈ V^<N ∀z, w ∈ V, Sub(x ⌢ y, z, w) = Sub(x, z, w) ⌢ Sub(y, z, w).

Now we can define QL as follows:

Base clauses:

1. If x is a sentence letter, then List(x) ∈ QL.

2. If x is an n-ary predicate, and y is an n-ary list of constants, then List(x) ⌢ y ∈ QL.

Inductive clauses:

3. If x ∈ QL, then Neg(x) ∈ QL.

4. If x, y ∈ QL, then Conj(x, y), Disj(x, y), Cond(x, y), Bicond(x, y) ∈ QL.

5. If x ∈ QL, y is a constant, z is a variable, x(i) = y for some i, and x(i) ≠ z for every i ∈ D(x), then List('∀') ⌢ List(z) ⌢ Sub(x, z, y) ∈ QL.

Extremal clause:

6. QL = ∩{x ⊆ V^<N : (1)-(5) [x/QL]}

Problems

Working within the confines of sentential logic, prove the following theorems. Let lf(A) be the number of left parenthesis occurrences in the formula A; rt(A) the number of right parenthesis occurrences in A; p(A) the number of parenthesis occurrences in A; b(A) the number of binary connective occurrences in A; and s(A) the number of sentence letter occurrences in A.


1. For any formula A, lf(A) = rt(A).

2. For any formula A, p(A) = 2b(A).

3. For any formula A, s(A) = b(A) + 1.

4. For any formula A, s(A) = 1 + rt(A).

5. For any formula A, p(A) = 2s(A) - 2.

6. For any formula A, any interpretation of A (i.e., assignment of truth values to the sentence letters in A) assigns at least one truth value to A.

7. For any formula A, any interpretation of A (i.e., assignment of truth values to the sentence letters in A) assigns at most one truth value to A.
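Before attempting the inductions, Problems 1-5 can be checked empirically on a sample formula. A rough sketch: the counting is by string inspection, with 'v' and '~' as ASCII stand-ins for disjunction and negation, and the letter and connective counts tallied by hand for this one formula:

```python
def lf(A):
    """Number of left parenthesis occurrences in A."""
    return A.count('(')

def rt(A):
    """Number of right parenthesis occurrences in A."""
    return A.count(')')

def p(A):
    """Total number of parenthesis occurrences in A."""
    return lf(A) + rt(A)

A = '((P&Q)v(~P&R))'   # sentence letters: P, Q, P, R; binary connectives: &, v, &
s, b = 4, 3

print(lf(A) == rt(A))      # Problem 1: True
print(p(A) == 2 * b)       # Problem 2: True
print(s == b + 1)          # Problem 3: True
print(s == 1 + rt(A))      # Problem 4: True
print(p(A) == 2 * s - 2)   # Problem 5: True
```

One worked example is of course no proof, but it is a useful sanity check that the five identities are mutually consistent.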

Say that two pairs of parentheses separate each other iff they occur in the order (k (m )k )m, and that a pairing of n left parentheses with n right parentheses is proper iff a left parenthesis is always paired with a right parenthesis to the right of it, and no two pairs separate each other. Prove these theorems (from Kleene 1974).

8. If 2n parentheses are properly paired, and one pair is removed, the remaining 2n - 2 parentheses are properly paired.

9. A proper pairing of 2n parentheses, where n > 0, contains an innermost pair, i.e., a pair which includes no other parentheses between them.

10. A set of 2n parentheses admits at most one proper pairing.

11. If a set A of 2n parentheses and a consecutive subset B ⊆ A of 2m of them both admit proper pairings, then the proper pairing of B forms a part of the proper pairing of A. (That is, each parenthesis of the pairing has the same mate in both subsets.)

Consider the language consisting of the two vocabulary items 'a' and 'b'. The term string is defined inductively: (1) ['a'] is a string. (2) If A is a string, then ['('] ⌢ A ⌢ ['b'] ⌢ [')'] is a string. (3) There are no other strings. Prove these theorems about an arbitrary string S of this language.

12. S contains exactly one a.

13. The number of parentheses in S is twice the number of b's in S.

14. In S, all left parentheses are to the left of a, and all right parentheses are to the right of a.

Consider the language consisting of vocabulary items '0' and '1', defining strings inductively as follows. (1) ['0'] is a string. (2) ['1'] is a string. (3) If A is a string, then ['0'] ⌢ A and A ⌢ ['1'] are strings. (4) There are no other strings. Prove these theorems about this language.

15. In any string T, all 0s appear to the left of all 1s.

16. Some strings are syntactically ambiguous.

17. If a string T contains both 0s and 1s, it is syntactically ambiguous.

18. If a string contains only 0s or only 1s, it has a unique phrase structure tree.
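Problems 8-11 above can be explored computationally. The standard stack algorithm (our own sketch, not the book's method) decides whether a string of parentheses admits a proper pairing and, reflecting the uniqueness claim of Problem 10, deterministically produces that pairing:

```python
def proper_pairing(s):
    """Return the list of (left, right) index pairs, or None if no proper
    pairing exists. Each ')' is matched with the most recent unmatched '('."""
    stack, pairs = [], []
    for i, c in enumerate(s):
        if c == '(':
            stack.append(i)
        elif c == ')':
            if not stack:
                return None          # a ')' with no '(' to its left
            pairs.append((stack.pop(), i))
        else:
            return None              # only parentheses are allowed
    return pairs if not stack else None

print(proper_pairing('(()())'))      # [(1, 2), (3, 4), (0, 5)]
print(proper_pairing('())('))        # None
```

Note that the first pair emitted, (1, 2), is an innermost pair in the sense of Problem 9: the algorithm always closes innermost pairs first.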


Consider the "droid-and-doctor" language consisting of vocabulary items 'droid' and 'doctor'. It defines strings inductively as follows: (1) ['doctor'] is a string. (2) ['droid'] is a string. (3) If A is a string, then ['droid'] ⌢ A ⌢ ['droid'] is a string. (4) There are no other strings. Prove these theorems about this language.

19. Any string contains at most one doctor.

20. A string contains at least one doctor if it contains an even number of droids.

21. A string contains no doctors if it contains an odd number of droids.

22. Every string has a unique phrase structure tree.

Consider the "burger-and-brew" language consisting of vocabulary items 'burger' and 'brew'. It defines strings inductively as follows: (1) ['burger'] is a string. (2) ['brew'] is a string. (3) If A is a string, then A ⌢ ['brew'] is a string. (4) If A is a string, then ['burger'] ⌢ A ⌢ ['burger'] is a string. (5) There are no other strings. Prove these theorems about this language.

23. Any string ending in a burger also begins with a burger.

24. A string contains at least one brew if it contains an even number of burgers.

25. Every string has a unique phrase structure tree.

Consider the "Star-Wars" language consisting of vocabulary items 'fighter' and 'star'. It defines strings inductively as follows: (1) ['fighter'] is a string. (2) ['star'] is a string. (3) If A is a string, then ['star'] ⌢ A is a string. (4) If A is a string, then ['fighter'] ⌢ A ⌢ ['star'] is a string. (5) There are no other strings. Prove these theorems about this language.

26. In any string, the number of fighters is less than or equal to the number of stars plus one.

27. Every string has a unique phrase structure tree.

Consider the "bat-and-ball" language consisting of vocabulary items 'bat' and 'ball'. It defines strings inductively as follows: (1) ['bat'] is a string. (2) ['ball'] is a string. (3) If A is a string, then A ⌢ ['ball'] ⌢ ['bat'] is a string. (4) If A is a string, then A ⌢ ['bat'] is a string. (5) There are no other strings. Prove these theorems about this language.

28. No string but ['ball'] contains no bats.


29. Every string but ['ball'] ends with a bat.

30. In any string, the number of bats is greater than or equal to the number of balls minus one.

31. Every string has a unique phrase structure tree.

Consider the "cat-and-mouse" language consisting of vocabulary items 'cat', 'mouse' and 'cheese'. It defines strings inductively as follows: (1) ['cheese'] is a string. (2) If A is a string, ['mouse'] ⌢ A is a string. (3) If A is a string, then ['cat'] ⌢ A ⌢ ['cheese'] is a string. (4) There are no other strings. Prove these theorems about this language.

32. Any string contains at least one cheese.

33. In any string, the number of cheeses is greater than the number of cats.

34. Every string has a unique phrase structure tree.

Let QL be the language of quantification theory without identity. Prove the following theorems about an arbitrary formula A of QL.

35. Every quantifier in A binds some variable in A.

36. If A is a formula, then A[c/d] is a formula.

37. If ∀xA is a formula, then A[c/x] is a formula.

38. No quantifiers on the same variable overlap in scope.

39. Give an example of wffs α and β in the language of SL, and lists of symbols δ and γ, such that Conj(α, β) = Conj(γ, δ) but α ≠ γ.

40. A formula of SL is in conjunctive normal form iff it is of the form (C1 & C2 & ... & Cn), where each Ci is of the form (L1 ∨ L2 ∨ ... ∨ Lk), for some k, and each Lj is either a sentence letter or the negation of a sentence letter (these are called 'literals'). Prove by strong induction that every formula of SL is logically equivalent to some formula in conjunctive normal form.

** 41. A formula φ is in Horn-clause form iff (a) φ is in conjunctive normal form, and (b) each conjunct Ci in φ contains at most one positive (unnegated) literal. Construct an efficient algorithm for deciding whether φ is satisfiable.
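Problem 41 has a well-known answer: unit propagation decides Horn-clause satisfiability in polynomial time. The sketch below is one such algorithm, not necessarily the one the authors intend; our representation gives each clause as a pair of sets, the positive literals and the negated letters:

```python
def horn_sat(clauses):
    """Decide satisfiability of a set of Horn clauses.
    Each clause is (pos, neg): pos holds at most one positive letter,
    neg holds the letters occurring negated.
    Returns a satisfying set of true letters, or None if unsatisfiable."""
    true_letters = set()
    changed = True
    while changed:
        changed = False
        for pos, neg in clauses:
            # Clause is still unsatisfied if all its negated letters are true
            # and its positive literal (if any) is not yet true.
            if neg <= true_letters and not (pos & true_letters):
                if not pos:
                    return None          # every literal false: contradiction
                true_letters |= pos      # force the lone positive letter true
                changed = True
    return true_letters                  # all remaining letters default to false

# P, (~P v Q), (~Q v ~R): satisfiable with P, Q true and R false
print(horn_sat([({'P'}, set()), ({'Q'}, {'P'}), (set(), {'Q', 'R'})]))  # {'P', 'Q'}

# P, ~P: unsatisfiable
print(horn_sat([({'P'}, set()), (set(), {'P'})]))                       # None
```

The key property making this work is that Horn clauses are closed under the "make only forced letters true" strategy: the set of letters forced true is the unique minimal model if one exists.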

Appendix A

Plato's Users' Guide

A.1 Introduction

Plato is a software application for both Windows and Macintosh computers, developed at the University of Texas at Austin. It is intended to assist students who are learning formal logic. It can be used as a resource for any introductory course in formal logic that uses a similar approach.

Plato is a flexible, extensible tool that enables the user to construct proofs in the formal languages of the Sentential Calculus and the Predicate Calculus. It can also be used to construct some formal mathematical proofs, including portions of set theory. Plato is not limited to a fixed stock of examples, and it can construct proofs of great complexity. It is so rapid and accurate that it can assist even advanced students in the construction of complex proofs, yet is so easy to use that it is accessible to the beginner.

Plato helps the user to construct proofs in the system developed in Chapters 3, 5 and 6. A proof in Plato is a sequence of formulae in our formal language, either the language of sentential logic (SL) or the language of quantificational logic (Q). A proof begins with one or more assumptions and a Show line indicating what is to be proved. The rest of the proof is constructed using deductive rules that apply to the existing proof and that add new lines at the end of the proof. Using Plato, one can construct proofs of several different types; each type of proof can also be used to construct subproofs within a larger proof. For SL, Plato can be used to construct three types of proof: direct proof, indirect proof, and conditional proof. For QL, Plato can construct proofs of these three types, and it can also construct universal proofs.

In constructing a proof using Plato, one uses deductive rules that are stored in rule files. Three rule files are provided with Plato: Basic Rules, Derived Rules, and Quantificational Rules. These rules are used to add new formulas to a proof within the framework provided by a proof type. The Basic Rules file includes the following deductive rules: Conjunction Exploitation, Conjunction Introduction, Disjunction Exploitation, Disjunction Introduction, Double Negation, Conditional Exploitation (sometimes called Modus Ponens), Biconditional Introduction and Biconditional Exploitation. The Quantification Rules file includes Existential Introduction, Existential Exploitation, and Universal Exploitation. In addition, the Derived Rules file includes a number of useful rules that can be derived using the rules in the other files. They include rules that correspond to DeMorgan's laws, Modus Tollens, and other rules that are useful in constructing a proof. Plato allows the user to use any rules from any open rule files in constructing a proof. Plato also allows the user to open any number of rule files.


A.2 Getting Started

In this section, we will describe how to get Plato running on the Apple Macintosh computer (Mac Plus and higher, using System 6.0.5 or higher). Those running Plato in Windows should use the installer application included on the accompanying floppy disks. Macintosh users simply have to copy the Plato application and the Rule files onto their hard disk, while dragging the Logic fonts folder into their computer's System folder. To use Plato, you should be familiar with the basic operation of your operating system, whether Windows or MacOS, including clicking and dragging with the mouse, and using menus. For use on the Mac, the Plato application requires the following files and folders:

1. The Plato application itself (the first icon illustrated below).

2. A suitcase containing three fonts: Terlingua, Salt Flat and Pittsburgh.

3. Some rule set files (the third icon below).

The fonts in the suitcase should be installed on the Macintosh that you are using. If you are using System 6, you can use the Font/DA Mover to add the fonts to the System file, or you can use some other font utility to install these fonts. If you are using System 7 or later, you may simply drag the fonts onto the System Folder or the Fonts folder within the System Folder.

A.3 Starting Plato on the Macintosh

Find the icon for Plato: a picture of a hand holding a pencil to a deductive proof (illustrated above) . Double-click on the icon, or click on the icon and select Open from the file menu. After a moment, the window for Plato will appear on your screen. When it starts, Plato will open an empty document. You can also double-click on a Plato document (a document having the middle icon in Figure 1 . ) to start Plato with that document, or you can drag a Plato document or rule file to the Plato application to open that document or rule file.

A.3.1 The Menus: An Overview

At the top of the screen, you will see the usual Apple, File, and Edit menus as well as five new menus: Proofs, Annotations, Font, Size, and Windows. The menu bar is illustrated below. You can pull down each menu to see what it contains. The various commands in each menu are described in the following sections. In this section, those menus and commands that are not directly related to the construction of a proof are described.


• File Edit Proofs Annotations Font Size Windows

A.3.2 The File Menu

Most of the commands in the File menu are standard Macintosh File menu commands.

• New - Creates a new document. (CMD-N)

• Open... - Opens an existing document. (CMD-O)

• Open Rule Set... - This command allows you to open a file containing a set of derivation rules. It presents a standard file dialog from which you can choose the file containing the rule set to be opened. The commands associated with the derivation rules in a rule set are added to the Proofs menu, as a submenu (a menu with a triangle to the right of the menu item). The submenu will be named after the rule set, and the menu items in that submenu will be named after the rules that they are associated with.

• Close - Closes the topmost document. If you have changed the contents of the document, you will be asked if you want to save the document before closing it.

• Close Rule Set... - Presents a list of open rule sets and allows you to choose one to be closed. Once you close a rule set, the submenu associated with that rule set is removed from the Proofs menu.

• Save - Saves the topmost document to a file. You may not overwrite any files that were not created by the Plato program. (CMD-S)

• Save As... - Saves the topmost document to a file, prompting for a new file name.

• Save As MacDraw... - Saves the topmost document as a PICT document, suitable for use with most drawing programs (e.g., MacDraw, SuperPaint, etc.).

• Revert to Saved... - Reverts the topmost document to the most recently saved version, discarding all changes since the most recent Save or Save As operation. You will be asked if you really want to revert the document.

• Quit - Quits the application. You will be asked if you wish to save any changes to open documents. (CMD-Q)

A.3.3 The Edit Menu

Most of the Edit menu commands are standard Macintosh Edit menu commands. However, the Undo and Redo commands implement infinite undo and redo.

• Undo - Undoes the most recent change made to the topmost document. Once a change is undone, subsequent use of Undo undoes the changes made previous to the most recent change, in reverse chronological order. Repeated use of Undo steps the document back to progressively earlier states. (CMD-Z)

• Redo - Redoes the most recently undone change to the topmost document. Once a change is redone, subsequent use of Redo redoes the previous most recent undone changes, in reverse chronological order. Redo can only be used after a command has been undone; if a new command is executed, the Redo command is not available. (CMD-R)

• Copy - Copies the selected formula, or, if no formula is selected, copies the entire table as a picture that can be pasted into another application. The command-key equivalent (CMD-C) works in all dialogs where text is entered.


• Cut - Not enabled except for use with a desk accessory. However, the command-key equivalent (CMD-X) works in all dialogs where text is entered.

• Paste - Not enabled except for use with a desk accessory. However, the command-key equivalent (CMD-V) works in all dialogs where text is entered.

• Show Clipboard - Shows the contents of the clipboard (if they are either text or a picture) in a window.

A.3.4 The Annotations Menu

Plato documents may be annotated with a title and a comment. The title appears at the top of the document, above the proof. The comment appears at the bottom of the document, below the proof. Four menu items in the Annotations menu allow you to change the title and comment of a document.

• Edit Title... - This command allows you to enter a new title for the document or edit an existing title. Within the dialog box, the standard Macintosh editing command keys work, and one may insert a carriage return into the text of a title by typing option-return. Typing a plain return closes the dialog box, saving the title (this is the same as clicking the "OK" button with the mouse). Typing command-period (CMD-.) closes the dialog box without saving the title (this is the same as clicking the "Cancel" button).

[Dialog: "Enter or edit the comment on the table below:", with OK and Cancel buttons]

• Edit Comment... - This command works just as the Edit Title command does, except that it allows you to edit the comment of a document rather than the title.

• Remove Title - This command removes the title from a document.

• Remove Comment - This command removes the comment from a document.


A.3.5 The Font and Size Menus

The Font menu changes the appearance of your Plato documents; it controls the font in which a document is displayed. Only those fonts that have the proper logical characters are included in the Plato Font menu. [Windows users do not have this option - all Plato files produced on the Windows platform employ the Windows version of the Terlingua font, Terlingu.ttf.]

The Size menu changes the appearance of your Plato documents; it controls the size of the font in which a document is displayed. Choosing a number in the menu displays the topmost document in the font and the size chosen. The other menu items allow you to change the font size in other ways. The current size is indicated by a check mark next to a number in the menu, or by a check mark next to the Other menu item and a number in parentheses after that menu item if the font size does not appear in the top part of the menu.

• Larger - This menu command changes the font size of the font used to display a document to a point size one greater than the current font size. (CMD-])

• Smaller - This menu command changes the font size of the font used to display a document to a point size one smaller than the current font size. (CMD-[)

• Other... - This menu command presents a dialog box (illustrated below) in which you can type a new font size from 1 to 1024. The document will be displayed in the font size that you type if you click OK or type a return. The font size will remain unchanged if you click Cancel, hit the escape key, or type CMD-. (CMD-period).

[Dialog: "Please enter the font size:", with OK and Cancel buttons]

A.4 Homework Mode in the Windows Version

Since the Windows version of Plato was developed most recently, it has several features that are unavailable in the Macintosh version. The most important of these added capabilities is the Homework Mode. When Homework Mode has been enabled, Plato will request the user's name and student ID number. These will be encrypted and stamped into every file produced while the user is in Homework Mode. When you are in Homework Mode, you may open existing Plato files, but only when those files include only assumptions and an initial Show line. If a file includes a proof already in progress, the file cannot be opened while you are in Homework Mode. This prevents you from simply copying someone else's homework and stamping your own identification into their files. To produce a Plato file for homework credit, you must start a proof from the very beginning while in Homework Mode. There is also a Grading Mode, which your instructor may use in checking and grading homework produced by Plato. When submitting homework to your instructor electronically, you should begin the name of every Plato file with your last initial and the last four digits of your


student ID. Otherwise, if the instructor receives several copies of a file named "hw1.pla", the latest file received will simply overwrite all of the earlier submissions. Unfortunately, Homework Mode is available only on Plato for Windows. If you are using a Macintosh, you may use Plato for purposes of self-instruction, but you will not be able to produce submittable homework files. The following table provides Windows users with keystroke combinations that may be used in entering logical symbols in the Terlingua font:

Terlingua Windows Keystroke Table

Symbol   Keystroke Combination   Name of Symbol
∨        Alt + 0253              Disjunction
→        Alt + 0225              Conditional
¬        Alt + 0151              Negation
↔        Alt + 0226              Biconditional
∀        Alt + 0140              Universal quantifier
∃        Alt + 0180              Existential quantifier

A.5 The Windows Menu

• Cascade Windows - This command arranges the open windows within Plato so that you can easily see the titles of the first few windows. It also reshapes the windows to fit the screen nicely.

• The remaining items in the Windows menu allow you to bring a window to the top. This menu lists the titles of all open windows in the Plato application. The title of the topmost window is listed in gray text, and the titles of the other windows are listed in regular text in the order that they are stacked on the desktop. To bring a window to the top, you can select its title from the Windows menu. You can also click on a window to bring it to the top, but if the window is not visible, then the Windows menu is sometimes more convenient.

A.6 Languages of Logic

Plato allows you to construct proofs using two formal languages, the language of sentential logic and the language of quantificational logic. The language of sentential logic is a sublanguage of the language of quantificational logic. We describe both briefly below. For more information and examples, see Chapters 2 and 4.

A.6.1 The Language of Sentential Logic

To add assumptions and Show lines to a proof in Plato, you must enter grammatical formulas using the Add Assumption or Add Show Line commands (see Section 4.3 below). A formula consists of a string of logical symbols that obeys the grammar of a logical language. Different logical languages can have different grammars. A formula that is grammatical, or that satisfies the conditions of the grammar for a particular logical language, is sometimes called a well-formed formula, or a WFF. For that part of logic known as sentential logic (SL), the permissible formulas within Plato are governed by the rules laid out in Chapter 2. Remember that parentheses are added whenever one of the four "binary" connectives, &, ∨, →, and ↔, are used to combine two sentences into a compound sentence. This is the only time such parentheses can be added. Since the meaning of a formula can change if the


parentheses are wrong, and since Plato cannot guess what you really meant to say, you must be careful to follow the rules literally and precisely. When using Plato, you may not arbitrarily add or omit parentheses or other symbols. Plato will not recognize what we have called conventional abbreviations of formulas. Thus, the following strings of characters are not well-formed formulae:

(¬A)
¬(A)
((A) ∨ (B))
(A)
A&B
(A&B) → A

In contrast, the following are well-formed sentences of sentential logic:

((A&B) ∨ (B → A))
¬¬¬A
¬¬(¬A ↔ ¬(B ∨ ¬A))
(A ∨ (A&(A ↔ A)))

There should always be exactly as many left parentheses in a sentence as there are right parentheses, and there should be exactly one pair of parentheses for each binary connective. Instead of numerical subscripts, as were used in Chapter 2, Plato allows us to add any number of prime symbols (') to a sentence letter. Consequently, Plato will not recognize P1 as a sentence letter, but it will recognize P', P'', P''', and so on.

A.6.2

The Language of Quantificational Logic

Predicate logic with identity adds three new logical symbols to our language: ∀, ∃, and =. The symbols ∀ and ∃ are quantifiers; they are the universal and existential quantifiers respectively. In addition, Q uses lower-case letters; the letters a, ..., s are called constants, and the letters t, u, v, w, x, y, and z are called variables. Unlike the language defined in Chapter 4, Plato does not allow us to add numerical subscripts to constants or variables. Instead, any lower-case letter may be followed by any number of prime signs ('), and the result will also be a constant or variable.

A.7

Systems of Proof

As we mentioned in the introduction, Plato allows the user to use several proof formats, more than a dozen basic rules, and many more derived rules in constructing proofs. In this section we briefly describe the proof system used in Plato. For more information you should consult Chapters 3 and 5.

A.7.1

Proof Formats

In our system, one states what one knows before the proof begins, in the form of assumptions. One also states what one wishes to prove before one begins, by means of a Show line. Then one applies various rules of deduction to any available assumptions to derive intermediate results (if any) and eventually the desired result. Each new result is entered on a new line together with a line number and a notation explaining how the line was derived. When the desired result is obtained, the Show line that contains it is canceled, and the lines below that line are bracketed, indicating that they are no longer accessible. Assumptions are not required in every proof, though a Show line is always required. Assumptions are often not used in conditional, indirect and universal proofs. A sample proof is illustrated below, and the parts of a proof are discussed in more detail in the following sections.


Logic, Sets and Functions

1. (A → B)          A
2. (B → C)          A
3. (C → D)          A
4. Show (A → D)
5.   A              ACP
6.   B              →E, 1, 5
7.   C              →E, 2, 6
8.   D              →E, 3, 7

Direct Proof  A direct proof is the basic form of proof. It includes the elements discussed above. A finished direct proof has the following form.

n. Show φ
     ⋮
m.   φ

Notice that the Show line at the top of the proof has been canceled once the formula we set out to prove has been derived. Note also that the lines following the Show line have been bracketed. To cancel a Show line, the formula on that line (φ on lines n and m in the example above) must not be enclosed in another bracket, and there must be no uncanceled Show lines between the Show line to be canceled and the line on which it has been derived.
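The sample proof earlier in this section derives (A → D) from three conditional premises. As an illustrative cross-check (not part of Plato), one can confirm semantically, by brute-force truth tables in the style of Chapter 2, that the corresponding argument form is valid:

```python
# Illustrative only: confirm by exhaustive truth tables that
# (A -> B), (B -> C), (C -> D) semantically entail (A -> D).
from itertools import product

def implies(p, q):
    return (not p) or q

def valid():
    # Check every assignment of truth values to A, B, C, D.
    for A, B, C, D in product([True, False], repeat=4):
        premises = implies(A, B) and implies(B, C) and implies(C, D)
        if premises and not implies(A, D):
            return False  # counterexample: premises true, conclusion false
    return True

print(valid())  # prints True: the argument form is valid
```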

Indirect Proof  An indirect proof begins with a Show line containing a formula φ, which is immediately followed by an assumption line containing ¬φ, the opposite, or negation, of what one wishes to show. This assumption is labeled AIP, which stands for Assumption for Indirect Proof. An indirect proof then includes the derivation of a contradiction from that assumption. A contradiction is a pair of sentences in which one is the denial of the other. Thus, it is impossible for both sentences in a contradiction to be true at the same time. Since the denial of what one wishes to show (the assumption) allows us to derive a contradiction, it cannot be true (i.e. it leads to contradiction, and contradictions cannot be true) and thus it must be false. If the assumption is false, then what one wishes to show must be true. Proofs of this style are sometimes also known as Reductio ad Absurdum proofs, since a contradiction is thought to be impossible or absurd. A finished indirect proof has the following form.

n. Show φ
m.   ¬φ          AIP
     ⋮
     ψ
     ¬ψ

Conditional Proof  A conditional proof begins with a Show line on which a conditional of the form (φ → ψ) occurs; it also has the antecedent φ as an assumption on the next line, and the consequent ψ is derived from that assumption. The assumption is labeled ACP, for Assumption for Conditional Proof. Below is a picture of what a completed conditional proof looks like.

n. Show (φ → ψ)
m.   φ           ACP
     ⋮
     ψ

The proof formats direct proof, indirect proof and conditional proof comprise all the formats used for proofs within sentential logic. These proof formats may be used together; one may, for instance, make use of the conditional or indirect proof formats within a direct proof, and similarly a conditional proof may exploit the indirect or conditional proof formats as subproofs.

Universal Proof  The universal proof format resembles the conditional and indirect proof formats. It begins with a Show line that has the form of a universally quantified statement, ∀vφ. Immediately following the Show line, and enclosed within the proof brackets, is another Show line of the form φ[a/v], which represents a formula just like φ, except that every occurrence of the variable v in φ is replaced by an occurrence of the constant a. An important restriction on the constant a is that it cannot occur in any previous line of the proof in which the universal proof format is being employed. Here is what a universal proof format looks like.

n. Show ∀vφ
m.   Show φ[a/v]
     ⋮

A.7.2

Derivation Rules

Derivation rules for sentential logic and quantificational logic provide a means within the deduction system for transforming assumptions into desired conclusions within the various proof formats just described. For sentential logic, the basic rules allow us to introduce assumptions at the beginning of a proof (the Assumption rule) and provide ways of building up more complex truth-functional formulas from their constituents that occur on earlier lines in the proof, and of breaking down complex formulas into their simpler constituents. For each of the connectives, &, ∨, ¬, →, and ↔, the basic rules together with the conditional and indirect proof formats provide "introduction" rules and "elimination" or "exploitation" rules. These rules all use previous lines in the proof that are accessible. A line n is accessible at line m just in case (i) it is not an uncanceled Show line and (ii) it does not occur within a subproof that does not contain line m.
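The accessibility condition just stated can be modeled directly. The sketch below is illustrative only (it is not Plato's actual implementation); it represents each proof line by the chain of nested subproofs that contain it.

```python
# Illustrative sketch: model each proof line by the tuple of nested
# subproof ids that contain it, e.g. () for the top level, (1,) for the
# first subproof, and (1, 2) for a subproof nested inside it.

def accessible(path_n, path_m, n_is_uncanceled_show=False):
    """Line n is accessible at line m just in case (i) n is not an
    uncanceled Show line and (ii) every subproof containing n also
    contains m, i.e. n's subproof path is a prefix of m's."""
    if n_is_uncanceled_show:
        return False
    return path_m[:len(path_n)] == path_n
```

For example, a top-level line is accessible inside any subproof, but a line inside a completed subproof is not accessible back at the top level or in a sibling subproof.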

A.7.3

The Proofs Menu

The Proofs Menu allows you to construct the basic structure of a logical derivation, or proof. This menu contains commands that add new information to a proof, that specify what the proof will show, and that help to construct proofs using various proof strategies. We will first look at the commands that are used in constructing proofs in sentential logic.

Basic Commands

• Add Assumption... - This command allows you to add an assumption to a proof. It implements the Assumption rule of the natural deduction system. The Add Assumption command presents a dialog box, illustrated below, which allows you to enter a well-formed formula. Within this dialog box, CMD-X, CMD-C, and CMD-V can be used to Cut, Copy, and Paste text from the clipboard. Clicking on the small buttons below the box in which the formula is edited inserts the logical character on that button into the formula at the insertion point. When you click "OK", the formula is added to the proof as an assumption with the annotation A. Typing an illegal character will cause the system to beep at you. New assumptions may not be added after anything has been derived within a proof (using the other commands).


[Dialog box: "Please enter a new formula. Use the keyboard to type sentence letters or variables." Below the text field are buttons for the logical symbols, a More Symbols pop-up menu, and Cancel and OK buttons.]

Note that there is also a pop-up menu in the dialog box; it is titled More Symbols. This menu allows you to type other symbols into a formula. These additional symbols are not needed for sentential or quantificational logic. They are required for set theory and other advanced topics.

• Add Show Line... - This command allows you to add a well-formed formula to a proof as an intermediate goal. The dialog presented is the same as that presented by the Add Assumption command. This command can be used before beginning a proof using any of the methods of proof described earlier.

• Cancel Show Line - This command allows you to cancel a Show line once the formula in that line has been derived. To use this command, select the Show line to be canceled by clicking on it or by using the arrow keys, then shift-click on the formula which justifies canceling the Show line (that is, the line where you have derived the formula on the Show line). Then select Cancel Show Line from the Rules menu. If the formula on the Show line has been derived without any additional assumptions, then the Show line will be canceled, and the word "Show" will be struck out. This canceled Show line is now accessible for further use in the proof. The lines of the proof below the Show line used to prove it, however, will be bracketed and will not be accessible for further use in the proof.

Proof Strategies Commands

The following commands are within the Proof Strategies submenu within the Proofs Menu.

• Start Conditional Proof - This command allows you to begin a subproof for a conditional formula. To use this command, select a Show line containing a conditional, and then select the command from the menu. A new assumption will be added to the proof. The assumption will contain the antecedent of the formula on the selected line, and it will be annotated ACP for Assumption for Conditional Proof. When the consequent of the formula on the selected line has been derived, the Finish Conditional Proof command can be used to cancel the Show line containing the conditional.

• Finish Conditional Proof - This command allows you to finish a subproof for a conditional formula. To use this command, select the Show line to be canceled, and then shift-click on the line which contains the consequent of the conditional. Then select the command from the menu. If the Show line is followed by an assumption for conditional proof (ACP) and the consequent of the Show line has been derived without any additional assumptions, then the Show line will be canceled, and the lines below it will be bracketed.

• Start Indirect Proof - This command allows you to begin an indirect proof. To use this command, select the Show line you want to use an indirect proof on, and choose Start Indirect Proof from the menu. A new assumption will be added to the proof; it will be annotated AIP for Assumption for Indirect Proof. If the formula on the selected Show line is a negation, then the new assumption will be the formula on the Show line with the negation removed. If the formula on the selected Show line is not a negation, the new assumption will be the formula on the Show line with a negation added. In either case, the subproof must be completed using the Finish Indirect Proof command.

• Finish Indirect Proof - This command allows you to finish a proof conducted in the indirect style. A proof in the indirect style is complete when a contradiction has been derived. This occurs when lines containing a formula and its negation have been derived. This is a contradiction since it amounts to the claim that something is both true and false, which is impossible. To use this command, select the Show line to be canceled, and shift-click on both of the contradictory lines before selecting the command from the menu. If the Show line is followed by an AIP line and the two secondary selections are indeed contradictories, then the Show line will be canceled and the lines below it will be bracketed.

• Start Universal Proof... - This command allows you to begin a proof of a universally quantified statement using the universal proof format described in section 3.2.4. To use this command, select the Show line that contains the universal statement, and choose the command from the menu. You will be presented with a dialog asking you to specify the variable and the constant to be used in the universal proof. You must specify one variable and one constant in this dialog. The variable must be the outermost quantified variable in the universal statement on the selected Show line, and the constant must not appear anywhere else in the proof. Once the proper items are specified and the OK button is clicked, a new line will be added to the proof. This line will be a new Show line with a formula constructed from the selected Show line by removing the universal quantifier and its associated variable, and then replacing all occurrences of the variable with the constant specified in the dialog. The dialog is illustrated below.

[Dialog box: "Please enter a variable and a constant to which the variable should be instantiated. Replace the individual ___ with the individual ___." Cancel and OK buttons.]

• Finish Universal Proof - This command allows you to finish a proof of a universally quantified statement. To use this command, select the Show line to be canceled (i.e. the one containing the universal statement), and select the command from the menu. If the selected Show line is followed by a canceled Show line originally constructed using the Start Universal Proof command, then the selected Show line will be canceled and all the subsequent lines in the proof will be bracketed off.

Other Proof Commands

• Replacement - This command turns the replacement feature of Plato on and off. The replacement feature is on if this menu item has a check mark, and it is off if there is no check mark by this menu item. The use of those derivation rules that are equivalences between two formulas is affected by the state of this feature. The descriptions below describe how the derivations work when the replacement feature is off. When the replacement feature is on, their behavior is changed in the following way. The user is presented with a dialog box in which one must use the mouse to select the formula to which the rule is to be applied. If nothing is selected, it is assumed that the rule is to be applied to the whole formula in the way that would have occurred had the replacement feature been turned off. Once a formula is selected, the behavior of the rules is the same, except that if the user selects a subformula of the formula on the selected line, then the result of the rule's application is substituted for the selected subformula in the formula on the selected line when the new line is constructed.

[Dialog box: "Please select a subformula to which the rule should be applied." Cancel and OK buttons.]

• Relations - Like the Replacement command, this command is a flag that you may turn on or off to change the way that Plato operates. When this menu command is checked, Plato allows you to use an extended syntax that is useful in set theory and formal mathematics. This syntax allows capital letters to represent individuals as well as predicates.

• Hide Subproof - When the Show line of a completed subproof is selected, the Hide Subproof command will be enabled. Selecting this command will temporarily hide the subproof, replacing it with a small icon. If the Show line of a hidden subproof is selected, the command will change to Show Subproof, and selecting it will remove the icon and display the subproof.

A.7.4

Imported Rules

The only logical operations included in the Plato application are those described in Section 4.3 on the Proofs menu. Those operations are concerned primarily with entering new information into a proof and controlling the structure of your proof. Any actual deduction rules are provided by files that must be loaded into Plato, called rule files. These files may include any number of derivation rules. When they are loaded into Plato, the derivation rules that they include are added to a submenu of the Proofs menu. If the rule set contained in a rule file is closed using the Close Rule Set menu item, then the submenu containing the rules from that file is removed from the Proofs menu. Three different rule files are provided with the Plato application on the attached floppy disks. The Basic Rules file includes those rules that are concerned with the Sentential Calculus. These rules are generally justified by arguments outside the formal deductive system. The Derived Rules file includes those rules that are derived within the formal deductive system. Some of these rules are equivalence rules, to which the state of the Replacement menu item applies. The Quantification Rules file includes those rules that are associated with the manipulation of quantifiers and individuals in the Predicate Calculus.

The Basic and Derived rules are discussed in detail in Chapter 3 above. The use of these rules in Plato is relatively straightforward. One uses the mouse to click on the first line to which the rule is to be applied. Then one uses shift-click to select the second or third lines (if any). Finally, one applies the appropriate Basic or Derived rule from its location in the corresponding sub-menu. If the rule does not apply to the formulas chosen, or if the wrong number of lines is selected, an error message will inform you of this fact. If everything is in order, Plato will produce a new line in your proof, automatically applying the selected rule to the selected lines.

A.7.5

The Quantification Rule Set

• Existential Exploitation - This command allows you to eliminate the existential quantifier on a formula. Upon selecting an existentially quantified formula and choosing this command, you will be prompted for a variable and a constant. The variable must be the first quantified variable in the selected formula, and the constant must be new to the proof. If these conditions are met, a new line is created containing the selected formula without the initial quantification and with all instances of the variable replaced by the constant. Lines created using this command are annotated ∃E.

[Dialog box: "Please enter a variable and a constant to which the variable should be instantiated. Replace the individual ___ with the individual ___." Cancel and OK buttons.]

• Existential Introduction - This command allows you to introduce an existential quantifier. To use it, select a formula that contains a constant and choose the command from the menu. You will be prompted for a constant and a variable. The constant must appear in the selected formula, and the variable must not appear in the selected formula. A new line will be created, containing the selected formula with all occurrences of the constant replaced by the variable, and with an existential quantifier and the variable inserted in the front. Lines created in this way are annotated ∃I.

• Universal Exploitation - This command allows you to eliminate a universal quantifier. To use this command, select a universally quantified formula, and choose the command from the menu. You will be prompted for a variable and a constant. The variable must be the first quantified variable in the selected formula. There is no restriction on the constant. Once the variable and the constant are entered, a new line will be added to the proof, and it will contain the selected formula with the universal quantification over the specified variable removed, and with all instances of the variable replaced by the specified constant. Lines created with this command are annotated ∀E.

• Identity Exploitation - To substitute one constant for another given that the two are equal, use this command. Select the statement to be changed, and shift-click on the identity statement. Then, choose this command from the menu. You will be prompted for the constant to be replaced and the constant to replace it with. A new line will be created containing the selected formula with all occurrences of the first constant replaced by occurrences of the second constant.

• Identity Introduction - To introduce a formula stating that a constant is equal to itself, choose this menu command. You will be presented with a dialog asking you for an individual. Once you have entered an individual (you may use no variables, only function symbols and constants), a new line will be created containing an identity expression with the individual you entered on both sides of the identity symbol.

[Dialog box: "Please enter an individual:" Cancel and OK buttons.]

• Variable Rewrite - This command allows you to replace a variable in a logical formula with another variable. To use this command, select a quantified formula, and choose the command from the menu. You will be prompted for two variables. When both variables are accepted, a new line will be created containing the selected formula with all occurrences of the first variable replaced by occurrences of the second variable. The second variable that you specify may not appear anywhere in the selected formula before the replacement is performed. This derivation produces the annotation VR.

A.8

Macintosh Keyboard Shortcuts

Plato includes a number of keyboard shortcuts for various menu commands. Those that are specific to Plato and are not shared with other Macintosh applications are listed in the tables below. These shortcuts consist of using the command key (which has either an apple or cloverleaf symbol, or both, on it) and one other key from the keyboard.

Rules Menu Command    Shortcut
Add Assumption...     Command-;
Add Show Line...      Command-'

Size Menu Command     Shortcut
Larger                Command-]
Smaller               Command-[

Within the various dialog boxes used by Plato, a number of keyboard shortcuts are used. These shortcuts are common across all the dialog boxes used in Plato.

Dialog Box Command    Shortcut
Click "OK"            Enter or Return
Click "Cancel"        Command-. or Escape

If a dialog box includes an area where text can be typed, the keyboard shortcuts for the Copy, Cut, and Paste commands can also be used (CMD-C, CMD-X, and CMD-V).

A.8. 1

Macintosh Keyboard Equivalents for Logical Symbols

In Plato, one can use the buttons provided in the formula entry dialog boxes to enter any logical symbols that are needed. One can also use the following keyboard sequences to type the logical symbols directly when using the font Terlingua. Except for the existential quantifier, the same key combinations work in Pittsburgh and Saltflat.

Symbol    Key Combination
¬         Option-shift-hyphen
∨         Option-v
→         Option-e, a
↔         Option-i, a
∀         Option-Shift-q
∃         Option-Shift-e (Option-Shift-r)
∈         Option-e, e

Appendix B

Answers to Selected Problems

Ch. 7.1 Extensionality

#2. No set is a proper subset of itself.
1. Show ¬∃x(Sx & x ⊂ x)
2.   ∃x(Sx & x ⊂ x)                    AIP
3.   Sa & a ⊂ a                        ∃E, 2
4.   a ⊆ a & a ≠ a                     ⊂Def, 3
5.   a = a                             =I
6.   a ≠ a                             &E, 4

#6.
1. Show ¬∃x∃y(Sx & Sy & x ⊂ y & y ⊂ x)
2.   ∃x∃y(Sx & Sy & x ⊂ y & y ⊂ x)     AIP
3.   Sa & Sb & a ⊂ b & b ⊂ a           ∃E, 2
4.   a ⊆ b & a ≠ b                     ⊂Def, 3
5.   b ⊆ a & b ≠ a                     ⊂Def, 3
6.   a = b                             Th 7.2, 4, 5
7.   a ≠ b                             &E, 4

Ch 7.3 Pair sets, unit sets and enumeration

#2.
1. ∀x(Ux ↔ ∀y(Sy → x ∉ y))             A
2. Show ¬∃xUx
3.   ∃xUx                              AIP
4.   Ua                                ∃E, 3
5.   ∀y(Sy → a ∉ y)                    ∀E, 1, ↔E, 4
6.   S{a}                              Th 7.6
7.   a ∉ {a}                           ∀E, 5, →E, 6
8.   a ∈ {a}                           Th 7.7

#10.
1. Show ∀x∀y x ∈ {x, y}
2.   Show a ∈ {a, b}
3.     a = a                           =I
4.     a = a ∨ a = b                   ∨I, 3
5.     a ∈ {a, b}                      Pair, 4

Ch. 7.4 The null set

#2.


1. Show ¬∃x(∅ ⊇ x & x ≠ ∅)
2.   ∃x(∅ ⊇ x & x ≠ ∅)                 AIP
3.   ∅ ⊇ a & a ≠ ∅                     ∃E, 2
4.   a ⊆ ∅                             ⊇Def, 3
5.   Sa & ∀y(y ∈ a → y ∈ ∅)            ⊆Def, 4
6.   Sa & a ≠ ∅                        &E, 5, &E, 3, &I
7.   ∃y y ∈ a                          Cor 7.1, 6
8.   b ∈ a                             ∃E, 7
9.   b ∈ a → b ∈ ∅                     ∀E, 5
10.  b ∈ ∅                             →E, 8, 9
11.  b ∉ ∅                             ∅

#4.
1. Show ¬∃x(Sx & ∅ = {x})
2.   ∃x(Sx & ∅ = {x})                  AIP
3.   Sa & ∅ = {a}                      ∃E, 2
4.   a ∈ {a}                           Th 7.7
5.   a ∈ ∅                             =E, 3, 4
6.   a ∉ ∅                             ∅

Ch 7.5 Binary unions, intersections and complements

#12.
1. Show a − b ⊆ a
2.   Show ∀x(x ∈ a − b → x ∈ a)
3.     Show c ∈ a − b → c ∈ a
4.       c ∈ a − b                     ACP
5.       c ∈ a & c ∉ b                 −Def, 4
6.       c ∈ a                         &E, 5
7.   a − b ⊆ a                         ⊆Def, 2, Th 7.17

#14. Show a ∩ b = b ∩ a. Both are sets by Th 7.15, so we need ∀x(x ∈ a ∩ b ↔ x ∈ b ∩ a). (→) Assume c ∈ a ∩ b. Then c ∈ a & c ∈ b, by ∩. By commutativity of &, c ∈ b & c ∈ a, so c ∈ b ∩ a, by ∩. (←) Similar. So, a ∩ b = b ∩ a, by Extensionality.

#16. Assume a ⊆ c and b ⊆ c. To show that a − b = a ∩ (c − b), since both are sets, we can use Extensionality. (→) Assume d ∈ a − b. So, d ∈ a & d ∉ b.

(→) Assume c ∈ D(R). Then ⟨c, d⟩ ∈ R. Then, c ∈ A. (←) Assume c ∈ A. Let d ∈ B. Then, ⟨c, d⟩ ∈ A × B, and so ⟨c, d⟩ ∈ R. Therefore, c ∈ D(R).
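The finite-set analogues of these identities can be spot-checked in a few lines. The sketch below is illustrative only; the particular sets a, b, c are arbitrary choices for this note, and checking finitely many cases is of course no substitute for the derivations above.

```python
# Illustrative sketch: test the set identities above on small finite sets.
a = {1, 2, 3}
b = {2, 3, 4}
c = {1, 2, 3, 4, 5}          # a and b are both subsets of c

assert a & b == b & a        # a ∩ b = b ∩ a   (as in #14)
assert (a - b) <= a          # a − b ⊆ a       (as in #12)
assert a - b == a & (c - b)  # a − b = a ∩ (c − b), given a ⊆ c, b ⊆ c (as in #16)
```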

Ch 8.4 Properties of Relations

[Whenever a relation is labelled as "asymmetric", it is also anti-symmetric.]

#2. Reflexive, Anti-symmetric, Transitive.
#4. Symmetric.
#6. Irreflexive, Asymmetric, Transitive, Connected.
#8. Reflexive, Anti-symmetric, Transitive, Connected & Strongly Connected.
#10. Irreflexive, Symmetric, Connected.
#12. Reflexive, Anti-symmetric, Transitive.
#14. Reflexive, Anti-symmetric, Transitive.
#16. On sets, including ∅: Anti-symmetric. On the class of non-empty sets: Irreflexive, Asymmetric, Intransitive.
#18. Interpret the relation to be: {⟨x, y⟩ : y = a − x}, where a is a non-empty set. Then the relation is: Irreflexive, Intransitive.
#20. Anti-symmetric.
#22. Anti-symmetric.
#24. Anti-symmetric.
#26. Anti-symmetric.
#28. Reflexive, Symmetric, Transitive.
#30. Symmetric.
#32. Irreflexive, Asymmetric.
#34. Irreflexive, Asymmetric.
#36. Irreflexive, Symmetric.
#38. Irreflexive, Asymmetric, Transitive.
#40. Any relation is both reflexive and irreflexive on ∅. Otherwise, this is impossible.
#42. Identity.
#44. < on reals.
#46. is no more than 1 year older than.
#48. is either 1 year older than or 1 year younger than.
#50. is exactly 1 year older than.

#68. Irreflexive, Symmetric, Asymmetric, Anti-symmetric, Transitive, Intransitive.
#70. Irreflexive, Asymmetric, Anti-symmetric, Transitive, Intransitive, Connected.
#72. Same as 70.
#74. Reflexive, Symmetric, Anti-symmetric, Transitive.
#76. Anti-symmetric, Transitive, Connected.
#78. Same as 76.
#80. Symmetric, Connected.
#82. Same as 80.
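For finite relations, classifications like those above can be tested mechanically. The following sketch is illustrative only (it is not part of the text's formal apparatus); it implements the definitions of Section 8.4 for a relation R on a finite set A, with ≤ on {1, 2, 3} as an arbitrary example.

```python
from itertools import product

# Each property is defined for a finite set A and a relation R ⊆ A × A,
# following the definitions of Section 8.4.

def reflexive(R, A):
    return all((x, x) in R for x in A)

def irreflexive(R, A):
    return all((x, x) not in R for x in A)

def symmetric(R, A):
    return all((y, x) in R for (x, y) in R)

def antisymmetric(R, A):
    return all(x == y for (x, y) in R if (y, x) in R)

def transitive(R, A):
    return all((x, z) in R
               for (x, y) in R for (w, z) in R if y == w)

def connected(R, A):
    return all((x, y) in R or (y, x) in R
               for x, y in product(A, repeat=2) if x != y)

# Example: <= on {1, 2, 3} is reflexive, anti-symmetric, transitive,
# and connected, matching the pattern of several answers above.
A = {1, 2, 3}
LE = {(x, y) for x, y in product(A, repeat=2) if x <= y}
assert reflexive(LE, A) and antisymmetric(LE, A)
assert transitive(LE, A) and connected(LE, A)
```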

Ch. 8.5 Ordering Relations

#2. Linear ordering.
#4. Strict linear ordering.
#6. Strict linear ordering.
#8. Linear ordering.
#10. Strict linear ordering.
#12. Linear ordering.
#14. Linear ordering.
#16. Partial ordering.
#18. Partial ordering.
#20. Partial ordering.
#22. Partial ordering.
#24. Partial ordering.
#26. None.
#30. Strict partial ordering.

#32. Assume R strictly partially orders A. Show that S is irreflexive and transitive on A. (a) Irreflexivity. Assume that b ∈ A. Since R is irreflexive, ¬Rbb. So, ¬Sbb. (b) Transitivity. Assume Sbc and Scd, with b, c, d ∈ A. We have Rdc and Rcb. Since R is transitive, we also have Rdb. Thus, Sbd.

#34. Show that Q is irreflexive, asymmetric, transitive and connected on the Cartesian product. (a) Irreflexivity. Assume for contradiction that Q⟨a, b⟩⟨a, b⟩. By definition of Q, either Raa or (a = a & Sbb). Either disjunct leads to a contradiction, since both R and S are irreflexive. (b) Asymmetry. Assume Q⟨a, b⟩⟨c, d⟩. Either Rac or (a = c and Sbd). Case 1. Rac. Since R is asymmetric, we have ¬Rca. Since R is irreflexive, we have a ≠ c. So, we have neither Rca nor (c = a and Sdb). By definition of Q, ¬Q⟨c, d⟩⟨a, b⟩. Case 2. a = c and Sbd. Since R is irreflexive, we have ¬Rca, and since S is asymmetric, we have ¬Sdb. By definition of Q, ¬Q⟨c, d⟩⟨a, b⟩. (c) Transitivity. Assume Q⟨a, b⟩⟨c, d⟩ and Q⟨c, d⟩⟨e, f⟩. By definition of Q, we have either Rac or (a = c & Sbd). Case 1. Rac. By definition of Q, we have either Rce or (c = e and Sdf). Case 1a. Rce. Since R is transitive, we have Rae. So, Q⟨a, b⟩⟨e, f⟩. Case 1b. (c = e & Sdf). Since c = e, we have Rae. So, Q⟨a, b⟩⟨e, f⟩. Case 2. (a = c & Sbd). By definition of Q, we have either Rce or (c = e & Sdf). Case 2a. Rce. Since a = c, we have Rae. So, Q⟨a, b⟩⟨e, f⟩. Case 2b. (c = e & Sdf). So, a = e. Since S is transitive, Sbf. So, Q⟨a, b⟩⟨e, f⟩. (d) Connectedness. Assume ⟨a, b⟩ ≠ ⟨c, d⟩. Show that either Q⟨a, b⟩⟨c, d⟩ or Q⟨c, d⟩⟨a, b⟩. Either a ≠ c or a = c. Case 1. a ≠ c. Since R is connected, either Rac or Rca. Consequently, either Q⟨a, b⟩⟨c, d⟩ or Q⟨c, d⟩⟨a, b⟩. Case 2. a = c. Therefore, b ≠ d. Since S is connected, either Sbd or Sdb. Consequently, either Q⟨a, b⟩⟨c, d⟩ or Q⟨c, d⟩⟨a, b⟩.

Ch 8.6 Relations between relations


#26. Assume R is reflexive on A. Let b ∈ A. So, Rbb. By def. of converse, R⁻¹bb. So, R⁻¹ is reflexive.
#28. Assume R is symmetric on A. Show R⁻¹ is. Assume R⁻¹ab. By def. of converse, Rba. Since R is symmetric on A, Rab. So, R⁻¹ba.
#30. Assume R is antisymmetric on A. Show R⁻¹ is. Assume R⁻¹ab and R⁻¹ba. So, Rba and Rab. Since R is antisymmetric, a = b.

#32. Assume R is connected on A. Show R⁻¹ is. Assume a ≠ b. Since R is connected, Rab or Rba. By definition of converse, R⁻¹ba or R⁻¹ab.
#34. Assume R is intransitive on A. Show R⁻¹ is. Assume R⁻¹ab and R⁻¹bc. So, Rcb and Rba. Since R is intransitive, we have ¬Rca. So, ¬R⁻¹ac.

#36. Assume R and S are strongly connected. Show R ∘ S is. Let a, b ∈ A. Since R is strongly connected, we have either Rab or Rba. Since S is strongly connected, we have Saa and Sbb. Case 1. Rab. We have Saa & Rab, so ⟨a, b⟩ ∈ R ∘ S. Case 2. Rba. We have Sbb & Rba, so ⟨b, a⟩ ∈ R ∘ S.

#38. Show (R ∪ S)⁻¹ = R⁻¹ ∪ S⁻¹. By Extensionality. (→) Assume ⟨a, b⟩ ∈ (R ∪ S)⁻¹. (R ∪ S)ba. Either Rba or Sba. Case 1. Rba. R⁻¹ab. So, ⟨a, b⟩ ∈ R⁻¹ ∪ S⁻¹. Case 2. Symmetrical. (←) Assume ⟨a, b⟩ ∈ R⁻¹ ∪ S⁻¹. Either R⁻¹ab or S⁻¹ab. Case 1. R⁻¹ab. Rba. So, ⟨b, a⟩ ∈ R ∪ S. So, ⟨a, b⟩ ∈ (R ∪ S)⁻¹. Case 2. Symmetrical.

#40. Show (R − S)⁻¹ = R⁻¹ − S⁻¹. By Extensionality. (→) Assume ⟨a, b⟩ ∈ (R − S)⁻¹. ⟨b, a⟩ ∈ R − S. So, Rba and ¬Sba. R⁻¹ab and ¬S⁻¹ab. Thus, ⟨a, b⟩ ∈ R⁻¹ − S⁻¹. (←) Assume ⟨a, b⟩ ∈ R⁻¹ − S⁻¹. So, R⁻¹ab and ¬S⁻¹ab. Rba and ¬Sba. Thus, ⟨b, a⟩ ∈ R − S. So, ⟨a, b⟩ ∈ (R − S)⁻¹.

#42. Show D(S ∘ R) ⊆ D(R). Assume a ∈ D(S ∘ R). So, ⟨a, b⟩ ∈ S ∘ R, and Rac and Scb, for some b and c. Since Rac, a ∈ D(R).

#44. Show T ∘ (S ∘ R) = (T ∘ S) ∘ R. By Extensionality. (→) Assume ⟨a, b⟩ ∈ T ∘ (S ∘ R). Then, ⟨a, c⟩ ∈ S ∘ R and Tcb. Moreover, Rad and Sdc. Since Sdc and Tcb, ⟨d, b⟩ ∈ T ∘ S. Since Rad, ⟨a, b⟩ ∈ (T ∘ S) ∘ R. (←) Assume ⟨a, b⟩ ∈ (T ∘ S) ∘ R. Then Rad and ⟨d, b⟩ ∈ T ∘ S. Moreover, Sdc and Tcb. Since Rad and Sdc, we have ⟨a, c⟩ ∈ S ∘ R. Since Tcb, ⟨a, b⟩ ∈ T ∘ (S ∘ R).

#46. Show (S ∪ T) ∘ R = (S ∘ R) ∪ (T ∘ R). By Extensionality. (→) Assume ⟨a, b⟩ ∈ (S ∪ T) ∘ R. So, Rac and ⟨c, b⟩ ∈ S ∪ T. Either Scb or Tcb. Case 1. Scb. Since Rac, ⟨a, b⟩ ∈ S ∘ R. So, ⟨a, b⟩ ∈ (S ∘ R) ∪ (T ∘ R). Case 2. Similar. (←) Assume ⟨a, b⟩ ∈ (S ∘ R) ∪ (T ∘ R). Either ⟨a, b⟩ ∈ S ∘ R or ⟨a, b⟩ ∈ T ∘ R. Case 1. ⟨a, b⟩ ∈ S ∘ R. Rac & Scb. ⟨c, b⟩ ∈ S ∪ T. So, ⟨a, b⟩ ∈ (S ∪ T) ∘ R. Case 2. Similar.

#50. Show (A × B) ∘ (A × B) ⊆ A × B. Assume ⟨c, d⟩ ∈ (A × B) ∘ (A × B). So, ⟨c, e⟩ ∈ A × B, and ⟨e, d⟩ ∈ A × B. Thus, c ∈ A and d ∈ B, so ⟨c, d⟩ ∈ A × B.

#52. Assume R is reflexive on A. Show D(R) = D(R⁻¹). By Extensionality. (→) Assume b ∈ D(R). Then b ∈ A (since R is a relation on A). Since R is reflexive on A, Rbb. So, R⁻¹bb, and b ∈ D(R⁻¹). (←) Similar.

#54. R is asymmetric on A iff R and R⁻¹ are disjoint. (→) Assume R is asymmetric on A. Assume for contradiction that R and R⁻¹ are not disjoint. So Rab and R⁻¹ab, for some a and b. Thus, Rab and Rba, contradicting the asymmetry of R. (←) Assume R and R⁻¹ are disjoint. Show R is asymmetric on A. Assume for contradiction Rab and Rba. Since Rba we have R⁻¹ab, so ⟨a, b⟩ ∈ R ∩ R⁻¹, and R and R⁻¹ are not disjoint.

#56. Show R is transitive on A iff R ∘ R ⊆ R. (→) Assume R is transitive. Assume ⟨a, b⟩ ∈ R ∘ R. So, Rac and Rcb. Since R is transitive, Rab.

Assume R is reflexive, symmetric, and transitive on A. Show R⁻¹ ∘ R = R by Extensionality. (→) Assume ⟨a, b⟩ ∈ R⁻¹ ∘ R. So, Rac and R⁻¹cb. Thus, Rbc. Since R is symmetric, Rcb. Since R is transitive, Rab.
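The converse and composition identities proved above can also be spot-checked on small finite relations. The sketch below is illustrative only; the particular relations are arbitrary, and the composition follows the book's convention that ⟨a, b⟩ ∈ S ∘ R just in case Rac and Scb for some c (R applied first).

```python
from itertools import product

def inv(R):
    """Converse: R⁻¹ = {⟨b, a⟩ : ⟨a, b⟩ ∈ R}."""
    return {(b, a) for (a, b) in R}

def comp(S, R):
    """Composition S ∘ R = {⟨a, b⟩ : Rac and Scb for some c}."""
    return {(a, b) for (a, c) in R for (c2, b) in S if c == c2}

rels = [set(), {(1, 2)}, {(1, 2), (2, 3)}, {(1, 1), (2, 1), (3, 2)}]

for R in rels:
    for S in rels:
        assert inv(R | S) == inv(R) | inv(S)      # as in #38
        assert inv(R - S) == inv(R) - inv(S)      # as in #40
        for T in rels:
            # As in #44: composition is associative.
            assert comp(T, comp(S, R)) == comp(comp(T, S), R)
```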