JANUA LINGUARUM
STUDIA MEMORIAE NICOLAI VAN WIJK DEDICATA
edenda curat
C. H. VAN SCHOONEVELD
Indiana University

Series Minor, 123
CODING INFORMATION IN NATURAL LANGUAGES

by

JOHN W. OLLER Jr.
UNIVERSITY OF CALIFORNIA AT LOS ANGELES
1971
MOUTON THE HAGUE • PARIS
© Copyright 1971 in The Netherlands. Mouton & Co. N.V., Publishers, The Hague No part of this book may be translated or reproduced in any form, by print, photoprint, microfilm, or any other means, without written permission from the publishers.
LIBRARY OF CONGRESS CATALOG CARD NUMBER: 74-182465
Printed in The Netherlands by Mouton & Co., Printers, The Hague.
PREFACE
The debate concerning the Chomskyan variety of transformational theory has centered around the dual question of whether or not language is intrinsically structured for communication, and whether or not it can be effectively analyzed apart from its communicative function. This monograph argues that language must be dealt with in its communicative function in order to account for its structure and its use. This view amounts to a basic departure from the thinking of American structural linguistics since the time of Bloomfield. It suggests an approach to linguistic analysis which is essentially different from the currently dominant variety of transformational generative theory. The present work constitutes an attempt to justify and outline an approach to the study of language as an instrument for coding cognitive information. An attempt is made to clarify certain confusions about the notions 'competence' and 'performance', and the theory of 'innate ideas' is challenged as naively motivated as a result of the failure of transformationalists to treat language as a medium of communication.

The development and presentation of a theory is seldom a task which can be credited to any one individual. The theory contained in this monograph is no exception. Among the many active collaborators and kindly critics who have contributed to the thinking embodied here, I am pleased to acknowledge W. B. Newcomb, B. D. Sales, R. V. Harrington, T. W. Schiebel, G. H. Mundell, D. K. Oller, and D. H. Obrecht. I especially owe a debt of gratitude to my advisor, Dean Obrecht, who gave
me many hours of his time, much patience, and numberless valuable suggestions. I also gratefully acknowledge the many ideas I have freely adopted from the conversations and writings of my father, John W. Oller. In addition to these whom I have named, I would like to thank those other friends, teachers, and scholars whose thoughts I have made use of both knowingly and unknowingly in these pages.
CONTENTS

PREFACE

I. THE PROBLEM
   0. Introduction
   1. Meta-theoretical Assumptions
      1.1 The Need for Abstract Theories
      1.2 The Need for Empirical Theories and Tests
      1.3 An Eclectic Position
   2. Theoretical Assumptions
      2.1 Language as a Medium of Communication
      2.2 Natural Languages as Learned Codes
      2.3 Language as Systematic

II. THE STRUCTURE OF LANGUAGE
   0. Introduction
   1. Preliminaries
      1.1 Syntax, Semantics, and Pragmatics
      1.2 On Beginning with the Communicative Function
   2. The Sentence
      2.1 The Subject
      2.2 The Predicate
      2.3 The Complement
   3. Linguistic Coding

III. LANGUAGE ACQUISITION
   0. Introduction
   1. Basis of the Theory
   2. Application of the Rules of Induction and Substitution
      2.1 Simple Concepts
      2.2 More Abstract Concepts
      2.3 The Refinement and Extension of Concepts
      2.4 The Learning of Larger Grammatical Units
      2.5 Productive versus Receptive Language
   3. The Order Hypothesis

IV. CONCLUSIONS

BIBLIOGRAPHY
I. THE PROBLEM
The most characteristic behavior of the human species is language. It is this type of behavior more than any other that distinguishes the human from other animals. Language is the chief means of communication between members of this species, and its remarkable precision and abstractness as compared with the modes of communication employed by other species has made possible the elaborate systems of interpersonal organization we call 'human culture'.
Charles Osgood ("Psycholinguistics", 244)
0. INTRODUCTION
The present work is concerned with the way in which linguistic units are organized in relation to the elements of cognitive experience. The viewpoint is eclectic, drawing mainly from the theories of linguists and psychologists, with some appeal to philosophy and symbolic logic. The position advocated is both empirical and theoretical. It is argued that a theory must have an empirical base if it is to be maximally effective, and that data must be theoretically explained and systematized in order to enhance our knowledge. Although the theory to be advanced is concerned primarily with English, it is assumed that its basic principles and assumptions are applicable to any natural language. The following assumptions are considered necessary preliminaries to any adequate theory of natural language:

(1) a natural language constitutes a system of human communication;
(2) natural languages are learned codes — they are functional only for those who have learned the system of relationships between the units of the language and the elements of cognitive experience about which the language functions to codify information;

(3) natural languages are organized on the basis of universal factors of (a) psychophysics, (b) learning, and (c) reasoning, all of which interact and relate in complex ways to effect the universal and non-universal rules of grammar.

In the discussion to follow, we will consider first the metatheoretical assumptions (embodied in the opening paragraph of section 0) concerning the need for an abstract theory of language and the need for the empirical vulnerability [1] of such a theory; then, we will proceed to consider the theoretical assumptions just listed. Informal proofs of the necessity for each of the assumptions which might be considered controversial will be adduced by showing the consequences of a failure to accept it. Arguments against a number of currently held counter-viewpoints will be included in these negative proofs.

[1] The word 'vulnerability' is used here in a non-derogatory way, in the sense in which it is used by Joos (English Verb). By it we simply understand the experimental accessibility of theories.

1. META-THEORETICAL ASSUMPTIONS

The questions concerning just how a well-formed theory of language should be constructed, and what set of data it should account for, were discussed as early as the time of the ancient Greeks. In fact, the principal arguments do not seem to have changed essentially since the debate between the Sophists and Aristophanes. Dinneen (Introduction, 73) states:

Considering the stress given to empirical procedures in modern science, it is interesting to note that there was a strong empirical tendency among the Sophists of the fifth century B.C. They attempted to subject everything to measurement — music, geometry, astronomy, and even language study .... The methods of the Sophists were satirized by Aristophanes in The Frogs, where he describes a contest that Dionysus witnesses upon his arrival in the Underworld. Aeschylus and Euripides, a disciple of the Sophists, have been disputing each other's literary merits. They agree to submit their quarrel to the judgement of Pluto, but Euripides asks Pluto's servants to bring scales, rules, and circles so that he can render an objective, and not just an emotional, evaluation.

Bloomfield (Language, 142) seems to have been arguing very much the same problems in his discussion of 'mechanistic' versus 'mentalistic' theories of language. He writes:

For the mentalist, language is
THE EXPRESSION OF IDEAS, FEELINGS, OR VOLITIONS. The mechanist does not accept this solution. He believes that MENTAL IMAGES, FEELINGS, and the like are merely popular terms for various bodily movements, ...
Bloomfield's 'mechanists' seem very similar in thinking to the Sophists, while his 'mentalists' seem closer to the viewpoint of Aristophanes. In current language study the controversy continues. The basic question, namely, what is the most desirable balance between theoretical explanation and empirical observation, remains. There are extremists on the side of operationalism and likewise on the side of mentalism. As an example of extreme operationalists, who like the Sophists tend to want to put a visible yardstick on tangible and intangible objects alike, we have Skinnerian behaviorists. At the opposite pole, the proponents of 'innate ideas' seem to argue that the really important issues of linguistic theory are not susceptible to empirical investigation. I believe that following both of these claims to some of their logical conclusions will demonstrate the need for a more moderate position.

1.1 The Need for Abstract Theories
Skinner (Science, 35) has given the following argument against the need for theoretical constructs:

The objection to inner states is not that they do not exist, but that they are not relevant to a functional analysis. We cannot account for the behavior of any system while staying wholly inside it [2]; eventually we must turn to forces operating upon the organism from without. Unless there is a weak spot in our causal chain so that the second link is not lawfully determined by the first, or the third by the second, then the first and third links must be lawfully related. If we must go back beyond the second link for prediction and control, we may avoid many tiresome and exhausting digressions by examining the third link as a function of the first.

[2] In saying that "we cannot account for the behavior of any system by staying wholly inside it" Skinner is setting up a straw man. No one known to the author has limited attention solely to events inside the organism. Even Chomsky, a proponent of the theory of innate ideas, admits that one cannot merely consider internal events. This is made clear in his review of Skinner's Verbal Behavior, where he states that "anyone who sets himself the problem of analyzing the causation of behavior will (in the absence of independent neurophysiological evidence) concern himself with the only data available, namely the record of inputs to the organism and the organism's present response, and we try to describe the function specifying the response in terms of the history of inputs" (548). It would seem equally objectionable to limit attention to what goes on outside the organism, which is what Skinner (Verbal Behavior) seems to advocate. Skinner has been criticized on this account by Morris (1958) and Tikhomirov (1959). His attempted refutation of the need for abstract theoretical constructs has also been discussed by Hempel ("Theoretician's Dilemma").

In order to refute Skinner's argument, all we need do is show that in a significant number of cases there is a "weak spot in our causal chain so that the second link is not lawfully determined by the first, or the third by the second". This is not as difficult a task as it might appear, since it is not necessary to show that there are NO lawful relations existing between ANY of the links, but only that the relation which holds between a given first link and a given third link is not one-to-one. The well known facts of ambiguity and synonymy in natural languages are sufficient to show that the relation between any given stimulus and a linguistic response is often, if not usually, indeterminate (i.e., not one-to-one). Suppose, for example, that on hearing part of a conversation we are able to distinguish only the sentence Jack is a small man. There are at least two ways that this sentence can be interpreted as a result of the ambiguity of the word small. It might be taken to mean that (i) someone named Jack is small of physical stature, or that (ii) someone named Jack is a mean (unkind) person. Although (i) and (ii) are both lawfully related to Jack is a small man in the sense that they are interpretations (or paraphrases in this case) of it, neither is determined by it. Without further information, we cannot know for certain which meaning was intended.

A still stronger argument for the need for abstract constructs in linguistic theory is the fact that it would be impossible to list even a small number of the 'lawful relationships' between stimuli and responses as suggested by Skinner without positing CLASSES of stimuli and CLASSES of responses. Once any such class is posited, we are immediately dealing with a theoretical construct of an abstract type for which there must be some corresponding inner state. If there were no such inner state, it would be impossible to account for the recognition of identities and similarities. If one does not have an internal representation of previous stimuli associated with an object, how can he recognize future stimuli associated with that object? In other words, without positing an inner state corresponding in some sense with the CLASS of stimuli which are inductively associated with the object in question, it is impossible to account for the simple matter of object recognition.

The need for positing inner states becomes still clearer when we focus attention on the novelty of stimuli. Each time a human being sees a familiar object, he actually experiences a novel stimulus. That is to say, the informational input to his sensory mechanism differs with differing circumstances, e.g., changes in position, time, background configurations, etc. It may even be argued that every stimulus associated with an object is necessarily — logically, that is — different from every other. In any case, we may say that practically (and for practical purposes) all stimuli are novel in the sense described.
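The argument of the last two paragraphs can be pictured with a small sketch of my own devising (nothing of the kind appears in the monograph): past stimuli associated with an object, no two of them identical, are generalized into a stored representation, and a wholly novel stimulus is then recognized by comparison with that stored state. The feature vectors and class names are invented for illustration.

```python
# A minimal sketch (mine, not from the monograph): object recognition via a
# stored "inner state" built by inductive generalization over past stimuli.
# The stimuli are invented two-dimensional feature vectors.

def centroid(stimuli):
    """Generalize a CLASS of past stimuli into one stored representation."""
    dims = len(stimuli[0])
    return [sum(s[i] for s in stimuli) / len(stimuli) for i in range(dims)]

def recognize(stimulus, classes):
    """Assign a novel stimulus to the class whose stored state it most resembles."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(classes, key=lambda name: dist(stimulus, classes[name]))

# No two past sightings are identical, yet each class has a stable inner state.
cup_sightings = [[1.0, 0.9], [1.1, 1.0], [0.9, 1.1]]
ball_sightings = [[5.0, 5.1], [4.9, 5.0], [5.1, 4.9]]
classes = {"cup": centroid(cup_sightings), "ball": centroid(ball_sightings)}

# A never-before-seen stimulus is still recognized, because it is compared
# with the stored class representation, not with any single past stimulus.
print(recognize([1.05, 0.95], classes))  # cup
```

The stored centroids stand in for the posited 'inner states'; remove them and there is nothing against which a novel stimulus could be matched.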
Thus, without positing inner states corresponding to CLASSES, acquired by the internal process of inductive generalization, it is impossible to account for even object recognition. With respect to linguistic phenomena, without positing inner states, it is for all practical purposes impossible to explain even the simplest of nominal functions. In order to name an object, obviously one must have an inner state corresponding to the CLASS of instances of the NAME, an inner state corresponding to the CLASS of instances of the OBJECT, and a still more abstract inner state relating these two classes by virtue of the fact that instances of the former can be used to name instances of the latter. It is clear that if it were not for abstract units in natural languages, it would be impossible to learn or use them; therefore, we should expect that theories of language would have to employ abstract theoretical constructs.

In short, inner states ARE relevant to a functional analysis. Aside from its other difficulties, a theory claiming to refute the need for theories is in a fairly obvious way self-contradictory. Of course, no known behaviorist, including Skinner himself, has ever attempted to deal with human behavior without positing internal processes of a highly abstract and complex nature. A quick excursion through Skinner (Verbal Behavior) will reveal this fact. For instance, one reads about 'private stimuli', 'internal stimuli', 'stimulus generalization', etc. In view of the above facts, there can be little doubt that theories and their accompanying constructs are essential to any serious attempt to deal with linguistic phenomena.

1.2 The Need for Empirical Theories and Tests
It has recently been argued by Chomsky (Aspects, 17) that an empirical approach to the study of language is unavailable. He states that:

it is unfortunately the case that no adequate formalizable techniques are known for obtaining reliable information concerning the facts of linguistic structure ...

His statement can apparently be taken in one of two ways. One interpretation would be that there is no algorithm which can be programmed into a machine which will then obtain accurate information concerning the facts of language. Of course, computer technologists are the first to admit that machines cannot at present completely simulate many of the extremely complex activities of which the human being is capable. One of these is linguistic analysis. On the other hand, there are some aspects of linguistic analysis which can be accomplished mechanically, and there is a considerable body of literature on this subject. [3]

[3] For references on this subject, see The Finite String, Newsletter of the Association for Machine Translation and Computational Linguistics. See also Newcomb ("Directed Graph Model").

Another interpretation, however, is possible. Chomsky's statement might be taken to mean that even human beings are incapable of stating techniques of analysis which can be communicated to others, thereby enabling them to employ the same, or similar, techniques of analysis. By this interpretation, his statement is patently false. A strong case in point is the kind of linguistic analysis known as 'sentence diagramming' which is still taught in some elementary schools. In addition, any rigorously designed experimental study of an aspect of language which yields replicable results also constitutes a refutation of Chomsky's statement. For some examples, see Ervin-Tripp and Slobin ("Psycholinguistics").

At this point, it is difficult to say whether the proposition that empirical tests of linguistic theories are impossible is the result of the construction of certain empirically non-vulnerable theories, or whether the construction of such theories is the result of the proposition. In either case, there are at least two aspects of current transformational theory which are empirically non-vulnerable. The first is the notion of 'grammatical competence' and the second is the postulation of 'innate ideas', which we will discuss below under section 2.2 on language learning.

The transformationalists assume that an important part of the native speaker's linguistic competence — i.e., his knowledge of his language — is described by a transformational grammar. The difficulty here is that their theory of competence is not based on a study of the performance of real speakers, but on the postulated competence of a postulated ideal speaker. Chomsky (Aspects, 4) states that a fully adequate grammar must assign to each of an infinite range of sentences a structural description indicating how this sentence is understood by the ideal speaker-hearer. Since Chomsky (Aspects, 1) assumes that this 'ideal speaker-hearer' is the primary object of linguistic theory, he argues that it is only after understanding the ideal speaker-hearer's competence that the linguist should become concerned about the study of the performance of real speakers. [4] Thus the transformationalist's theory of competence assumes a position which is empirically non-vulnerable. It is argued that a theory of competence should not be based on what is known of performance; rather, what is assumed about competence should also be assumed about performance. This line of reasoning is apparently the source of the following conclusion by Chomsky (Aspects, 19):

It is important to bear in mind that when an operational procedure is proposed [to test a theory of linguistic intuition or competence], it must be tested for adequacy (exactly as a theory of linguistic intuition — a grammar — must be tested for adequacy) by measuring it against the standard provided by the tacit knowledge that it attempts to specify and describe.

[4] For a discussion of the terms 'competence' and 'performance' and their definition, see Oller, et al. ("Consistent Definitions"). Also, see the discussion on the upper limit of sequences, pp. 25-28.

It would appear that this statement could be paraphrased as follows: before one can attempt to find an answer to certain questions about language, he must make an intuitive judgement, and if an experimentally derived answer fails to agree with his judgement, he should not accept the answer since it must be wrong. [5] What is implied is that an empirical approach to the study of certain questions about language is inapplicable and unnecessary. Within the framework set up by transformationalists, there can be no empirical answers to certain questions (e.g., "What is paraphrase?", and "What is grammaticality?", according to Chomsky, Aspects, 19) since empirical investigation has been ruled out of bounds.

[5] If Chomsky's statement does not cause distress among the empirically minded, my paraphrase certainly will. One is led to ask whether Chomsky actually had in mind what I have inferred from his statement. Obviously, he cannot support an empirical approach to language study if he does maintain this position. Further reading in Chomsky (Aspects) clearly shows that in fact he does not view language study as an empirical problem (generally speaking, as there are some inconsistencies on his part on this point). Chomsky (Aspects, 19) states: "There is no reason to expect that reliable operational criteria for the deeper and more important theoretical notions of linguistics (such as 'paraphrase' and 'grammaticalness') will ever be forthcoming." And Chomsky (Aspects, 54) states: "Empiricist theories about language acquisition are refutable wherever they are clear, and ... further speculations have been quite empty and uninformative." Leech ("Assumptions") apparently understands Chomsky on these issues much the way that I do here. It seems to me that the following remark by Hartnack (Wittgenstein, 122) on Descartes applies equally well to Chomsky (especially in view of the latter's Cartesian Linguistics): "The Cartesian conceptual apparatus does not make it possible for us to predict what results a particular psychological experiment should, or should not, produce, so experiments can neither confirm nor refute it."

One way to refute the transformationalists' position on these issues is to redefine the problems as empirical and to proceed with appropriate tests. This means that we must develop an alternative theory, or modify transformational theory so as to make it testable, or show that in certain respects the latter is already susceptible to testing.

1.3 An Eclectic Position

The discussion of sections 1.1 and 1.2 seems to support the assumptions that abstract theories are necessary in order to deal adequately with language, and that such theories must be empirically vulnerable to be of greatest value. The justification of these assumptions has employed negative examples to show the undesirable consequences of accepting one without the other. Although such negative arguments may seem contentious, they are nevertheless valid, and may in fact be the only type available. Platt ("Strong Inference", 350) suggests that the latter is probably the case:

As Bacon emphasizes, it is necessary to make 'exclusions'. He says, "The induction which is to be available for the discovery and demonstration of sciences and arts, must analyze nature by proper rejections and exclusions; and then after a sufficient number of negatives, come to a conclusion on the affirmative instances. (To man) it is granted only to proceed at first by negatives, and at last to end in affirmatives after exclusion has been exhausted" (Bacon, 1960). Or, as the philosopher Karl Popper says today, there is no such thing as proof in science — because some later alternative explanation may be as good or better — so that science advances only by disproofs. There is no point in making hypotheses that are not falsifiable, because such hypotheses do not say anything; "it must be possible for an empirical scientific system to be refuted by experience" (Popper, 1959: 41).

Insofar as possible, we will subscribe to the method of inductive generalization labeled 'strong inference' and described as follows by Platt ("Strong Inference", 347):

Strong inference consists of applying the following steps to every problem in science, formally and explicitly and regularly:
(1) Devising alternative hypotheses;
(2) Devising a crucial experiment (or several of them), with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses;
(3) Carrying out the experiment so as to get a clean result;
(1') Recycling the procedure, making subhypotheses or sequential hypotheses to refine the possibilities that remain; and so on.

While the present study is, for lack of time, limited to theoretical considerations, the theory to be presented is deliberately constructed so as to afford the kind of empirical vulnerability which renders it testable from numerous angles. Some of the possibilities for testing are discussed in the concluding remarks of Chapter IV.
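Platt's numbered steps describe a loop, and the loop can be restated schematically. The sketch below is my own gloss, not Platt's formulation; the hypothesis names and the 'experiments' (functions that report which hypotheses a clean result excludes) are invented for illustration.

```python
# A schematic gloss (mine, not Platt's own formulation) of the strong-inference
# cycle. The hypothesis names and the "experiments" -- functions returning the
# hypotheses that a clean result excludes -- are invented for illustration.

def strong_inference(hypotheses, experiments):
    """Run crucial experiments, excluding hypotheses until few remain."""
    remaining = set(hypotheses)                # (1) devise alternative hypotheses
    for run_experiment in experiments:         # (2) devise crucial experiments
        excluded = run_experiment(remaining)   # (3) carry one out; get a clean result
        remaining -= excluded                  # (1') recycle with the survivors
        if len(remaining) <= 1:
            break
    return remaining

hypotheses = {"H1", "H2", "H3"}
experiments = [lambda rem: {"H1"} & rem,   # first result excludes H1
               lambda rem: {"H3"} & rem]   # second result excludes H3
print(strong_inference(hypotheses, experiments))  # {'H2'}
```

The point of the restatement is only that the procedure advances by exclusions, never by proofs: each experiment narrows the set of surviving hypotheses.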
2. THEORETICAL ASSUMPTIONS

Having dealt with the metatheoretical assumptions stated in the first paragraph of this chapter, we now turn to a consideration of the theoretical assumptions listed in section 0 above.
2.1 Language as a Medium of Communication
The question of whether or not language is a self-contained structure dates from antiquity. There are those who have argued, and some who continue to maintain, that language as a system is independent of its function as a medium of communication and that it can be analyzed and studied without reference to communicative contexts. I believe that the latter position is indefensible for a number of reasons, and assume, rather, that the function of language as a medium of communication is an inherent aspect of its nature. In order to justify this assumption, let us look at some of the difficulties which result from a failure to accept it.

Oller, Sales, and Harrington ("Basic Circularity") have shown that linguistic theories which fail to consider the communicative function of language develop a vicious circularity. Thus, theories such as that of Harris (Methods) which attempt to discover and define linguistic units without reference to meaning — i.e., communicative function — become untenable. To illustrate, if one were to follow Harris's proposed procedure in attempting to discover the units of a language solely on a distributional basis (that is, by establishing their relative positions of occurrence), he would not know where to begin. As Saussure (Course, 103f) pointed out over fifty years ago:

we know that the main characteristic of the sound chain is that it is linear. Considered by itself it is only a line, a continuous ribbon along which the ear perceives no self-sufficient and clear-cut division; to divide up the chain we must call in meanings .... Analysis is impossible if only the phonic side of the linguistic phenomenon is considered.
Following Harris's method, there is no principle on which to base a first segmentation. If we supposed for the sake of argument that the first segmentation were based on meaning, how would the next be established? The fact is that it is impossible to discover an undiscovered unit by reference to its undiscovered relationships to other yet undiscovered units. Therefore, without considering the communicative function of language, it is impossible to discover the meaningful units of a language. Furthermore, it is impossible to account for the fact that a child in learning a language DOES discover its meaningful units.

The definition of linguistic units is also a problem within any framework which fails to consider the communicative function of language. Harris (Methods) proposed to define linguistic units in terms of their patterns of occurrence relative to one another. This suggestion is also unworkable since, even if one could settle on a satisfactory decision as to how to limit the terms of such definitions, they would nevertheless be circular. For instance, if we observed that a set of symbols, a, b, and c, could occur in the combinations abc, acb, bca, bac, cab, cba, a simple statement of this fact would not constitute a definition of the terms in question. The definition of a, for example, by its occurrence relative to b and c, of b relative to a and c, and of c relative to a and b leaves all three terms undefined. No doubt there is a correlation between privileges of occurrence of linguistic units and non-circular definitions of those units, but not even this correlation can be brought to light by a purely distributional analysis.

The importance of pointing up the foregoing difficulties is increased by the fact that the presently predominant linguistic theory — that of the transformationalists — is also plagued by them. Chomsky ("Current Issues", 59) states that:

a language should not be regarded merely, or primarily, as a means of communication, ... The instrumental use of language (its use for achieving concrete aims) is derivative and subsidiary.

Chomsky (Cartesian Linguistics, 13) argues that:

in its normal use, human language is free from stimulus control and does not serve merely a communicative function, but is rather an instrument for the free expression of thought and for appropriate response to new situations.
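The circularity of a purely distributional definition, illustrated above with the symbols a, b, and c, can be made concrete. The sketch below is my own illustration and is deliberately cruder than Harris's actual procedure; it records only positions of occurrence, and shows that over the six permutations all three symbols receive identical records, so that such a record cannot by itself define, or even distinguish, the units.

```python
# An illustration of the circularity argument (my sketch, not Harris's actual
# procedure, which is far richer). If a, b, and c occur in all six orders, a
# purely positional record of distribution is the same for every symbol, so it
# cannot by itself define, or even distinguish, the units.

from collections import Counter

corpus = ["abc", "acb", "bca", "bac", "cab", "cba"]

def positional_profile(symbol, corpus):
    """Count how often the symbol occupies each position across the corpus."""
    return Counter(pos for form in corpus for pos, s in enumerate(form) if s == symbol)

profiles = {s: positional_profile(s, corpus) for s in "abc"}
# Each symbol occurs exactly twice in each of the three positions:
print(profiles["a"] == profiles["b"] == profiles["c"])  # True
```

Whatever distinguishes a from b and c must therefore come from outside the distributional record itself, which is the point of the appeal to communicative function.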
Given that transformationalists do fail to make the assumption that the primary object of linguistic theory is the use of language in communication, we would expect the above noted problems to recur within the transformational framework. With some minor variations, they do. For instance, Chomsky (Syntactic Structures, 17) argued as follows:

I think that we are forced to conclude that grammar is autonomous and independent of meaning ... Grammar is best formulated as a self-contained study independent of semantics (p. 106).

When Reichling ("Principles") challenged the empirical bases of these statements, Chomsky ("Topics", 3) replied that Reichling's comments illustrated:

a complete lack of comprehension of the goals, concerns, and specific content of the work that he was discussing, and his discussion is based on such gross misrepresentation of this work [Chomsky's Syntactic Structures] that comment is hardly called for.

It is interesting, however, that in spite of Chomsky's distaste for Reichling's criticisms, the grammar was later (see Chomsky, "Current Issues", 51f) revised to include:

a syntactic component, a phonological component, and A SEMANTIC COMPONENT [emphasis added].
Thus it would appear that from 1964 on, at least, the communicative import of linguistic units was regarded by Chomsky as an important aspect of linguistic data. However, Chomsky ("Current Issues", Aspects, Cartesian Linguistics) relies almost exclusively on the earlier work of Katz and Fodor ("Structure"), and Katz and Postal (Integrative Theory), for the explanation of the semantic component and its relation to the other components of the grammar. Unfortunately, both of the latter works propose a conception of semantics which is independent of the communicative use of language.6

6 For a discussion of this point, see Oller, et al. ("Basic Circularity"). While the proponents of the transformational viewpoint now generally recognize the need for a semantic component, Chomsky (Aspects, 78) still insists that it is fruitless to discuss the semantic basis of syntax. Oller and Sales ("Conceptual Restrictions") have contested this point. In spite of the fact that a semantic component has now been included in the transformationalists' grammar, it is in a rather odd position. The only input to the semantic component is from the syntactic component, and there is no input to the syntactic component.

22
THE PROBLEM

We should expect to find, then, that transformationalists face essentially the same obstacles as did Harris (Methods) — namely, (a) both the discovery of linguistic units by analysis, and accounting for the discovery of linguistic units by children in language acquisition, and (b) the definition of linguistic units. With respect to (a), it is curious that Chomsky (Syntactic Structures) concluded that a discovery procedure was too strong a requirement for linguistic theory. This conclusion probably resulted from his failure to consider the communicative function of language. However, Chomsky (Aspects, 47-59) argues that a theory of language acquisition, or a discovery procedure, is one of the central problems of linguistic theory. His proposed solution to this problem is the postulation of "innate ideas" (Aspects, 59). We will see below, under the discussion of language learning, that this later position also apparently results from an inadequate consideration of the communicative function of language. As it was for Harris, the definition of linguistic units is also a difficulty within the transformational framework. Again, the problem lies in treating linguistic structures as though they were self-contained. That this is done is clearly brought out in the following statement by Katz and Fodor ("Structure", 488):

IN GENERAL, a sentence cannot have readings [i.e., meanings] in a setting that it does not have in isolation.

This statement implies that the meaning of sentences in communicative events is a fraction of their meaning in isolation (i.e., out of context). Uhlenbeck ("Further Remarks") has given a host of examples showing that exactly the reverse is true. Oller, et al. ("Basic Circularity") have suggested the following replacement for Katz and Fodor's statement:

A SENTENCE CANNOT HAVE ANY MEANING IN ISOLATION THAT IT COULD NOT HAVE IN SOME SETTING.
The latter statement implies that the meaning of a linguistic unit is acquired inductively by the observation of language in communicative events. Any given sentence interpretation is thus viewed as an induction or set of inductions based on previous experience. Thus, the assumption that language is primarily a means of communication, and not a self-contained structure, is shown to be necessary as a result of the undesirable consequences entailed by failure to accept it.

2.2 Natural Languages as Learned Codes

Although the assumption that languages are learned has been widely accepted for many years, it has recently been subjected to careful scrutiny as a result of the philosophy of 'innate ideas'. This philosophy was initiated by Chomsky (Aspects, 25) — as far as modern linguistics is concerned — and has been further developed by Katz (Philosophy), McNeill ("Creation of Language", "Developmental Psycholinguistics"), and Gruber ("Topicalization"). It is argued, according to the theory of innate ideas, that languages are not actually learned (as is commonly thought); rather, they are more or less created by children. The situations of the child's environment which were previously thought to provide the necessary and sufficient information to account for language acquisition are conceived of as mere triggering events which set in motion the language creation process (Chomsky, Aspects, 48).
Oller and Sales ("Conceptual Restrictions") have challenged the psycholinguistic bases of the postulation of innate ideas and have pointed out a fallacy underlying the following statement by Katz (Philosophy, 250f):

Since learning the meanings of the sentences of a language is conceived of as a process of associating semantic elements of some kind with observable features of the phonetic shape of the sentences to which the child is exposed, these observable features must provide a rich enough basis of distinct elements for each semantic component of the meaning of such sentences to have a distinct phonetic element(s) with which it can be correlated by association. If, therefore, we can show that this
fundamental assumption is false — that there are, in the case of certain essential semantic elements, literally no observable features of the phonetic shape of sentences with which these semantical [sic] elements can be associated — then we will have established that the input to (D) [the language acquisition device] is structurally too impoverished to be derived from it by principles of inductive generalization and association.

Katz goes on to prove that the child could not possibly develop the complex linguistic system which adults are known to have if his only source of information were the phonetic shape of sentences. The proof, however, is inconsequential. The flaw in Katz's reasoning lies in the fact that the phonetic shape of sentences is certainly not the only information available to the child while learning a language. He and other transformationalists apparently overlook the fact that situational context is constantly available to the language learner. Moreover, the postulation of innate ideas leads us no nearer an understanding of the language acquisition process. It has been suggested that this construct can be compared to the phlogiston of the Old Chemistry. The latter was some unknown quantity supposed to be the source of fire. Similarly, innate ideas are some unknown quantity presumed to be the source of language. Just as the pseudo-need for phlogiston disappeared with a better understanding of the combustion process, it seems likely that the need for innate ideas will evaporate when a better understanding of the language acquisition process is achieved. Certainly the human organism has built-in capacities which it brings to the language learning situation, but there seems to be no good reason for assuming that these capacities are the result of an inherited knowledge of the nature of language.
It would appear much more reasonable to assume rather that language conforms to human capacity, and that in the language acquisition process, the child is actually learning — i.e., acquiring information which it did not previously have. If we discard the concept of innate ideas and base a theory of language acquisition on the information available to the child in communicative events,
I believe that we will in fact be much more apt to discover just what language acquisition involves.

2.3 Language as Systematic

It is a well attested and widely accepted fact that languages are systematic. The problem is to determine just what the factors are which contribute to their systematicity. Perhaps the question is best formulated in the negative: what are the factors (or types of relationships which can be expressed in the form of rules) which prevent languages from being chaotic, amorphous, or random? In our discussion of this question, we will be concerned with factors which apply to all languages and with factors which apply to some subset (possibly with only one member) of the set of all languages. Obviously no attempt will be made in this section to give an exhaustive treatment of any of the specific issues raised here. My purpose is merely to point up the fact that I have made certain assumptions about the structure of language which are thought necessary preliminaries to what is to follow. Some discussion will be included to give a general idea of why these assumptions seem necessary.

2.3.1 Universal Factors Affecting Language Structure

2.3.1.1 Psychophysical Factors
These factors have to do with the interaction and interrelation of the physical reality7 which impinges upon the human sensory-rational mechanism and the nature of the human mechanism itself. For instance, due to the nature of the human mechanism, we may state the universal rule that only a finite number of units of finite length can be processed over a finite period of time. The time limitation results from the dual fact that men do not live forever, and that the processing of linguistic units or any other psychological

7 This statement should not be understood as a metaphysical commitment. What is understood by the term 'physical reality' is merely that set of objects and relations between objects (barring certain errors and illusions) that a sane man infers from his waking experience.
unit requires time. This universal limitation affects the learning, perception, and production of any and all natural languages. It also carries with it the requirement that all languages consist of a finite number of units which can combine via a finite number of rules into larger units of a finite length. Not only must the length of these larger units be finite, but there must also be an upper limit (albeit a vague one) to their length, i.e., a length beyond which sequences become ungrammatical. This has been denied by Chomsky and Postal. Postal ("Constituent Structure", 34, 76, and "Limitations", 148) has argued that there is no upper limit on sequences which are generated by recursive rules. Similarly, Chomsky (Syntactic Structures, 23) states:

We might arbitrarily decree that such processes of sentence formation (namely, recursive ones) as those we are discussing cannot be carried out more than n times for some fixed n. This would of course make English a finite state language, as, for example, would a limitation of English sentences to a length of less than a million words. Such arbitrary limitations serve no useful purpose, however.
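The effect of such a decree can be made concrete with a small sketch. The grammar, the sentences, and the function name below are invented for illustration, not drawn from the monograph; the point is only that capping a recursive rule at a fixed depth leaves a finite — and hence finite state — set of sentences.

```python
def sentences(depth_cap):
    """All sentences of a toy grammar with one recursive rule,
        S -> 'the dog barked' | 'the man said that ' + S,
    where the recursive option may be taken at most depth_cap times."""
    base = "the dog barked"
    # k embeddings for every k up to the cap: a finite enumeration.
    return ["the man said that " * k + base for k in range(depth_cap + 1)]

capped = sentences(3)  # exactly four sentences, however long the longest
```

Removing the cap makes the set infinite; any finite cap, however generous, keeps it finite and enumerable.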
Contrary to what Chomsky says, that such a decree is not in fact arbitrary and that it does serve some useful purposes is pointed out by Yngve ("Model") and Newcomb ("Graph Model"). Yngve shows that recursive clause embeddings produce incomprehensible sequences if carried out more than three times in the same sentence. Newcomb has shown that a directed graph version of a finite state syntax can then be constructed which will handle clause embeddings so long as an upper boundary is imposed. Yngve has also argued that sentences with an excessive number of embeddings are thereby ungrammatical. If we are talking about a natural language, as opposed to some hypothetical construct, we KNOW that a speaker cannot utter or understand a sentence requiring a hundred years (of continuous speech) to produce. This is a fact which to my knowledge no one has ever denied. Therefore, somewhere between the length of a sentence requiring a hundred years to produce, and one requiring
slightly more than no time, there is an upper boundary on the length of sentences in any natural language. As media of communication, natural languages codify information which can be transmitted from one speaker of a given language to another. The very suggestion of excessively long sequences, or excessive recursions, is contrary to this fact of communication. Such sequences could not be considered messages because they could neither be encoded nor decoded. In fact, the concept 'message' itself implies comprehensibility and therefore entails an upper limit on length. To treat a natural language as though there were no upper boundary on the length of its units is not to deal with a natural language at all. A further psychophysical restriction affecting all languages has to do with the production and discrimination of linguistic patterns sequenced in time. These patterns must be discriminable and recognizable by the human sensory-rational mechanism. They must be discriminable in two important senses: (1) they must be sufficiently distinct from other patterns or stimuli which impinge on the senses to be discriminated from non-linguistic ones. Consider, for example, the set of all syllables S of any language. If utterances in the language are to be discriminable from other objects which impinge on the organism's senses, they must be sufficiently different from the other objects. (2) Linguistic patterns must be sufficiently different from one another to be discriminated as distinct patterns. In order for the set S (the syllables of any language) to contain individuated members, the members must be discriminable one from the other. Still more importantly, linguistic patterns must be recognizable. 
To illustrate this point, if we take the syllable to be the smallest phonological unit of a language which can be uttered in isolation, any given syllable s must not only be discriminable from all other syllables in the language, but it must be recognizable as a particular member of the set S. If this condition is not met, then it is clear that s or any other member of S could not be identified except when all other members of S were also present for comparison. The latter is obviously not the case in natural languages. These psychophysical rules of
discriminability and recognizability hold true even if language is transmitted through a medium other than speech, e.g., writing or sign language. A further psychophysical restriction on the nature of language has to do with the distribution of linguistic units in time. If we again refer to the sound patterns of language, it is clear that certain distinctive features, for example, cannot occur simultaneously — e.g., voicing and non-voicing. The transition between articulatory states requires time. It may seem somewhat trivial to mention this restriction since it seems to derive from the truisms that two physical objects cannot simultaneously occupy the same space, and that motion requires time as well as space. However, the seeming triviality may be removed by noting that in an important sense this same restriction applies to abstract psychological objects as well. Just as two physical objects cannot simultaneously occupy the same physical space, two psychological objects cannot occupy the same psychological space. That is to say, the manipulation of symbols (or other abstract psychological entities) requires time in much the same sense that the manipulation of physical objects requires time.8

2.3.1.2 Factors of Learning

The differences between the cognitive capacities of a normal child and a normal adult, or between a beginning student of mathematics, violin, or whatever, and someone who is an accomplished master of the subject are generally attributed to something called LEARNING. In some partially understood fashion, a person who has learned something has become capable of doing something which he could not have done before the learning took place. This accomplishment may involve the manipulation of physical objects, or of abstract psychological entities. In the case of language, it involves both. At least since the time of Rousseau, and probably long before, it has been known that learning does not proceed in an entirely

8 Russell (Human Knowledge, 210-214) gives convincing evidence for this observation.
haphazard way, but systematically, step by step. In Chapter III we will discuss in more detail a rule of induction to be suggested as the basis of the systematic nature of learning. The rule depicts the process of pattern recognition in a summarial fashion. Add to this process the creative capacity of the human being to recombine patterns and I believe that we have a sufficient basis with which to explain even the most complex forms of human rational behavior. Some attempt will be made to substantiate these claims in Chapter III. It is by the rule of induction that the limitations of time and space are to some extent overcome. Through its application, finite units (occupying finite space and time) can be used to express concepts of infinity. It is also by this rule that the finite information handling capacity of human beings can be increased indefinitely. It is essentially the process of induction that Miller ("Magical Number Seven") is referring to when he speaks of 'recoding'. By the process of recoding, instead of dealing with vast numbers of different objects as units, the organism deals with classes or sets of objects as units. This seems to be a psychological application of Occam's razor — the principle that entities should not be multiplied beyond necessity — and Leibniz's principle of the identity of indiscernibles. Within a certain frame of reference, objects which might otherwise be viewed as different are classed together. As long as frames of reference are clearly differentiated, confusions will not result from this sort of generalization. Miller ("Magical Number Seven") explicates the seeming paradox of psychologically dealing with more by paying attention to less with an analogy. It seems if one had a purse which could only hold a certain number of coins, it would not matter to the purse whether the coins were pennies or silver dollars. 
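Miller's point about recoding can be sketched concretely. The example below is an invented illustration (the particular digit string is not from the monograph), after the fashion of Miller's binary-to-octal recoding: the information is unchanged, but the number of items to be held in mind drops by two-thirds.

```python
def recode(bits, chunk_size=3):
    """Recode a string of binary digits into base-8 'chunks':
    the same information carried by far fewer items."""
    assert len(bits) % chunk_size == 0
    return [int(bits[i:i + chunk_size], 2)
            for i in range(0, len(bits), chunk_size)]

# Eighteen binary digits strain immediate memory; the six octal
# digits below carry identical information in a third of the items.
octal_chunks = recode("101000100111001110")  # [5, 0, 4, 7, 1, 6]
```

Applying the same recoding to the chunks themselves, level upon level, is the recursive extension of capacity described in the following paragraph.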
Thus by recoding many bits of information into larger 'chunks' and by recursively applying this rule, it is not difficult to see how the human rational mechanism could indefinitely extend the practical bounds of its capacity for processing information. The results of the rule of induction and the creative recombination
of patterns are to be observed in all natural languages. It is by the rule of induction that a potentially (if not actually) infinite variety of stimulus patterns are reduced not only to a finite number, but to a manageable finite number, which may be codified in linguistic units. It is the recombination of these patterns and linguistic units which results in linguistic creativity.

2.3.1.3 Factors of Reasoning

The term 'reasoning' instead of 'logic' has been selected in order to indicate that we are concerned with human logical thinking rather than with the disciplines of symbolic or mathematical logic. Of course, as is recognized by many mathematicians, mathematics is merely a highly specialized form of reasoning.9 Reasoning requires the use of symbols. Its power lies in the fact that through the manipulation of abstract objects, i.e., symbols, a man is able to select from several strategies for approaching a problem one which is more efficient than any strategy which he could hit upon without reasoning. In fact, many problems could not be identified at all if it were not for man's power of abstract reasoning. One of the requirements for reasoning is that its symbols be defined. If, for example, we were given a symbol X but had no knowledge of its meaning or any of the facts relating to that meaning, we would be powerless to use X in reasoning. Often a vague or partial definition may be sufficient, and in dealing with the symbols of natural languages, this is in many cases the only type of definition available or even possible. It should be remembered, however, that 'vagueness' is a concept which is meaningful only within a relatively well defined perspective. The statement that reasoning requires definitions of the symbolic forms to be employed is very similar to some of the statements made above on the topic of psychophysical restrictions on language.
To say that definitions of symbols are required is equivalent to saying that they must be associated with discriminable and

9 See Lotz ("Natural Languages") and Church ("Need").
recognizable objects, in specifiable ways. In natural language symbolization (employing the term 'symbolization' in the sense of Chafe, "Language as Symbolization"), the associations which constitute the meaning of a symbol involve syntactic, semantic, and pragmatic factors. (We will have more to say on these matters in Chapter II.) The symbol itself must also be discriminable and recognizable as an object in its own right. If, for example, there is a symbol a and another symbol b, the rational mechanism must discriminate a from b and vice versa, and it must recognize a as a and b as b. Moreover, it must recognize that a and b are of the same type; that is, they are both symbols of the system having a and b as its elements. We may refer to these three processes, respectively, as THE DISCRIMINATION OF DIFFERENCE, THE RECOGNITION OF IDENTITY, and THE RECOGNITION OF SIMILARITY.
These universals of reasoning must affect all natural languages since the latter are defined as symbolic systems. As such, all languages are equipped to express the relations of identity, similarity, and difference.

2.3.2 Universals of Language Structure

To this point, we have discussed certain reasons for the systematicity of language. We now turn our attention to certain universal aspects of this systematicity, and in section 2.3.3 to non-universal aspects of linguistic systems. One of the most widely recognized universals of language structure is that all existing natural languages are spoken. Other forms may be employed in rather strict correspondence with spoken forms, but speech is basic to all natural languages. In the present study, we will be concerned with an analysis of the forms of speech, and an analysis of their meanings. From the beginning, two axes of structure are recognized: the PARADIGMATIC and the SYNTAGMATIC. The syntagmatic axis contains the units of language as they are strung together in speech, while the paradigmatic axis contains the various units which might be employed to formulate a particular stream of speech.
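The two axes can be sketched with a hypothetical slot grammar (the slots and fillers below are invented for illustration, not drawn from the text): the left-to-right sequence of slots lies on the syntagmatic axis, while the set of alternatives competing for each slot lies on the paradigmatic axis.

```python
from itertools import product

# Each position in the sequence is a syntagmatic slot; the list of
# alternatives for that slot is a paradigmatic class.
slots = [
    ["the", "a"],              # slot 1: determiners
    ["dog", "cat", "child"],   # slot 2: nouns
    ["barked", "slept"],       # slot 3: verbs
]

def utterances(slots):
    """Every string formed by choosing exactly one paradigmatic
    alternative for each syntagmatic position, in order."""
    return [" ".join(choice) for choice in product(*slots)]
```

Only one member of each paradigmatic class can occupy its position at a time, which is why the two-by-three-by-two grammar above yields exactly twelve utterances.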
In dealing with the syntagmatic axis, we are concerned with a sequence of units in time. These units also occupy space in the sense that the hearing mechanism responds to them as objects of sound.10 We will refer to the study of relations between linguistic units on the syntagmatic axis as SYNTAX. The basic syntactic process is MODIFICATION — which results in relationships between units on the syntagmatic axis allowing them to function as units of higher levels on the paradigmatic axis. In dealing with the paradigmatic axis, we are concerned with meaningful units which are differentiated from other meaningful units of the same type. Only one such unit can be present at any given position on the paradigmatic axis at any given time. We will refer to the study of this axis of language structure as SEMANTICS. Semantics automatically includes syntax since any meaningful unit, regardless of its size or complexity, must occur on the paradigmatic axis, though it does not necessarily enter into any syntactic relationships. The basic semantic process is SIGNIFICATION — a relation holding between linguistic units and meanings, i.e., the objects of cognitive experience. Since we have demonstrated the importance of the communicative function of language, it follows that we must deal with the structure of language as it is employed in communication. This means that we must consider the coding of information by speakers of the language. We will refer to the study of this coding process as PRAGMATICS.11 It logically includes the study of semantics and syntax.12

2.3.3 Non-Universals of Language Structure

Certain non-universal aspects of language structure have often been used as examples to demonstrate the arbitrariness of linguistic

10 See also the discussion of section 2.3.1.1, pp. 25f, on the psychophysical restrictions of simultaneous occurrence.
11 The term 'pragmatics' is used here in the sense now current with psychologists working on coding problems. See Slobin ("Grammatical Transformations") and Wason ("Response", "Contexts").
12 For further discussion of the several concepts introduced here, see Chapter II.
structures. For instance, we have the fact that the word for a certain canine in English is dog, while in Spanish it is perro, in French it is chien, and so on. This sort of capriciousness is apparently to be found on all levels of language structure. It is not that the objects of experience, which we call "dogs", are so very different across cultures, but that the particular means for naming or referring to that class of objects is different. It would seem that all differences of this type could be compared with the alternative spellings of some words. In both cases, some form of representation is involved. The object of the spelling is to represent a pattern. It does not really matter HOW the pattern is represented as long as it is represented. Note that there is a certain amount of non-arbitrariness here. Although it is not crucial that a representation take one form as opposed to another, it is absolutely necessary that it take some adequately discriminable form. Thus, we should expect that if there are dogs in all lands, and if all the peoples of the world come in contact with them, there will be some way of referring to these animals in all languages. Although the means for referring is an arbitrary matter, the fact that reference must be made is non-arbitrary. We may distinguish at least two basic types of differences between languages. The first type is the one which we have just noted, and which we may conveniently refer to as THE ARBITRARINESS OF 'SPELLINGS'. The second type of difference is of an entirely different sort. Consider, for instance, the fact that the language of a five year old Arab child is fairly unlikely to contain any linguistic unit for referring to the object (or substance) which practically any five year old child of English speaking background would call "snow". This is an environmental accident which has nothing to do with the capriciousness of language structure. 
For this reason, this type of difference across languages is of little interest to the linguist. The more interesting type of arbitrariness, that of spelling (in the sense of this word given above), is clearly present in phonology and syntax. It is probably also prevalent in semantic structures and coding processes though perhaps less obviously so.
It is doubtful that there is any considerable degree of arbitrariness in the elements of experience themselves except insofar as the structure of language may influence the general nature of cognition (see Chapter III, section 3).
II THE STRUCTURE OF LANGUAGE
What I am saying is that language would be impossible if the physical world did not in fact have certain characteristics, and that the THEORY of language is at certain points dependent upon a knowledge of the physical world. Language is a means of externalizing and publicizing our own experiences. A dog cannot relate his autobiography; however eloquently he may bark, he cannot tell you that his parents were honest though poor. A man can do this, and he does it by correlating 'thoughts' with public sensations. Bertrand Russell (Human Knowledge, 60)
0. INTRODUCTION
In this chapter we will be concerned with the way in which information is coded in the units of a natural language. At least partial answers to the following questions will be required: what is the relationship between syntax and semantics? What are the units and relationships which constitute a grammar? What is meaning? How does the grammar of a language enable a speaker of that language to employ it in communication? In Chapter I we stated the assumption that language is a medium of communication, and we presented some arguments as to why this assumption is necessary. It remains to be seen just what the structure of language in communication is actually like.
1. PRELIMINARIES

1.1 Syntax, Semantics, and Pragmatics
The most common sort of organization for a discussion of the structure of language involves a division of the subject matter into a section on syntax and a section on semantics (pragmatics is usually left out entirely). This is particularly true of studies by linguists,1 less so of studies by philosophers.2 Such an approach is not employed here because the author is convinced that syntactic, semantic, and pragmatic3 aspects of language are integral components of a single basic structure. To treat them separately is to fail to adequately comprehend the nature of language. We may represent the relations between these aspects of the structure of language, in terms of their scope, as in Fig. 1. We see that the study of pragmatics entails the study of semantics which
1 In fact, it has only been in recent years that American linguists have again become interested in semantics. Since the time of Bloomfield the very topic had been excluded from the domain of linguistics. The Bloomfieldian trend reached its extreme in the distributional analysis proposed by Zellig Harris. Now, linguists are returning to some of the semantic problems discussed by Sapir and others as early as 1921. Some anthropologists maintained an interest in semantics since Sapir, but they were clearly in the minority in terms of dominant trends until very recently. It is somewhat surprising to note that even the most popular form of linguistic analysis, à la Chomsky, almost completely ignored semantic issues until about 1963. It was then that Katz and Fodor published the beginnings of a semantic theory. At the time, they conceived of semantics as that part of language structure which is left over when syntax is subtracted. This view is now generally regarded as incorrect though attempts are still being made, as we noted in Chapter I, to keep the semantic and syntactic components of the grammar apart.
2 While linguists have been somewhat reluctant to deal with the problems of meaning, philosophers have been studying semantics for many years. For an excellent analysis of certain aspects of conversational language, see Reichenbach (Elements). For a summarial sketch of more recent thinking see Quine (Word).
3 The pragmatic aspects of language structure are apparently still ignored by the majority of linguists. Psychologists, on the other hand, are rapidly turning up more and more data which point to the necessity of considering pragmatics. Also, there are independent logical grounds for doing so. Some of them are discussed in Chapter I, above, and by Oller, et al. ("Basic Circularity"). For relevant studies by psychologists, see note 11, Chapter I.
[Fig. 1: nested scopes — pragmatics enclosing semantics enclosing syntax]

Fig. 1
also entails the study of syntax, and, in general, the converse also holds. In order to begin to understand any of these aspects of language, one must begin to understand them all: this is accomplished by analyzing language as a medium of communication. The first step is to start breaking the code. I suggest that one cannot penetrate to the syntax of a language independently of its semantics and pragmatics. To attempt to deal with syntax without semantics, or with semantics without pragmatics, and in either case to call it a study of the structure of language is something like studying hydrogen in the absence of oxygen and calling it a study of the structure of water. Although basic elements may be observed, the structure of the whole is bound to remain obscure. In the past, linguists often tried to describe and explain the structure of language by beginning with a basic unit and attempting to build up the whole structure. This was particularly true of the Bloomfieldians and of the earlier (Syntactic Structures [1957]) Chomskyan approach. As we noted in Chapter I, Chomsky and his followers originally attempted to keep the study of syntax free of any semantic considerations. This proved impossible, and in later formulations, a semantic component was added to the transformational grammar. Presently, transformationalists are still trying to keep the syntactic and semantic components of the grammar apart, but they are much less certain about the possibility for the success of these attempts than they apparently were in 1957 and shortly thereafter. It is now widely acknowledged that
the attempt to build up all of the grammatical sentences of a language on the basis of a set of kernel sentences plus transformations was an error. The basic problem was, I believe, a failure to give adequate attention to the fact that language functions as a system of human communication. I suggest that it will eventually be recognized that any attempt to treat language apart from its communicative function is seriously misguided.

1.2 On Beginning with the Communicative Function
The suggestion that the structure of language can best be understood by beginning with the study of its units as used in communication is supported by some thought on the CREATIVE aspect of language emphasized by Chomsky. The primary motivation for the description of language in terms of a generative grammar is the indisputable fact that a speaker is capable of uttering or understanding sentences which he is never observed to utter or understand. For this reason, it is assumed that the speaker's LINGUISTIC COMPETENCE — that is, his capacity to speak and understand a language — is something different from his LINGUISTIC PERFORMANCE — the sentences (and non-sentences) that the speaker says and understands. Thus, the linguist must not only account for the observable output of a speaker, which is actually only a small part of linguistic performance, but he must also account for the speaker's linguistic competence.4 In fact, it is the creative generative capacity of the speaker which is of primary interest to the linguist. This capacity encompasses both the utterance of appropriate combinations of linguistic units in new situations, and the understanding of previously unencountered combinations. A bit of reflection on the possibilities for generatively combining

4 There have been some serious misunderstandings of the concepts 'competence' and 'performance' which we do not discuss here. For an elaboration of the issues involved, see Oller, et al. ("Consistent Definitions"). These misunderstandings, however, do not detract from the importance of distinguishing between what a speaker does and what he is capable of doing. For a slightly different viewpoint see Fromkin ("Speculations").
linguistic units at the various levels of structure which linguists generally recognize will show that a study which begins with full-fledged communication is apt to reach a solution sooner than an approach which seeks to build up communicative language out of some assumed sub-structure. For instance, if we consider the level of phonology,5 it is clear that all of the mathematically possible combinations of distinctive features, phonemes, syllables, or whatever unit we choose, simply do not occur. In fact there are fewer phonemes than there are mathematically possible combinations of distinctive features; there are fewer syllables than there are mathematically possible combinations of phonemes; and there are, indeed, fewer units at each higher level than would be produced by generating all of the mathematically possible combinations of any lower level unit. What these observations demonstrate is that there is an increasing number of restrictions on the possible combinations of units at each higher level. Therefore, an approach which begins by studying a higher level of structure would seem likely to lead sooner to an understanding of the structure of language than an approach starting at a lower level which does not make a conscious systematic effort to take account of information available through the observation of higher levels. The approach beginning at the higher level simply has fewer alternatives to account for.6 On the basis of these considerations, the present analysis begins with a study of the structure of language as it is used in communication. We define a communicative event as any finite time period during which there is a flow of information from one conscious human organism to another through the medium of language.
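The combinatorial argument here can be made concrete with a toy calculation; all of the inventory sizes below are invented round numbers for illustration, not figures drawn from any particular language.

```python
# Toy illustration of the argument above: attested inventories at each
# level are far smaller than the mathematically possible combinations
# of lower-level units. All sizes are hypothetical round numbers.
N_FEATURES = 12      # assumed number of binary distinctive features
N_PHONEMES = 40      # assumed attested phoneme inventory
N_SYLLABLES = 2000   # assumed attested syllable inventory

possible_phonemes = 2 ** N_FEATURES    # every feature combination
possible_syllables = N_PHONEMES ** 3   # naive three-phoneme strings

print(possible_phonemes, "possible vs", N_PHONEMES, "attested phonemes")
print(possible_syllables, "possible vs", N_SYLLABLES, "attested syllables")
```

Whatever numbers are assumed, the gap widens at each higher level, which is the point of the argument: the higher level has fewer alternatives to account for.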
5 Phonology is not discussed in detail in the present work, but it is assumed that phonology is a realizational system (in the sense of Lamb, Outline) which gives physical substance to grammatical units.

6 Chomsky (Syntactic Structures, 18) was apparently aware of this fact as it applies to the description of phonemics, morphemics, and phrase-structure, though he did not apply the generalization to the levels of semantics and pragmatics.

Restricting our attention to the aspect of communication involving language seems desirable at present, but should other
factors have to be taken into account later, this should create no essential problems of principle. We will for the moment, therefore, assume that the boundaries of the communicative event can be determined in a non-arbitrary way through the observation of the passage of time, change in topics, the introduction of new speakers, etc. We will further assume that the basic linguistic unit of communication is the SENTENCE. We will be concerned primarily with the DECLARATIVE SENTENCE or PROPOSITION. Although the latter terms are equated, it would be an error to treat the terms SENTENCE and PROPOSITION alike in all cases since SENTENCE is more general in scope.7 It was suggested in Chapter I that the structure of language can be viewed as a system of grammatical units construed along two axes — the SYNTAGMATIC and the PARADIGMATIC — of an abstract space. The term SYNTAX was suggested for the study of the relationship between units along the syntagmatic axis, and SEMANTICS for the study of units occurring along the paradigmatic axis. The basic syntactic process was referred to as MODIFICATION, and the basic semantic process, as SIGNIFICATION. Fig. 2 is a schematic representation of the abstract space within which we will describe language structure. The factor of time and the PRAGMATIC process of CODING are indicated in the diagram.
7 The danger in equating the terms 'proposition' and 'sentence' rather than 'proposition' and 'declarative sentence' lies in the fact that not all sentences assert a situation (as does a proposition). This has been made painfully clear by Roller ("Hornbook") in a reply to Lamb. It is obvious that a question, for instance, or an imperative, does not make an assertion as does a declarative sentence. Roller, however, apparently overlooks some arguments for the other side. When someone asks the question "Where is the pencil?", he is at least implying that a pencil exists and that it is somewhere. He is also implying that he does not know where it is, and that he has some reason to believe that his listener(s) may know where the pencil is. Therefore, Roller's objections to Lamb's point of view are not entirely justified. There is a certain informativeness about questions, imperatives, etc., which is not essentially different from the information conveyed by declaratives.
Fig. 2. [Diagram: the abstract space for describing language structure, with the syntagmatic axis (MODIFICATION), the paradigmatic axis (SIGNIFICATION), the dimension of time, and the pragmatic process of CODING.]
{i_1, ..., i_m} ⊂ Is_Nma = {i_1, ..., i_n}, and m ≥ 1, m < n. Formula (xvii) is read: the functional 'some (Nma)' signifies that the predicate to follow is asserted as true of each member of some subset of the instances of the substance named by the Nma in question. This subset is a proper subset of the set of all instances of the substance. In NP (o), the big green block, we encounter a string of adjectives for the first time. By converting (o) into the most general form, we obtain 'the (Adj-2 (Adj-1 (Nsg)))' the meaning of which we may represent as follows:

(xviii) the (Adj-2 (Adj-1 (Nsg))) a Pf (o_i), where o_i ∈ O_Npl(Adj-1, Adj-2) ⊆ O_Npl(Adj-1) ⊆ O_Npl = {o_1, ..., o_n}, and n ≥ 2, and o_i is pre-specified or is specified in its concurrent context.

Formula (xviii) is read: the functional 'the (Adj-2 (Adj-1 (Nsg)))' signifies that the predicate to follow is asserted as true of a particular specified member of the set of objects named by the pluralization of the Nsg in question. NP (p), the very extraordinarily beautiful young woman, is a bit more complex than the preceding example, but it can be handled in a precisely analogous fashion:
(xix) the (Int (Adv (Adj-2 (Adj-1 (Nsg))))) a Pf (o_i), where o_i ∈ O_Npl(Adj-1, Adj-2, Adv, Int) ⊆ O_Npl(Adj-1, Adj-2, Adv) ⊆ O_Npl(Adj-1, Adj-2) ⊆ O_Npl(Adj-1) ⊆ O_Npl = {o_1, ..., o_n}, and n ≥ 2, and o_i is pre-specified or is specified in its concurrent context.

Formula (xix) is read: the functional 'the (Int (Adv (Adj-2 (Adj-1 (Nsg)))))' signifies that the predicate to follow is asserted as true of a particular specified member of the subset of objects which are named by the Int (intensifier), Adv (adverb), Adj-2 (adjective), Adj-1, Npl (the pluralization of the Nsg in question) collocation. The latter subset is included in the subset named by the Adv, Adj-2, Adj-1, Npl collocation which in turn is included in the subset named by the Adj-2, Adj-1, Npl collocation. The latter subset is included within the subset named by the Adj-1, Npl collocation, which is included within the set of objects named by the Npl in question. Looking back to NP (p), the very extraordinarily beautiful young woman, we see that formula (xix) would
interpret it as follows. This NP refers to an object which is a unique member of the set of very extraordinarily beautiful young women. This set is included within the set of extraordinarily beautiful young women which is included within the set of beautiful young women, and so on. The sort of conceptual sub-classification involved here can be represented in a Venn diagram as in Fig. 9 where included areas represent included sets. Before proceeding to a discussion of the predicate, we should perhaps consider a possible objection to the similarity of the analyses of the functionals of formulas (xviii) and (xix). While formula (xviii) contains only adjectival modifiers, formula (xix) contains both adjectivals and adverbials. Although the modifiers treated
in the two cases are of essentially different types, their analyses turn out to be precisely parallel. An objection might be raised against this parallelism since adverbials of the type in question often have more limited scopes of modification than adjectivals. This is certainly true in the cases of the functionals of formulas (xviii) and (xix), as can be seen in the more complete structural analyses of Figs. 10 and 11.
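The nested inclusions asserted by formulas (xviii) and (xix) can be sketched as successive set restriction. Everything in this sketch is invented for illustration: the objects w1-w5 and the extension assigned to each modifier are our assumptions, not data from the text.

```python
# Sketch of formulas (xviii)-(xix): each modifier names a subset, and
# the functional picks a unique pre-specified member of the innermost
# subset. The universe O_Npl and the extensions are invented examples.
O_Npl = {"w1", "w2", "w3", "w4", "w5"}      # the set named by the Npl
extensions = {
    "young": {"w1", "w2", "w3"},
    "beautiful": {"w1", "w2"},
    "extraordinarily": {"w1", "w2"},        # Adv restricting 'beautiful'
    "very": {"w1"},                         # Int restricting further
}

def restrict(base, modifiers):
    """Intersect the base set with each modifier's extension in turn."""
    result = set(base)
    for m in modifiers:
        result &= extensions[m]
    return result

inner = restrict(O_Npl, ["young", "beautiful", "extraordinarily", "very"])
outer = restrict(O_Npl, ["young", "beautiful"])
assert inner <= outer <= O_Npl   # the nesting required by formula (xix)
print(inner)                     # the unique referent picked out by 'the'
```

Each added modifier can only keep or shrink the candidate set, which is the Venn-diagram nesting of Fig. 9 in computational form.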
Fig. 10. [Tree diagram of the NP the big green block: Det the, Adj big, Adj green, Nsg block, with numbered head-modifier relations.]
In Fig. 10, we see that the Nsg block is modified by an Adj green. The unit thus constituted, green block, is modified by the Adj big forming a new unit, big green block. This unit is modified by the Det the which completes the NP the big green block. In Fig. 11, we see a different relational structure.

Fig. 11. [Tree diagram of the NP the very extraordinarily beautiful young woman: Det the, Int very, Adv extraordinarily, Adj beautiful, Adj young, Nsg woman, with numbered head-modifier relations.]
The Nsg woman is modified by the Adj young to form the unit young woman which in turn is modified by the unit very extraordinarily beautiful. The latter unit consists of an Adj beautiful which is modified by the unit very extraordinarily which consists of an Adv extraordinarily modified by an Int very. The unit formed then by the modification of young woman by very extraordinarily beautiful is further modified by the Det the. Certainly, there is a similarity of structure between the NP of Fig. 10 and the NP of Fig. 11 as far as the modification of the respective Nsgs is concerned; however, the construction of the adjectival very extraordinarily beautiful clearly distinguishes the two phrases. How is it then that the functional analyses of formulas (xviii) and (xix) are essentially the same? The answer is fairly simple. The EFFECT of the various modificational relationships involved in the functional of Fig. 11 is the same in terms of the semantic limitations imposed, as it would be if the modifiers were all adjectives — e.g., in the NP the small extraordinary beautiful young woman. The reason that this holds true is that since in the case of the NP the very extraordinarily beautiful young woman "beauty" is a property of the young woman, when we modify that "beauty" by saying that it is extraordinary, we also automatically modify a characteristic of the young woman. Thus, the modification of one modifier by another is equivalent in effect to the modification of the unit constituted by the first modifier and the unit which it modifies. In the light of this observation, the would-be objection to the parallelism between formulas (xviii) and (xix) does not stand.

2.2 The Predicate

In discussing the declarative sentence (Fig. 5), the subject subroutine has been expanded and worked over in some detail. The predicate remains to be considered. In our treatment of the predicate, we will first specify a generative mechanism, and then proceed to discuss the meanings of the linguistic units which it generates. This requires an expansion of the predicate state (marked Prd) of Fig. 5 into a sub-routine. The expansion is given in Fig. 12 where VP = verb phrase and Cmp = complement.
Fig. 12. [State diagram: the Prd state expands into a VP state followed by an optional Cmp state.]
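The routine of Fig. 12 can be sketched as a tiny generator in which the Cmp state is either traversed or by-passed. The sample fillers are invented, and the sketch does not reproduce the fuller VP and Cmp expansions discussed below.

```python
# Sketch of the Prd sub-routine of Fig. 12: traverse the VP state, then
# either pass through the Cmp state or by-pass it. The fillers below
# are invented samples, not the full expansions of the later figures.
import random

VP_SAMPLES = ["is", "was", "goes", "has been going"]
CMP_SAMPLES = ["the ball", "the paper in"]

def prd(with_complement):
    """Expand Prd into a VP optionally followed by a Cmp."""
    out = [random.choice(VP_SAMPLES)]
    if with_complement:          # the Cmp state may be by-passed
        out.append(random.choice(CMP_SAMPLES))
    return " ".join(out)

print(prd(with_complement=False))   # a declarative may end after the VP
print(prd(with_complement=True))    # e.g., a VP followed by a Cmp
```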
We will leave the discussion of the Cmp until section 2.3 and will concentrate here on the VP. As we see in Fig. 12, it is possible to by-pass the complement state in the predicate and still complete a declarative sentence. The VP state is expanded into a sub-routine as shown in Fig. 13 where Vs = verb, third person singular; V = verb; Ved = verb, past tense; Ven = past participle; Ving = present participle, and P = passive.
As in the case of the NP treated earlier, we will not attempt an exhaustive analysis of the VP. We will attempt instead to achieve the more modest goal of showing that a particular form of analysis is applicable to a large number of VPs in English. On the basis of this analysis, we will merely suggest that all VPs in English might be handled in a similar fashion, likewise verbal constructions in other languages. The following VPs are selected for analysis:
(a) is
(b) was
(c) goes (or any other Vs or V)
(d) went (or any other Ved)
(e) does V
(f) did V
(g) is Ving
(h) was Ving
(i) is Ven
(j) was Ven
(k) has been
(l) has Ven
(m) had been
(n) had Ven
(o) has been Ving
(p) had been Ving
(q) has been Ven
(r) had been Ven
(s) has been being Ven
(t) had been being Ven
(u) Modal V
(v) Modal be Ving
(w) Modal be Ven
(x) Modal have Ven
(y) Modal have been Ving
(z) Modal have been Ven
(a') Modal have been being Ven

Before beginning an analysis of any specific VP, we may make several general statements which will apply to all cases. In the first place, all VPs signify some state-of-being. This state is asserted of the referent of the subject. Secondly, the conceiving of a state-of-being by a speaker allows for considerable flexibility in at least two ways. The state may involve the continuous occupation of time, or it may consist of a series of relatively discrete events or states. Also, the state may be conceived of and referred to in the same way whether it is of relatively short or long duration. The relevance of each of these points will be pointed up in the discussion of the specific examples to follow. Starting with VP (a), is, we may represent its meaning in Fig. 14 where the line marked "t" represents the dimension of time. Points to the left of the intersecting line labeled "present" indicate
PAST time, and points to the right, FUTURE. The set of vertical lines intersecting t indicate the occupation of time by the state in question (in this case, the state of being, or existence).

Fig. 14. [Schematic: a time line t, divided at the present into past and future, with vertical lines of varying height marking the state's occupation of time.]

The
relative heights of the vertical lines intersecting t indicate the probability that the state in question was still or will be still occupying time at that point. We see that this probability (where the term PROBABILITY is understood either in its non-technical sense or in the sense of Reichenbach, Experience) generally decreases as we proceed away from the present in either direction. THE PRESENT is defined as THE SEGMENT OF TIME CONTEMPORARY WITH THE UTTERANCE WITHIN ITS PARTICULAR PERSPECTIVE.23
An example may help clarify Fig. 14 as well as the generalizations concerning VPs stated two paragraphs earlier. If we take the sentence Mary is, we may interpret it as meaning that the referent of the subject, Mary, exists. Earlier, we stated that THE STATE-OF-BEING (OR THE STATE-OF-EXISTENCE) UNDERLIES ALL OTHER STATES. Does this generalization hold in the present instance? Yes. That is to say, the state of Mary's existing must occupy time (i.e., exist) in order for Mary to exist. A similar argument can be produced for any VP thus showing that our generalization also holds in all other cases.

23 A speaker may change the perspective of an utterance by saying something like "Imagine that you are now living in the time of ancient Rome". If he should go on to say "Caesar is now the ruler of the Roman Empire", a hearer would have no difficulty in understanding that Caesar is not actually NOW the ruler of the Roman Empire, but WAS at one time. Thus, in defining the PRESENT, it is necessary to take perspective into account. See note 18 above.

Another generalization concerned the conception of a state
as continuous or as a series of discrete states. Again referring to the sentence Mary is, we may think of the existence of Mary from birth until now — her existence in this case being a continuous state — or, we may conjure up special circumstances in which Mary's existence in a particular state actually consists of a series of discrete states. Consider, for instance, a situation in which someone might ask the question Who is at work nearly every day?, and someone else might respond Mary is. In this case, the state-of-being signified in the sentence Mary is actually consists of a series of otherwise discrete states. This fact can be visualized as in Fig. 15.
Fig. 15. [Diagram: the inclusive state S spanning a series of discrete states s_1 ... s_n over successive days d_p-1, d_p, d_p+1, ...]
In the figure, S represents the inclusive state (Mary's being at work from day to day), while each s_i represents the discrete state on any given day during which Mary is at work. The symbol d_p, which appears at the point in time marked "present", is the day of utterance, d_p-1 the day before the time of utterance, d_p+1 the day after, and so on. The state S, thus, is actually equivalent to the series s_1 through s_n (where n is an indefinitely large number). The other generalization which we should consider concerns the fact that a state may be conceived of and referred to in the same way whether it is of relatively short or long duration. Compare, for instance, the sentence Mary is with the sentence The universe is. Presumably, both states of existence had a beginning and will have an end, but the time segments which they occupy are vastly different. Mary will indeed be fortunate to endure for
a century. The universe, on the other hand, has been and probably will be for millions and millions of centuries. In spite of their diversity, however, the states involved in the respective cases are equally well represented by the schematic of Fig. 14. It is equally true of both states that their probability of existence decreases as we proceed away from the present in the direction of past or future. Returning now to the specific VPs selected for analysis (see pp. 68f), we may represent the meaning of (b), was, as in Fig. 16. This figure is in all respects like Fig. 14 except for the fact that the center of the state-of-being is shifted into the past, and is separated from the time of utterance (marked "present") by an indefinitely long period x (where x is greater than 0). Otherwise, the interpretation of was is like that of is.
Fig. 16. [Schematic like Fig. 14, but with the center of the state-of-being shifted into the past, separated from the present by an indefinite interval x > 0.]
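The contrast between Figs. 14 and 16 amounts to locating the predicated state's interval relative to the moment of utterance; the sketch below uses arbitrary numeric time units, which are our own illustrative device and not part of the original analysis.

```python
# Sketch of Figs. 14 and 16: a predicated state as an interval on the
# time line, located relative to the present (t = 0). The time unit is
# arbitrary; only the relative positions matter.
NOW = 0

def locate(start, end):
    """Classify a state's interval against the present, as in the figures."""
    if start <= NOW <= end:
        return "is"        # the state occupies the present (Fig. 14)
    if end < NOW:          # separated from the present by a gap x > 0 (Fig. 16)
        return "was"
    return "will be"       # the state lies wholly in the future

print(locate(-5, 5))    # is
print(locate(-9, -4))   # was
print(locate(3, 8))     # will be
```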
VPs (c), goes, and (d), went, can be understood in a way parallel to (a), is, and (b), was, respectively. The important difference between the pairs (c) (d) and (a) (b) is that the first pair signifies a state of 'going' while the second signifies 'being' or 'existence'. The states may be defined and differentiated by employing the notion set or class. If we treat 'being' as a property of a set of objects, we may give it an extensional definition by pointing out some of the objects of which it is a property, and we may give it an intensional definition by describing the property itself, e.g., saying that 'being' or 'existence' amounts to the occupation of time by an object, event, or state. 'Going' may similarly be defined both extensionally and intensionally. The problem of defining specific states, however, is of less interest to us than the more general problem of locating states (as they are referred
to by VPs) on the dimension of time. Therefore, we will concentrate on the more general problem. VPs (e), 'does V', and (f), 'did V', have essentially the same meaning as (c) and (d) with the exception that (e) and (f) stress the truth of the assertion in question. Compare, for instance, the sentences He goes and He went with He does go and He did go. In the latter pair, the truth of the assertion of someone's having gone or going is stressed. It is interesting that the form of the VPs (e) and (f) is also employed where a negation or a question is involved. In the case of negation, the truth of a certain assertion is being denied, and in the case of a question, we have a request for the affirmation or denial of the truth of an implied assertion. In each case, some sort of emphasis is afforded the truth value of an assertion. Examples (e) and (f) are unique in our treatment of VPs thus far since they consist of more than one unit each. In order to be consistent with the principle of analysis used in the treatment of NPs above, it is necessary to determine the DIRECTION of the relation of modification holding between the constituents of (e) and (f). There are several reasons for assuming that the V is the modifier and does or did the unit modified. The primary justification for this assumption is that we have sentences like (1) Mary does or (2) Mary did. In these instances, does and did clearly function as the central VP element, since they are the only VP element. They also carry tense and person markers. However, we do not have sentences like (3) *'Mary V' (with the same intonation as sentences 1 and 2). Out of context, (1) and (2) are ambiguous as to what exactly is predicated of Mary. By the addition of a V (as in Mary does talk), the ambiguity is partially resolved.
The fact that it is the function of a modifier to limit meaning, or to disambiguate, in addition to the other facts already noted leads us to treat the V as the modifier and does or did as the units modified. Thus, employing the notational conventions used in the analysis of NPs, we arrive at the following functionals:
(e) (does) V
(f) (did) V
For examples (g) and (h) we have:
(g) (is) Ving
(h) (was) Ving
The meanings of these VPs are parallel to is and was, respectively, as far as location on the time dimension is concerned. They may thus be represented as in Figs. 14 and 16. They are also ambiguous with respect to signifying a continuous state or a series of relatively discrete states. (See Fig. 15.) A fact which may differentiate the meanings of (g) and (h) from (e) and (f) is that (g) and (h) are presumably used much more frequently than (e) and (f) with the meaning of a continuous on-going state (as depicted in Figs. 14 and 16) rather than a series of discrete states (as in Fig. 15). The series meaning for (g) and (h), however, cannot be ruled out as a possibility. Examples (i) and (j) can be analyzed:
(i) (is) Ven
(j) (was) Ven
If the V in question is transitive, then (i) and (j) are passive. That is, the action signified by the V is received by the referent of the subject. For example, if the V were take, the VP is taken would have the meaning of the VP takes plus the passive meaning. If the V in (i) or (j) is intransitive, e.g., go, the Ven (in this case, gone) specifies a perfect or completed state. Examples (k)-(n) may be analyzed as follows:
(k) (has) been
(l) (has) Ven
(m) (had) been
(n) (had) Ven
The forms has and had are the forms modified while the various past participles are modifiers. (The rationale for selecting has and had as head-units is the same as that given for does and did on p. 73.) If for an example of VP (k) we take the sentence
Mary has been, we may represent its meaning as in Fig. 16. This means that the sentence Mary has been is in some sense synonymous with Mary was. There is a noteworthy case, however, when the two sentences are not synonymous. If we say Mary was here for two years and Mary has been here for two years, the meanings are clearly distinct and are easily visualized. The meaning of the former can be depicted as in Fig. 16 as long as the state of Mary's being here is shown to have existed for a two year segment of time up to the beginning of x. The meaning of the latter can be depicted in the same way with the stipulation that x be equal to zero. If for an instance of VP (l) we select the sentence Mary has gone, its meaning may also be represented as in Fig. 16 as long as x is greater than zero. Thus, the state of Mary's going is shown to have been completed at some time prior to the utterance of Mary has gone. In the cases of VPs (m), had been, and (n), had Ven, the meaning changes only slightly and can be represented in Fig. 17.

Fig. 17. [Timeline: the predicated state S extends up to p'; the interval x' (x' > 0) separates p' from the past reference point p, which lies an indefinite distance x before the present.]

The figure shows that relative to some past point of reference p, which is an indefinite distance x from the PRESENT, S (the predicated state) existed until time p' which is an indefinite distance x' (where x' is greater than zero) from p. For example, if we said
Mary had been a princess for fifteen years before she became queen, the state of being a queen would have begun at p' and would have existed through the segment of time marked x'. Prior to the point marked p', Mary was a princess. VPs (o) and (p) have the following relational structures:
(o) ((has) been) Ving
(p) ((had) been) Ving
Their meanings may be represented in Figs. 18 and 19, respectively. In the case of (o), has been Ving, the predicated state, S, occupies a segment of time prior to and inclusive of the present.

Fig. 18. [Timeline: S extends through a span ending at, and including, the present.]

In the case of (p), had been Ving, the predicated state, S (Fig. 19), which is referred to by the present participle occupies a segment of time which terminates at some point p which is x distant from the present (where x is greater than zero).

Fig. 19. [Timeline: S terminates at p, an indefinite distance x before the present.]

We note that VP (o),
has been Ving, is at least partially synonymous with VP (g), is Ving. (Compare Figs. 14 and 18.) The difference between them lies in the fact that the point of reference for (o) is toward the end of the state predicated, while for (g), the point of reference (which in this case happens to be the PRESENT) is at the center of the state. A similar comparison can be made between (h) and (p). VPs (q), has been Ven, and (r), had been Ven, are parallel in structure to (o) and (p). If the V of the Ven in question happens to be transitive, (q) and (r) are passive. Otherwise, the Ven is to be considered as a predicate modifier where the perfect state that it signifies is asserted as a property of the state-of-being predicated of the subject's referent.
VPs (s), has been being Ven, and (t), had been being Ven, have the following structure and also have the possibility of the passive meaning:
(s) (((has) been) being) Ven
(t) (((had) been) being) Ven
In fact, an attempt to generate a VP with the form of (s) or (t) without the passive meaning meets with some difficulty. While it is possible to say Mary has been being gone a great deal these days, the sentence Mary has been gone a great deal these days has essentially the same meaning and is more economical. We might expect, therefore, that non-passives with the form of (s) or (t) would be extremely rare. The explanation for the rarity of non-passives of this sort goes back to something said earlier — namely, that the state-of-being underlies all other states. In saying Mary is gone, we automatically imply that she is 'being gone'. Thus, Mary is being gone is somewhat odd. By the same token, the sentence Mary has been gone implies that 'Mary has been being gone'. In short, the meanings of (s), has been being Ven, and (t), had been being Ven, are equivalent respectively to (o), has been Ving, and (p), had been Ving, plus the passive meaning. Or, (s) and (t) are equivalent to (o) and (p) plus a Ven as a modifier where the Ving of (o) and (p) is being. The remaining types of VPs all involve Modals. They have the following relational structures:
(u) (Modal) V
(v) ((Modal) be) Ving
(w) ((Modal) be) Ven
(x) ((Modal) have) Ven
(y) (((Modal) have) been) Ving
(z) (((Modal) have) been) Ven
(a') ((((Modal) have) been) being) Ven
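The left-nested bracketings listed above can be generated mechanically from the sequence of head units; the helper below is our own sketch of that pattern, not part of the original analysis.

```python
# Sketch of the head-modifier bracketing of (e)-(a'): the leftmost unit
# is the innermost head, each following auxiliary modifies the unit
# built so far, and the final participle or V modifies the whole.
def bracket(heads, final):
    """bracket(['has', 'been', 'being'], 'Ven') -> '(((has) been) being) Ven'"""
    unit = f"({heads[0]})"
    for aux in heads[1:]:
        unit = f"({unit} {aux})"
    return f"{unit} {final}"

assert bracket(["does"], "V") == "(does) V"                        # VP (e)
assert bracket(["has", "been"], "Ving") == "((has) been) Ving"     # VP (o)
assert bracket(["Modal", "have", "been", "being"], "Ven") == \
    "((((Modal) have) been) being) Ven"                            # VP (a')
print(bracket(["had", "been"], "Ven"))   # ((had) been) Ven: VP (r)
```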
Let us now consider the effect of selecting different modals to fill the Modal position for each VP. Suppose we select will. By filling the V position with the word go, for VP (u) we have will go. We may represent the meaning of this VP in Fig. 20.
Fig. 20. [Timeline: the state S begins at a future point p, an indefinite distance x after the present.]
In the figure it is shown that the asserted state is expected to take place in the future. The state, S (in this case 'going'), will begin to occupy time at some point p which is an indefinite distance x from the present. In the case of each of the other VPs involving the Modal will, a similar analysis applies. The Modal will simply asserts a future rather than a present or past state. Thus, the meanings of VPs (v)-(a') can be represented by the corresponding diagrams of the modifying VP (that is, the VP which follows the Modal) by merely shifting the asserted state into the future. It is clear that will be going is the future form of is going;
will have gone is the future of has gone; will have been going of has been going; and will have been being taken of has been being taken. Each of the other Modals has a meaning which functions in a way analogous to will. The Modal would, for instance, asserts a state-of-being which is dependent upon the existence of some other state.24 VPs involving would assert that if a given state materializes, the state asserted in the clause containing would will also materialize. If someone says I would go if he went, the 'going' signified in the main clause will take place if the 'going' signified in the if-clause takes place. The Modal can asserts the present real possibility of some state-of-being. The sentence Mary can go to work asserts that there are no real (foreseen) factors which will prevent Mary's going to work. The Modal could asserts the hypothetical possibility of some state-of-being and like would may be dependent on the materialization of some other state. The sentence Mary could go to work asserts the hypothetical possibility of Mary's going to work. In some cases, it may be that it is actually no longer possible for the asserted state to materialize. For example, the sentence Mary could have gone to work yesterday, but she did not may be quite true even though it is impossible for Mary now to go to work yesterday. The Modal should asserts a moral (or other) obligation for the materialization of a state.25 It is synonymous with ought to. The Modal may has at least two possible meanings. It may assert permission for the actualization of a state as in You may go now. Or it may assert the possibility of a state while allowing for other possibilities, e.g., He may or may not go.

24 It is necessary here again to mention the preliminary nature of the analyses under consideration. The reader will have no difficulty in conjuring up VPs containing any one of the modals to be discussed below which will not fit the interpretation suggested. Such examples, however, do not constitute problems of principle, rather of detail. At any rate, it is assumed that the analysis sketched here could be extended to cover all cases.

25 The pedantic use of should in a way which is synonymous with would, and sometimes will, is not considered here. See note 24.

Insofar as it asserts possibility, may is synonymous with can and sometimes
the one may or can be substituted for the other. In choosing the Modal may as opposed to can, the speaker calls attention to the fact that other possibilities exist. The Modal might is even weaker in its assertion. It emphasizes the remoteness of the possibility of the existence of a state. The Modal must, at the other extreme, is the strongest of the Modals in its assertion. It asserts that the state in question is a necessity in view of other facts. In fact, the predicated state is asserted as a foregone conclusion.

2.3 The Complement

In this section we will be concerned only with transitive verbs. We assume that the semantic relation of the complement to the state signified by the predicate is unique and specifiable. Both direct and indirect objects have rather specific meanings which are given substance by their positions in sentences. Consider for example the following sentences and note how the meanings change with alterations in the position of NPs in the complement:
(1) John hits the ball to Mary.
(2) John hits Mary to the ball.
(3) John hits Mary the ball.
(4) John hits the ball Mary.
While sentences (1) and (3) are synonymous, as are (2) and (4), sentences (2) and (4) seem a bit strange — not, however, because there is anything wrong with their syntax. The problem is one of plausibility (i.e., the subjectively judged probability that the situation signified by the sentence would actually exist). That the problem is not one of grammaticalness is strongly supported by the simple fact that sentence (2) has precisely the same syntactic structure as does sentence (1). Sentences (3) and (4) are similarly parallel. It should be clear, therefore, that the undeniable strangeness of (2) and (4) must be due to the implausibility of the situations they signify. At any rate, it is obvious that sentences (1) and (3) mean something different from (2) and (4). It is also clear
that the difference is due to the order of units. This brings us to the central problem in the analysis of the complement — namely, to determine the meaning of the direct and indirect object slots, their interrelationships, and their relationship to the VP. The following figure is an expansion of the complement state (Cmp) shown in Fig. 12 above. In Fig. 21, below, IO = indirect object, DO = direct object, Vprt = verb particle, and OPr = object pronoun.
Fig. 21
In the grammar, states which are enclosed in dotted lines are equivalent. That is to say, one can be substituted for the other without a change in meaning. Now, reading from right-to-left and from top-to-bottom on Fig. 21, let us generate the possible Cmp types and place them in a sentence context for analysis.

(a) The boy hits Mary the ball.
(b) The boy hits the ball to Mary.
(c) The boy hits the ball.
(d) The boy turns in the paper.
(e) The boy turns in the paper to the teacher.
(f) The boy turns the paper in to the teacher.
(g) The boy turns the paper in.
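For concreteness, the enumeration of these complement types can be sketched as a small program. The function, its slot encoding, and its names are our illustration only, not the state network of Fig. 21 itself; it simply reproduces the orderings exemplified in (a)-(g) under the assumption that a particle verb and a plain transitive verb license different orderings.

```python
def cmp_types(verb, do, io=None, prt=None):
    """Enumerate complement (Cmp) orderings for one VP, in the spirit
    of examples (a)-(g). do = direct object, io = indirect object,
    prt = verb particle (all hypothetical slot names)."""
    out = []
    if prt is None:
        if io:
            out.append(f"{verb} {io} {do}")           # (a) hits Mary the ball
            out.append(f"{verb} {do} to {io}")        # (b) hits the ball to Mary
        out.append(f"{verb} {do}")                    # (c) hits the ball
    else:
        out.append(f"{verb} {prt} {do}")              # (d) turns in the paper
        if io:
            out.append(f"{verb} {prt} {do} to {io}")  # (e) turns in the paper to the teacher
            out.append(f"{verb} {do} {prt} to {io}")  # (f) turns the paper in to the teacher
        out.append(f"{verb} {do} {prt}")              # (g) turns the paper in
    return out

for cmp in cmp_types("hits", "the ball", io="Mary"):
    print("The boy", cmp + ".")
```

Placing each generated Cmp after the subject "The boy" reproduces sentences (a)-(c); calling the function with "turns", "the paper", and the particle "in" reproduces (d)-(g).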
The simplest of the exemplified Cmps are those which contain only a DO. We note that an IO without a DO, however, is quite rare.26 It seems that in most cases, in order for the IO to receive any of the action signified by the VP, the DO must first be acted upon. Before attempting to determine the structure of the sentences (a)-(g), let us first inquire into the meanings of the DO and IO positions in the sentence. If we think of the situation signified by sentence (a), we conceive of an action which displaces the referent of the DO. The displacement is such that the object referred to moves through space until it arrives at the place occupied by the referent of the IO. Thus, we can extract three physically specifiable ingredients of the VP-DO-IO relationship, at least for the sentence in question. They are (i) action performed by the referent of the subject, (ii) displacement of the referent of the DO, and (iii) movement of the referent of the DO through
26 Consider the sentences, He turned in to the teacher, He turned over to the police, He gives to charity, etc. Certainly the last example is a perfectly good sentence, and it would probably be an error to rule out the first two as ungrammatical. Although we have not allowed the generation of an IO which is not accompanied by a DO, this would have to be provided for in a grammar aiming at completeness.
space to the location of the referent of the IO. If we were to examine a large number of sentences with the structures of (a), we would probably find it possible to identify a set of physical relationships similar to (i), (ii), and (iii) which would fully specify the meaning of the VP-DO-IO relationship. At worst, we would expect to find a set of sets S1, S2, ..., Sn of such relationships which would enable us to explain the meaning of any VP-DO-IO relationship via at least one of the sets S1, S2, ..., Sn. Let us consider a few further examples to see if conditions (i)-(iii) are at all close to the general meaning of the VP-DO-IO relationship.

(5) The man wrote the letter to the woman.
(6) They turned the man over to the police.
(7) The judge gave the man two years.

In (5) the referent of the DO is an object (i.e., a thing) created by the referent of the subject and is intended for the referent of the IO. In (6) the referent of the DO is acted upon by the referent of the subject, which causes it to come into a specific relationship with the referent of the IO. In (7) the referent of the DO is brought into a relationship with the referent of the IO such that after the action of the subject's referent, the referent of the IO receives the prison sentence of two years. In spite of the considerable diversity of the relationships between the various VP-DO-IO units, it seems fairly clear that the relationship between the VP and DO brings about the relationship between the VP-DO and IO. Thus, it may be assumed that the state signified by the VP has a direct effect on the referent of the DO, and this effect is then passed on in some way to the referent of the IO. In any case, this definition will suffice for the present purposes, and it will account for the passive meaning which we posited earlier. It also motivates the following analysis of the relational structures of the example sentences:
[Diagrams of the relational structures of example sentences (a)-(d); the numerical relation indices are not recoverable from the source.]
[Fig. 24: the word-tokens 'ball1', ..., 'balln' and the object BALL give rise to the percepts P[ball]1, ..., P[ball]n and P[BALL]1, ..., P[BALL]n, which are induced into the concepts C/ball/ and C/BALL/; these in turn stand in the naming relation P[N (C/ball/, C/BALL/)].]
2.3 The Refinement and Extension of Concepts

Now, let us inquire into the nature of the knowledge that the child has gained via the inductions summarized in formulas (i)-(iv). To begin with, he has learned to recognize two physical patterns, the object BALL and the word ball, and he has learned to recognize a pattern consisting of a relation between a word and an object.9 The number of actual patterns which the child
9 See note 8.
LANGUAGE ACQUISITION
has thus learned to deal with is indefinitely large. At any given time, there will be a set of percepts which the child has previously encountered, and from which he has induced a concept which will enable him to recognize an indefinitely large number of instances of the pattern in question in the future. There is also, of course, a margin of error to contend with. Suppose, for instance, that we replaced the child's little red ball every five minutes with another one just like it without letting him know. There is no reason to assume that the child would not mistakenly believe that the ball in his crib was the same ball that he had seen and handled in the previous five minute period. In fact, it would probably be possible to replace the little red ball with a slightly larger one, or a slightly smaller one, every so often without the child's becoming aware of the fact that there was more than one ball. In the same way that the child fails to recognize minute differences, we should expect that he would also fail to recognize gross similarities. Suppose that, instead of substituting a slightly different little red ball for the one in the crib, we substituted a basketball. There is no reason to expect that the child would mistake the basketball for his own little red ball, but, in a sense, this is precisely what he will later have to learn to do. At present, the word ball has the value of a proper noun, i.e., it has a single referent. However, the child will later learn that the word ball may be used to signify any member of an indefinitely large class of objects — including the basketball which at first would be clearly excluded as a possibility. What the child must and in fact obviously does do, is to refine and extend gross patterns which he has already acquired. 
Undoubtedly, there will be refinements in the perception of the word ball such that it will eventually become distinguishable from similar phonological patterns, e.g., wall, fall, mall, bar, bon, bomb, bowl, bull, bell, etc. More importantly, however, refinements and extensions will have to occur in the perception of extralinguistic objects. Not only will it become necessary to distinguish between slightly different little red spheres, but it will also be
necessary for the child to recognize the similarity of his little red ball and a basketball, say. Let us see how this sort of refinement and extension might take place via the rule of induction. The child has learned that a number of experiences which share certain properties are caused by a single object in his environment. This knowledge gives him a certain predictive power over future occurrences. It enables him to classify a never-before-encountered experience as an instance of /BALL/. Whenever such an instance occurs, all of the information previously gained about /BALL/ is brought to bear. The child will know (albeit unconsciously) that /BALL/ is an object which rolls, bounces, has a certain texture, produces certain images if you look at it, etc., etc. It is this sort of information which enables the child to learn to classify percepts of the little red ball in the first place. Now, when he is confronted with a different object, like a basketball, some of the information which was previously associated with the little red ball will also be associated with the new object. That is, the percepts caused by the little red object which lead to the induction of the concept /BALL/ will be similar in some respects to the percepts caused by the basketball which will lead to the induction of a concept /BASKETBALL/. The two concepts will be different to the extent that each will have its own distinct set of percepts. The similarities of the two sets of percepts, however, will be drawn into focus by the association of the little red ball and the basketball resulting from the use of the word ball. It is likely that the child will hear the linguistic object ball many times in connection with both the concept /BALL/ and /BASKETBALL/. The result is that eventually the child will associate the word ball with both objects, and ultimately with many others, including many colors, sizes, and shapes. 
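The way shared percept-properties support the induction and later extension of a concept can be sketched in modern computational terms. This is our illustration, not the author's formalism: the property names are invented, and modeling a concept as the intersection of its percepts' properties is a simplifying assumption.

```python
# Sketch of the rule of induction: a finite set of encountered percepts
# (the extensional evidence) yields a concept, modeled here as the
# properties common to those percepts; the concept then classifies
# indefinitely many unseen percepts (the intensional coverage).

def induce(percepts):
    """Induce a concept as the set of properties shared by all percepts."""
    return set.intersection(*(set(p) for p in percepts))

def recognizes(concept, percept):
    """A percept instantiates a concept if it has all its properties."""
    return concept <= set(percept)

ball_percepts = [
    {"round", "red", "bounces", "rolls", "in-crib"},
    {"round", "red", "bounces", "rolls", "in-hand"},
]
BALL = induce(ball_percepts)        # {"round", "red", "bounces", "rolls"}

basketball = {"round", "orange", "bounces", "rolls", "large"}
# The narrowly induced /BALL/ still demands redness, so the basketball
# is not yet recognized; only after the concept is extended over the
# new percept does the gross similarity come into focus.
print(recognizes(BALL, basketball))                             # False
print(recognizes(induce([*ball_percepts, basketball]), basketball))  # True
```

The second call models the refinement-and-extension step: once the basketball percept is drawn into the induction, the shared core (roundness, bouncing, rolling) is all the extended concept requires.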
The class of objects which he will eventually refer to as balls will include a number of objects which he will have previously experienced, and it will contain an indefinitely large number of similar objects which the child will not have previously experienced. Here we see clear evidence for the statement in Chapter II that both extensional and intensional definitions are necessary
to account for language behavior. In the learning process, the child experiences a finite number of percepts which he inductively classifies into concepts which may apply to innumerable percepts not yet encountered. The percepts which are past experience constitute the evidence for the extensional part of the definitions of psychological entities, while the percepts which the organism is capable of dealing with at any time but which have not as yet been encountered constitute the evidence for the intensional part of the definition.

2.4 The Learning of Larger Grammatical Units

In Chapter II we defined the GRAMMAR of a language as the specification of the range of meanings of its functionals — and FUNCTIONAL as any meaningful unit of the language. So far in this chapter, we have discussed a child's grammar which contains the functional 'ball', the range of meanings of which we have specified, and we have shown how the child might acquire this functional and its meanings. It remains to be seen whether the rule of induction will enable us to account for the learning of more complex functionals. As stated at the beginning of this chapter, we will not, by reason of the complexity of the problem, attempt to show in detail how a child might proceed to acquire the entire grammar of a mature speaker; however, we will attempt to achieve the more modest goal of showing how the child might, in principle, accomplish such a feat. To do this, we will select as examples several functionals which are not essentially different from the functionals which constitute the rest of the grammar. Let us suppose that in addition to the concepts /BALL/ and /BASKETBALL/ the child has acquired the concepts /SELF/, /MOTHER/, and /FATHER/. Let us further suppose that the child's mother and father often play with him in a situation where the one or the other rolls either the little red ball or a basketball to the child. The child is thus exposed to numerous percepts of the following types:
(v) P[R (C/MOTHER/p, C/BALL/p)], where R = ROLLING.
(vi) P[R (C/FATHER/p, C/BALL/p)]
(vii) P[R (C/SELF/p, C/BALL/p)]
(viii) P[R (C/MOTHER/p, C/BASKETBALL/p)]
(ix) P[R (C/FATHER/p, C/BASKETBALL/p)]
(x) P[R (C/SELF/p, C/BASKETBALL/p)]
By the rule of induction, then, it is clear that the relation of ROLLING is itself subject to induction. The process may be summarized as follows:

(xi) S (P[R (C/X/p, C/Y/p)]1, ..., P[R (C/X/p, C/Y/p)]n+1) → C/R/, where X and Y are suitable objects between which R may hold.

When this induction has occurred, the child is capable of recognizing any number of situations in which a BALL or BASKETBALL is rolled by MOTHER, FATHER, or SELF. Now, let us suppose that the child has observed all of the above situations except one in which MOTHER has the relation ROLLING to the BASKETBALL. We should expect, if he has already induced the relational concept /ROLLING/, that the child would be able to immediately recognize a situation where his mother is rolling a basketball; also, we should expect him to be able to imagine such a situation. The rule of induction by itself cannot account for this fact. Here is where the rule of substitution is helpful. Given the similarity of the basketball and the little red ball which the child has seen his mother roll, a substitution can be made:
(xii) C/BALL/ → C/BASKETBALL/

The concept /BASKETBALL/ is substituted for the concept /BALL/.
If the physical object percepts /MOTHER/, /FATHER/, and /SELF/ are associated with the linguistic units which signify them via the rule of induction, as was done in the case of the linguistic concept ball, the child may induce certain more complex grammatical functionals. Undoubtedly, during the many playtimes when the child's parents are rolling the ball, he will be exposed to the sentences:

(1) Mommy rolls the ball.
(2) Daddy rolls the ball.
(3) Billy rolls the ball. (Assuming his name is Billy.)

It is clear that the linguistic unit rolls is a percept in much the same sense that the relation ROLLING was, and it also is subject to induction. The association of the linguistic concept /rolls/ with the extra-linguistic concept /ROLLING/ will be facilitated by the fact that the former occurs in sentences which are associated with situations of which the latter is part. The linguistic concept /rolls/ will be associated not only with the relation /ROLLING/ which holds between such objects as /MOTHER/ and /BALL/, but it will also be associated with the linguistic units which occur on either side of it, e.g., Mommy and the ball. Eventually, the linguistic concept {rolls} will be induced in a manner similar to the induction of {ball}. Its extensional definition will consist of a number of occasions on which the child will have observed the act of ROLLING in close contiguity with the word rolls in the context of a sentence. The intensional definition of the concept will consist of the common properties of the events in which ROLLING was observed. When the child has learned the necessary associations to enable him to understand sentences (1)-(3), he has acquired a simple grammar. It may be represented as a finite state generative10 mechanism as in Fig. 25.

10 The term 'generative' is not to be equated with the term 'productive' which is used below (see section 2.5). The former can be read as 'fully explicit', and applies equally to the grammars of speakers and hearers.
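A finite-state mechanism of the kind just described can be rendered directly in code. Since Fig. 25 itself is not reproduced in the source, the state layout below is our assumption: each transition consumes one word, and the grammar generates exactly the sentences the child has mastered.

```python
# A possible finite-state rendering of the child's first grammar.
# States and transitions are assumed, not taken from the missing Fig. 25.
GRAMMAR = {
    "S0": [("Mommy", "S1"), ("Daddy", "S1"), ("Billy", "S1")],
    "S1": [("rolls", "S2")],
    "S2": [("the", "S3")],
    "S3": [("ball", "END")],
}

def sentences(state="S0", prefix=()):
    """Generate every sentence the finite-state grammar accepts."""
    if state == "END":
        yield " ".join(prefix)
        return
    for word, nxt in GRAMMAR[state]:
        yield from sentences(nxt, prefix + (word,))

print(sorted(sentences()))
# ['Billy rolls the ball', 'Daddy rolls the ball', 'Mommy rolls the ball']
```

Because the mechanism is fully explicit, it serves equally as a model of the hearer's grammar: recognition is just a walk through the same transitions.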
The grammar is dependent for its functioning on prior inductions and the rule of substitution. In the child's use of the grammar, we find need for the rule of substitution in order to account for novelties either in production or reception.11 Suppose, for example, that the child had experienced (either had understood or uttered with understanding) sentences (1)-(3) above, as well as the sentence:

(4) Billy rolls the basketball.

By applying the rule of substitution, he might create the new sentences:

(5) Daddy rolls the basketball.

and:

(6) Mommy rolls the basketball.

The creation of these sentences can be accounted for in the following applications of the rule of substitution:

(xiii) C{Billy} → C{Daddy}
(xiv) C{Billy} → C{Mommy}

Increasing complexities in the grammar, with the addition of an indefinitely large number of new categories and far more subtle distinctions, can be explained in a similar way, and thus create no new problems of principle.

11 See section 2.5, below.
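The substitution step sketched in (xiii) and (xiv) above is mechanically simple, as a small program makes plain. The function and its names are our illustration; it merely models how an attested sentence plus an induced equivalence between two functionals yields a novel sentence the child has never heard.

```python
# Sketch of the rule of substitution: replace one functional in an
# attested sentence with an equivalent one to produce a novel sentence.

def substitute(sentence, old, new):
    """Apply a substitution C{old} -> C{new} to a sentence."""
    return " ".join(new if word == old else word for word in sentence.split())

attested = "Billy rolls the basketball"            # sentence (4)
print(substitute(attested, "Billy", "Daddy"))      # (5) Daddy rolls the basketball
print(substitute(attested, "Billy", "Mommy"))      # (6) Mommy rolls the basketball
```

Note that the substitution presupposes the prior inductions: only because {Billy}, {Daddy}, and {Mommy} occupy the same slot in the finite-state grammar is the exchange licensed.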
2.5 Productive Versus Receptive Language

Initially, the grammar of the child will be purely receptive. That is, the child will be able to understand sentences (or words at least) without being able to utter them.12 The reason that productive (active) language often lags behind receptive (passive) language is that the productive aspect is, in some sense, the more difficult. In order to learn to speak, it is not only necessary for the child to make the appropriate associations between words and things, but he must also learn the motor skills involved in speech. A good deal of study has already been carried out in this area of the child's learning, at a rather superficial level, by Skinnerian psychologists.13 The adjective 'superficial' is justified by the fact that Skinner and his followers have deliberately avoided, as much as possible, discussions involving the postulation of any internal mechanisms.14 However, it is widely recognized, if perhaps only tacitly by Skinnerians, that a child does bring to the learning situation a highly complex and extremely sensitive perceptual mechanism as well as a powerful rational mechanism for organizing sensory data. In spite of having dismissed the need for discussing internal mechanisms, Skinner and his associates have nonetheless demonstrated that the learning of productive language depends very greatly on the consequences which follow bits of a child's verbal behavior. It is clearly beyond doubt that certain consequences promote learning (in this case, the acquisition of categories and motor skills), and others inhibit it. Skinnerians do not attempt to explain this fact by discussing its underlying causes; they merely accept it. To do this, however, is to condone a rather unscientific approach which actually encourages the avoidance

12 A study by Fraser, Bellugi, and Brown ("Control of Grammar") has shown that children are able to imitate sentences that they apparently cannot produce cognitively (i.e., utter on their own with understanding), and that they can understand sentences which they cannot produce cognitively.
13 For an uncritical review of the literature on this subject, see Krasner ("Studies"). For less friendly remarks, see Chomsky (review 1964).
14 See Skinner's introductory chapter to his book Verbal Behavior (1957).
of what is probably one of the most important problems of human psychology. In any case, in the present approach, we will attempt to incorporate the virtues of reinforcement theory while eschewing its blindness to certain essential problems. One of the virtues of Skinnerian psychology which we should wish to include in the present theory is the assumption that the direction of learning is very much determined by environmental factors. Though this observation is illuminating with respect to certain external and observable events, by itself it leaves us in a quandary with respect to that part of the environment which happens to be inside the organism. This, of course, at least partly explains the success of behaviorists in dealing with productive language, which can be directly observed, and their predictable lack of success in dealing with receptive language, which is most often totally unobservable. At least part of the shortcomings of the Skinnerian system may be eliminated by postulating a rule of induction. Such a rule enables us to account for the recognition of pattern which is the basis of a receptive language repertoire. Since induction itself may be a highly reinforcing event, it may be largely unnecessary to look for other environmental consequences which would encourage the development of receptive language. In accounting for the development of a productive repertoire, external events are bound to play a more important role. However, they can by no means override the importance of the pattern recognition process. Reinforcement, in fact, depends on the recognition of pattern in many cases. For instance, in his babblings, the child will occasionally hit upon a phonological sequence which is a high frequency pattern in his environment. His own recognition of this sequence (if and when he does recognize it) may be a highly reinforcing event.
In addition, the reinforcement by an attending adult who may praise the child or repeat the sequence several times is also dependent on pattern recognition. Thus, the developments of receptive and productive language proceed semi-independently, but in the final analysis both depend
on the same principles of learning. For this reason, we will not again refer to the differences between receptive and productive repertoires, but will proceed to consider problems of relevance to both.
3. THE ORDER HYPOTHESIS
The discussion of section 2.0 strongly suggests that learning of language proceeds in a step-by-step fashion. Thus it would seem reasonable to inquire into the possibility of a fairly strict order for the acquisition of conceptual categories, including words and word classes. With one very important qualification, we will see that the present theory would clearly point us in the direction of such a possibility. The qualification has to do with the semi-independence of the development of concepts having to do with extra-linguistic objects, and concepts relating to grammatical units (words included). For words with very non-abstract meanings, it seems reasonable to expect that the extra-linguistic concepts would have to be learned BEFORE those words could be used meaningfully. That is, the child might learn to say the word mama, for instance, before he became aware of the fact that this word might be used to summon his mother, but he could in no case use that word communicatively until he had induced the concept /MOTHER/ and the concept /mama/. The problem of order in acquisition becomes even more complex when we come to more abstract grammatical units, e.g., the, a, no, etc. How much extra-linguistic information is necessary before one can understand the meaning of the word the? Below, we will see that this question, though not at all an easy one, can probably be answered adequately in terms of the theory suggested in this chapter. Certainly, the learner will have acquired the notion 'class' in some sense, as well as the notion 'pre-specified member of a class'. Here, we make mention of this problem only to be able to formulate more clearly the following
question: is extra-linguistic information always necessary prior to language learning (i.e., the acquisition of linguistic units), or does language itself cause the learner to attend to certain extralinguistic facts? An answer to this question should help resolve some of the disagreements in recent years between those whose sympathy lies with the Whorfian viewpoint and those who hold to the notion of linguistic universals (and other psychological universals as well).15 The principles of learning which are proposed here would suggest that an extreme adherence to the Whorfian conception or to the position of universalists would probably be unwise. If the theory of learning advanced here is at all correct, it is clear that some concepts related to extra-linguistic objects can develop quite independently of linguistic concepts. Moreover, it is also obvious that the development of linguistic concepts is in an important sense bound up with and dependent upon the development of extra-linguistic ones. For example, a child certainly cannot use the word elephant correctly in a referential sense until he has learned what an ELEPHANT (the animal in the real world) is. To the contrary, however, the child may know perfectly well what an ELEPHANT is, long before he learns the word elephant. This would seem to rule out extreme Whorfianism. That is to say, it is not the word which causes the child to see an object in a particular way; it is rather the object in its peculiar relationship to the word which enables the child to fully understand the word. While it is true that the acquisition of linguistic concepts cannot proceed far without the development of extra-linguistic concepts, the reverse is not true. A child may become quite sophisticated (according to the present theory) in his capacity to recognize recurrent patterns in the world around him, without yet knowing a single word.
In fact, it is this possibility which makes it difficult to predict a strict order of acquisition of words and word classes. Provided that the child has attained the necessary extra-linguistic
15 For the Whorfian viewpoint, see Whorf's selected writings edited by Carroll (Language). For a criticism of it, see Haugen ("Language Contact").
concepts, he may break into the linguistic system of his language at any one of a number of levels. It is at least possible that a child's first meaningful linguistic concept should be something as abstract as {here} used as a question. The relative accessibility of other linguistic concepts would seem, however, to make this possibility extremely remote. Thus, while the present theory allows for a great deal of flexibility in the order of concept attainment, IT SEEMS GENERALLY TRUE THAT A CHILD LEARNS LESS ABSTRACT UNITS BEFORE MORE ABSTRACT ONES. We will refer to the latter statement as the 'order hypothesis'. It will be difficult to test for at least two reasons in addition to the flexibility of the learning process already noted. One difficulty lies in the fact that it is not always possible to tell whether or not a child understands what he hears. Another difficulty is the fact that it is not always possible to determine what a child means (if anything) when he says something. In spite of these difficulties, there are, nonetheless, some cases in which it is possible to tell in a fairly unambiguous way whether or not a child understands a given linguistic unit. By observing instances of this sort, it should be possible to test the validity of the order hypothesis. While the detailed specification of such a test — which requires at least the careful observation of one or more children learning their language over an extended period of time — is beyond the scope of the present work, we will nevertheless suggest some predictions entailed by the hypothesis, and some supportive evidence. We have already noted that nouns are often the first words learned by a child. This would tend to support the order hypothesis, and seems to suggest that we might go further to predict the following order of acquisition for some of the major grammatical categories:

(a) nouns
(b) adjectives and verbs
(c) prepositions and other connectors
(d) determiners and adverbials
The list is not assumed to be complete, and in all cases subclasses with sub-ordering among them seem possible. For example, in category (a) we might separate proper nouns and count nouns, assuming that the former would be learned ahead of the latter. If we refer to the amount and type of information which would have to be available to the child before the respective word classes could be induced, it is quite clear why we should expect the child to learn proper nouns ahead of count nouns. In order to learn a proper noun, he must become able to recognize an extralinguistic object, e.g., MOTHER, and must associate this object with a word which he must also be able to recognize, e.g., Mama. To learn a count noun, on the other hand, the child must associate a word (or words, if we consider plural formations) with a whole class of extra-linguistic objects. Since count nouns are more abstract, if it should indeed take the child longer to learn them, this fact would support the order hypothesis. Let us go on now to consider an example which involves ordering across the major categories listed above. Why should we expect, for instance, that adjectives should be learned ahead of determiners? Suppose we take the adjective red and the determiner the, beginning with a discussion of the information necessary for the learning of the meaningful linguistic concept {red}. In the first place, the child must become able to recognize instances of the color RED, and instances of the word red. The induction of the linguistic concept /red/ is not essentially different from the learning of other linguistic concepts, and thus presents no new problems. The induction of the color concept /RED/, however, merits some special consideration. While the child will undoubtedly be stimulated by the color-distinctiveness of, say, his little red ball, there is no compelling reason for him to separate the redness of the ball from its roundness, bounciness, or other properties.
That is, except insofar as the color RED contrasts with the colors of other objects. Certainly, if all the objects in his environment were the same shade of RED, there would be no way for the child to notice the redness of the little ball. There are, however, at least two ways that a configuration
of color patterns in his environment might cause the child to notice the distinctiveness of the color RED. One possibility is for the child to be exposed to a number of red objects in his environment in order for him to note their color-similarity. Another possibility is for two or more objects to be alike in all respects except color. In either case, the color RED could be noticed as a distinctive pattern, and hence could be induced. Here, language may play an important role by directing the child's attention to the commonality of otherwise diverse situations. For example, the fact that he hears the word red on a number of different occasions, coupled with the fact that he knows that certain other words have the function of signifying certain objects or aspects of the environment, may cause him to be predisposed to look for the commonality of those occasions. If this is so, the Whorfian hypothesis seems somewhat vindicated. The induction of the concept {red} may be summarized as follows:

(xv) S (P[red]1, ..., P[red]1+n) → C/red/
(xvi) S (P[RED]1, ..., P[RED]1+n) → C/RED/
(xvii) S (P[N (C/red/p, C/RED/p)]1, ..., P[N (C/red/p, C/RED/p)]1+n) → C{red}
Now, let us consider the information necessary for the induction of the determiner the. When followed by a singular count noun, the word the has the function of limiting the object referred to, to a pre-specified or pre-selected (unique) member of a set (where the set is given by the pluralization of the singular noun in question). By comparison to red-ness, the-ness — in the sense just given — is a good deal more abstract. In fact, before the child will be able to learn what the-ness is, or better, particularity (in the sense of "prior specificity"), he will have to have associated objects into sets. We have already suggested how such associations might be developed via the rule of induction. The remaining problem is how the child might learn to use the word the in order to refer to a particular member of a set of objects.
It is quite impossible for his parents or any other native speakers of English to point to an object and say "This is the-ness (i.e., particularity)". The meaning of the simply does not relate to an obvious property of things in the world as did the less abstract meaning of red. Although it seemed possible for the concept {red} to be induced by proceeding either from the object world to the linguistic concept, or vice versa, it seems likely that the concept {the} can only be induced by proceeding from the linguistic unit the in connection with other units to the meaningful concept. In other words, the linguistic unit the will have to be experienced in numerous communicative situations before it will be associated with a property of the contexts at issue. Certainly, the child will be exposed to numerous situations where the is used to signify the particularity of an object referred to, e.g., the phrases the ball, the rattle, the bottle, etc., etc. While it is true that only a relatively small number of objects may be called 'red', any object whatever may be referred to as 'the object'. Not only is the concept of particularity more abstract than the concept of redness, but it is also more general. There are simply more particular objects than there are red ones. In the same way that particularity is a more general property of objects than redness, the word the is a more general modifier of words which refer to objects than red is. It is interesting to note that the determiner the obligatorily precedes any other possible modifier within the noun phrase. A possible route for the induction of {the} is summarized as follows:

(xviii) S (P[the]1, ..., P[the]1+n) → C/the/
Formula (xviii) summarizes the process whereby the child learns to recognize recurrences of the word the. The concept of particularity, which the word the signifies, is somewhat more difficult to achieve. By treating the as a modifier via an application of the rule of substitution, the communicative contexts in which the is used may be summarized as follows:
(xix) M (N (C/the/p, P[PARTICULARITY]p), C(Nsg)), where Nsg = any suitable singular noun, M = the grammatical relation of modification.
Given such a context, the percept [PARTICULARITY] can be singled out and induced:

(xx) S (P[PARTICULARITY]1, ..., P[PARTICULARITY]1+n) → C/PARTICULARITY/
Once the latter induction has occurred, the concept {the} is achieved by substituting the concept /PARTICULARITY/ for the corresponding percept of formula (xix). The concept {the} arises from the naming (or signifying) relation which holds between the concepts /the/ and /PARTICULARITY/:

(xxi) S (P[N (C/the/p, C/PARTICULARITY/p)]1, ...