THE OXFORD HANDBOOK OF ENGLISH GRAMMAR
OXFORD HANDBOOKS IN LINGUISTICS
Recently published
THE OXFORD HANDBOOK OF UNIVERSAL GRAMMAR Edited by Ian Roberts
THE OXFORD HANDBOOK OF LANGUAGE AND SOCIETY Edited by Ofelia García, Nelson Flores, and Massimiliano Spotti
THE OXFORD HANDBOOK OF ERGATIVITY Edited by Jessica Coon, Diane Massam, and Lisa deMena Travis
THE OXFORD HANDBOOK OF WORLD ENGLISHES Edited by Markku Filppula, Juhani Klemola, and Devyani Sharma
THE OXFORD HANDBOOK OF POLYSYNTHESIS Edited by Michael Fortescue, Marianne Mithun, and Nicholas Evans
THE OXFORD HANDBOOK OF EVIDENTIALITY Edited by Alexandra Y. Aikhenvald
THE OXFORD HANDBOOK OF LANGUAGE POLICY AND PLANNING Edited by James W. Tollefson and Miguel Pérez-Milans
THE OXFORD HANDBOOK OF PERSIAN LINGUISTICS Edited by Anousha Sedighi and Pouneh Shabani-Jadidi
THE OXFORD HANDBOOK OF ENDANGERED LANGUAGES Edited by Kenneth L. Rehg and Lyle Campbell
THE OXFORD HANDBOOK OF ELLIPSIS Edited by Jeroen van Craenenbroeck and Tanja Temmerman
THE OXFORD HANDBOOK OF LYING Edited by Jörg Meibauer
THE OXFORD HANDBOOK OF TABOO WORDS AND LANGUAGE Edited by Keith Allan
THE OXFORD HANDBOOK OF MORPHOLOGICAL THEORY Edited by Jenny Audring and Francesca Masini
THE OXFORD HANDBOOK OF REFERENCE Edited by Jeanette Gundel and Barbara Abbott
THE OXFORD HANDBOOK OF EXPERIMENTAL SEMANTICS AND PRAGMATICS Edited by Chris Cummins and Napoleon Katsos
THE OXFORD HANDBOOK OF EVENT STRUCTURE Edited by Robert Truswell
THE OXFORD HANDBOOK OF LANGUAGE ATTRITION Edited by Monika S. Schmid and Barbara Köpke
THE OXFORD HANDBOOK OF ENGLISH GRAMMAR Edited by Bas Aarts, Jill Bowie, and Gergana Popova
For a complete list of Oxford Handbooks in Linguistics please see pp. –.
.........................................................................................................................................
THE OXFORD HANDBOOK OF ENGLISH GRAMMAR
.........................................................................................................................................
Edited by
BAS AARTS, JILL BOWIE, and GERGANA POPOVA
Great Clarendon Street, Oxford, United Kingdom
Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and in certain other countries
© editorial matter and organization Bas Aarts, Jill Bowie, and Gergana Popova
© the chapters their several authors
The moral rights of the authors have been asserted
First Edition published in
Impression:
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by licence or under terms agreed with the appropriate reprographics rights organization. Enquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above
You must not circulate this work in any other form and you must impose this same condition on any acquirer
Published in the United States of America by Oxford University Press, Madison Avenue, New York, NY, United States of America
British Library Cataloguing in Publication Data
Data available
Library of Congress Control Number:
ISBN ––––
Printed and bound by CPI Group (UK) Ltd, Croydon
Links to third party websites are provided by Oxford in good faith and for information only. Oxford disclaims any responsibility for the materials contained in any third party website referenced in this work.
ACKNOWLEDGEMENTS
...................................................................
There are many people who have helped shape the present volume, to whom we owe a debt of gratitude. First and foremost, we would like to thank our contributors for their inspiring work and their patience with the revision process. We are also enormously indebted to them for reviewing contributions to the handbook other than their own. Without their expertise and dedication a volume like this could not have come into being. We would also like to thank the colleagues who refereed individual chapters for us. Their scholarship, careful engagement with the rationale of the volume, as well as their constructive and detailed feedback were invaluable and given selflessly. We are also extremely grateful to colleagues at Oxford University Press and to the production team for their professional help and patience during the gestation of the volume.
Bas Aarts
Jill Bowie
Gergana Popova
CONTENTS
...............................
List of figures and tables
List of contributors
Introduction
PART I GRAMMAR WRITING AND METHODOLOGY

Conceptualizations of grammar in the history of English grammaticology
Margaret Thomas

Syntactic argumentation
Bas Aarts

Grammar and the use of data
Jon Sprouse and Carson T. Schütze

Grammar and corpus methodology
Sean Wallis

PART II APPROACHES TO ENGLISH GRAMMAR

Cognitive linguistic approaches
John R. Taylor

Constructional approaches
Martin Hilpert

Dependency and valency approaches
Thomas Herbst

Generative approaches
Terje Lohndal and Liliane Haegeman
Functional approaches
J. Lachlan Mackenzie

Modern and traditional descriptive approaches
Rodney Huddleston and Geoffrey K. Pullum

Theoretical approaches to morphology
Andrew Spencer

PART III SUBDOMAINS OF GRAMMAR

Inflection and derivation
Andrew Spencer

Compounds
Laurie Bauer

Word classes
Willem B. Hollmann

Phrase structure
Robert D. Borsley

Noun phrases
Evelien Keizer

Clause structure, complements, and adjuncts
Patrick Duffley

Clause types and speech act functions
Ekkehard König

Tense and aspect
Ilse Depraetere and Anastasios Tsangalidis

Mood and modality
Debra Ziegeler

Subordination and coordination
Thomas Egan

Information structure
Gunther Kaltenböck
PART IV GRAMMAR AND OTHER FIELDS OF ENQUIRY

Grammar and lexis
Doris Schönefeld

Grammar and phonology
Sam Hellmuth and Ian Cushing

Grammar and meaning
Ash Asudeh

Grammar and discourse
Jill Bowie and Gergana Popova

PART V GRAMMATICAL VARIATION AND CHANGE

Change in grammar
Marianne Hundt

Regional varieties of English: Non-standard grammatical features
Peter Siemund

Global variation in the Anglophone world
Bernd Kortmann

Genre variation
Heidrun Dorgeloh and Anja Wanner

Literary variation
Lesley Jeffries

References
Name index
Subject index
LIST OF FIGURES AND TABLES
.......................................................................................
Figures
. Lumping of nouns and pronouns
. Reassignment of adjunctizers to the preposition class
. Three illustrations of the types of information available from acceptability judgements
. The visual logic of factorial designs, illustrated using the whether-island effect design
. Three types of evidence
. The A perspective in corpus linguistics
. Example of a Key Word in Context concordance for the lexical item school in ICE-GB, showing adjacent word class labels
. A Fuzzy Tree Fragment for an adjective phrase (AJP) containing a general adjective head (AJHD, ADJ(ge)) with at least one premodifier (AJPR) and one postmodifier (AJPO)
. A simple grammatical concordance
. Examining any line in the concordance displays the sentence and phrase structure tree, showing how the FTF matches a tree in the corpus
. Comparing frequency distributions over different time periods
. FTF for retrieving a positive (‘¬neg’ = not negative) auxiliary verb, verb or VP (‘∨’ = ‘or’), followed by a tag question with a negative auxiliary or verb
.a Dependency representation of in the kitchen
.b Dependency tree diagram of in the kitchen
. Stemma representing the structural order of ()
.a Phrase structure (constituent structure) of ()
.b Dependency stemma of ()
.c Dependency representation of ()
. Dependency stemma of ()
. Representation of transfer in the case of de Pierre and Pierre’s
. Stemma for Then she changed her mind
. Dependency representation of ditransitive give in Word Grammar
. Dependency representation of coordination in Word Grammar
. Semantic dependency of Leo sent a letter to Alan
. Syntactic and semantic dependencies of complements and modifiers of w
.a Analysis of grammatical non-projective structure That pizza I will not eat with crossing lines
.b Analysis of That pizza I will not eat involving rising
.a Analysis of () without rising
.b Analysis of () with rising
. The ditransitive construction
. Structure of whose book everyone said they had enjoyed
. Diagrams of NPs with Determiner-Head and Modifier-Head function fusion
. Diagram of a fused relative NP with Head-Prenucleus function fusion
. Diagram of the noun phrase this once which contains a fused Modifier-Head
. The basic structure of clause types
. Lambrecht’s () assumed mental representation of discourse referents (alternative terms by Prince and Chafe in brackets)
. Network of choices associated with her
. Inverted T-model
. A partial schema network for a verb (adapted from Taylor : )
. The grammaticalization chain for the modal idiom (had) better in the Late Modern Period
. Being to V and semi-modal having to V (frequency per million words) in the OBP corpus
. Proportion of periphrastic constructions and mandative subjunctives in the Brown family of corpora
. Diachronic development of core modals in the Brown-family of corpora
. Diachronic development of semi-modals in the Brown-family corpora
. Morphosyntactic distinctions along a continuum of ‘individuality’
. Correspondences between non-standard and standard tense use
. Map ‘Give it me’, taken from An Atlas of English Dialects by Clive Upton and J. D. A. Widdowson (: )
. Global network for the entire WAVE feature set (N = )
. Tense and Aspect network in WAVE
. NeighborNet clustering of L varieties in WAVE
Tables
. Contingency table of frequencies exploring the interaction between the polarity of question tags and the polarity of preceding verb phrases, extracted with FTFs and then manually reviewed. The verb phrase and tag never both have a negative polarity in all cases
. Word class and feature matrix
. The major clause types
. Clause types, meanings, and literal forces
. Simple and progressive forms of the ‘tenses’
. Vendler’s Aktionsart classes and their defining features
. Main functions of finite subordinate clauses
. Main functions of non-finite subordinate clauses
. Coordination in and of phrases
. Prince’s () model of hearer-status and discourse-status and correspondences with Lambrecht () and Prince (a)
. Word class-phonology generalizations (adapted from Monaghan et al. : –)
. Inflectional paradigm for OE bīdan ‘await’ (based on Hogg and Fulk : )
. Full-verb, mixed, and auxiliary syntax of dare in interrogative and negated sentences in the Brown-family corpora of American and British English (AmE and BrE)
. Logical types of universal statement (following Greenberg), taken from Evans and Levinson (: )
. Asymmetrical paradigms (adapted from Anderwald : )
. Seventy-six varieties in WAVE: world regions
. Domains of grammar covered in WAVE ( features in all)
. Geographical composition of the clusters in the global network
. Vernacular angloversals: top ( %)
. Vernacular angloversals: top runners-up ( %)
. Top diagnostic features of L varieties (AR %, AR difference %), sorted by AR difference
. Top diagnostic features of traditional L varieties (AR %, AR difference %), sorted by AR difference
. Top diagnostic features of high-contact L varieties (AR %, AR difference %), sorted by AR difference
. Top diagnostic features of L varieties (AR %, AR difference %), sorted by AR difference
. The most diagnostic features of English-based pidgins and creoles (AR of P/Cs %, AR difference %), sorted by AR difference
. Diagnostic morphosyntactic features per Anglophone world region (AR difference region – rest of world %; * for medium-frequency features, ** for high-frequency/pervasive features)
. Top distinctive features for the British Isles
. Top distinctive features for Africa
. Top distinctive features for the Australia Pacific region
LIST OF CONTRIBUTORS
......................................................................
Bas Aarts is Professor of English Linguistics and Director of the Survey of English Usage at University College London. His publications include: Syntactic Gradience (, OUP), Oxford Modern English Grammar (, OUP), The Verb Phrase in English (, edited with J. Close, G. Leech, and S. Wallis, CUP), Oxford Dictionary of English Grammar (edited with S. Chalker and E. Weiner, nd edition , OUP), How to Teach Grammar (with Ian Cushing and Richard Hudson, , OUP), as well as book chapters and articles in journals. He is a founding editor of the journal English Language and Linguistics (CUP). Ash Asudeh is a Professor in the Department of Linguistics and the Director of the Center for Language Sciences at the University of Rochester. He has held positions at Carleton University, in the Institute of Cognitive Science, and at Oxford University, where he was Professor of Semantics in the Faculty of Linguistics, Philology, and Phonetics and a Senior Research Fellow at Jesus College. His research interests include syntax, semantics, pragmatics, language and logic, and cognitive science. He has published extensively on the syntax–semantics interface, particularly in the frameworks of Lexical Functional Grammar and Glue Semantics. Laurie Bauer FRSNZ is Emeritus Professor of Linguistics at Victoria University of Wellington, New Zealand. He is the author of more than twenty books on linguistic topics, particularly on morphology and word-formation. He was one of the inaugural editors of the journal Word Structure. The Oxford Reference Guide to English Morphology, which he co-wrote with Rochelle Lieber and Ingo Plag, received the Linguistic Society of America’s Leonard Bloomfield Prize in . In he was awarded the Royal Society of New Zealand’s Humanities medal. Robert D. Borsley is Professor Emeritus at the University of Essex, where he worked from to , and Honorary Professor at Bangor University, where he worked from to . He has published extensively on the syntax of English and Welsh, and on other languages, including Breton, Polish, and Arabic. He has worked mainly within the Head-driven Phrase Structure Grammar framework and has made a variety of contributions to its development. He was a Journal of Linguistics Editor from to . Jill Bowie is an Honorary Research Fellow at the Survey of English Usage, University College London, where she previously worked on the AHRC-funded projects ‘The changing verb phrase in present-day British English’ and ‘Teaching English grammar
in schools’, led by Bas Aarts. She holds a PhD from the University of Reading and a BA and MA from the University of Queensland. Her research interests include recent change in English and the grammar of spoken discourse. She has co-authored papers with Survey colleagues on clause fragments and on changes in the English verb phrase. Ian Cushing is a lecturer in the Department of Education at Brunel University London. He has a broad range of teaching and research interests, including applied cognitive linguistics (especially in educational contexts), critical language policy, and pedagogical grammar. He is the author of Text Analysis and Representation (, CUP), Language Change (, CUP), and a co-author of How to Teach Grammar (, OUP, with B. Aarts and R. Hudson), as well as various journal articles and book chapters. Ilse Depraetere is Professor of English Linguistics at the University of Lille. She is a member of the research group Savoirs, Textes, Langage (UMR STL). She has published widely on tense, aspect, and modality, the semantics/pragmatics interface being in the foreground of her publications. She is the co-author with Chad Langford of Advanced English Grammar: A Linguistic Approach ( (nd edn), Bloomsbury) and she co-edited with Raphael Salkie Semantics and Pragmatics: Drawing a Line (, Springer). Heidrun Dorgeloh is Senior Lecturer in English Linguistics at the Heinrich-HeineUniversity Düsseldorf with a specialization in syntax, discourse analysis, and professional varieties of English. She wrote her dissertation on English word order, notably subject–verb inversion as it is used in different genres, followed by a series of research on the function and meaning of grammatical constructions in professional contexts. Her research interests include the interrelationship of non-canonical syntax and discourse, the evolution of genres, and registers and genres of various professions, such as science, medicine, and law. Patrick Duffley is Professor of English Linguistics at Université Laval in Quebec City. He has published monographs on the infinitive, the gerund-participle, and complementation in English, as well as a number of articles on modal auxiliaries, wh-words, negative polarity, and indefinite determiners. His work utilizes concepts inspired by cognitive grammar and Guillaumian psychomechanical theory in order to develop a semantico-pragmatic approach to grammar and syntax. He recently published a monograph with John Benjamins applying this approach to the phenomenon of subject versus non-subject control with non-finite verbal complements in English. Thomas Egan is Emeritus Professor of English Linguistics at Inland Norway University of Applied Sciences. His research interests encompass topics within the areas of corpus linguistics, contrastive linguistics, cognitive linguistics, and historical linguistics, including grammaticalization. He is the author of a monograph on complementation, entitled Non-Finite Complementation: A Usage-Based Study of Infinitive and -ing Clauses in English (, Rodopi). More recently he has (co-)authored some dozen articles
contrasting various prepositional constructions in English and French and/or Swedish and Norwegian. Liliane Haegeman is Professor of English Linguistics at Ghent University in Belgium and is a member of the DiaLing—Diachronic and Diatopic Linguistics research group. From to , she was Professor of English Linguistics at the University of Geneva (Switzerland), and between and she was Professor of English Linguistics at the University of Lille III. Haegeman has worked extensively on the syntax of English and Flemish and has also written a number of textbooks for generative syntax. Her latest monograph is Adverbial Clauses, Main Clause Phenomena, and the Composition of the Left Periphery with Oxford University Press. Sam Hellmuth is Senior Lecturer in Linguistics in the Department of Language and Linguistic Science at the University of York. Sam earned her MA and PhD at SOAS University of London, and specializes in the study of prosody (stress, rhythm, and intonation), and the modelling of variation in prosody within and between speakers, dialects, languages, and contexts, in a laboratory phonology approach (using quantitative and qualitative methods, on both naturally occurring and experimental data). Thomas Herbst is Professor of English Linguistics at the Friedrich-Alexander Universität Erlangen-Nürnberg (FAU). He studied English and German at the Universities of Erlangen-Nürnberg and Oxford (St. Edmund Hall). He has taught at the universities of Reading, Augsburg, and Jena. The focus of his interests and of his publications lies in the fields of valency theory, collocation studies, cognitive and constructionist theories of language, pedagogical construction grammar, and linguistic aspects of film dubbing. He is one of the editors of the Valency Dictionary of English () and co-editor of Zeitschrift für Anglistik und Amerikanistik and Lexicographica. Martin Hilpert is Professor of English Linguistics at the University of Neuchâtel. He holds a PhD from Rice University. His interests include cognitive linguistics, language change, construction grammar, and corpus linguistics. He is the author of Germanic Future Constructions (, John Benjamins), Constructional Change in English (, Cambridge University Press), and Construction Grammar and its Application to English (, Edinburgh University Press). He is Editor of the journal Functions of Language and Associate Editor of Cognitive Linguistics. Willem B. Hollmann is a Senior Lecturer at the University of Lancaster. His research focuses on cognitive-typological linguistic theory and methodology, language change, and dialect grammar. These areas frequently overlap and interact in his work, especially in his publications in the nascent area of cognitive sociolinguistics. He is currently Chair of the national Committee for Linguistics in Education (CLiE), which reflects and is part of his keen interest and engagement in language teaching in primary and secondary education.
Rodney Huddleston earned his BA at the University of Cambridge and his PhD in Applied Linguistics at the University of Edinburgh. He is a Corresponding Fellow of the British Academy. He taught at Edinburgh, London, and Reading before moving in to spend the majority of his career in the Department of English at the University of Queensland. He has published numerous books and papers on English grammar, the most significant work being The Cambridge Grammar of the English Language (, with Geoffrey K. Pullum), which won the Linguistic Society of America’s Leonard Bloomfield Book Award in . Marianne Hundt is Professor of English Linguistics at Zürich University. Her corpus-based research focuses on variation and grammatical change in contemporary and late Modern English. Her publications cover both first- and second-language varieties of English, notably in the South Pacific and South Asia. She has been involved in the compilation of various corpora and has explored the use of the World Wide Web as a corpus. Her publications include English Mediopassive Constructions (, Rodopi) and New Zealand English Grammar: Fact or Fiction? (, John Benjamins). She is co-author of Change in Contemporary English: A Grammatical Study (, CUP) and co-editor of English World-Wide. Lesley Jeffries is Professor of Linguistics and English Language at the University of Huddersfield, UK, where she has worked for most of her career. She is the co-author of Stylistics (, CUP) and Keywords in the Press (, Bloomsbury) and author of a number of books and articles on aspects of textual meaning including Opposition in Discourse (, Bloomsbury), Critical Stylistics (, Palgrave) and Textual Construction of the Female Body (, Palgrave). She is particularly interested in the interface between grammar and (textual) meaning and is currently working on a new book investigating the meaning of contemporary poetry. Gunther Kaltenböck, who previously held a professorship at the University of Vienna, is currently Professor of English Linguistics at the University of Graz. His research interests lie in the areas of cognitive-functional grammar, corpus linguistics, pragmatics, phonetics, variation and change, as well as Thetical Grammar. Apart from numerous book chapters and contributions to international journals, his publications include a monograph on It-Extraposition and Non-Extraposition in English (, Braumüller) and several co-edited volumes, such as New Approaches to Hedging (, Emerald), Outside the Clause (, John Benjamins), and Insubordination: Theoretical and Empirical Issues (, de Gruyter). Evelien Keizer is Professor of English Linguistics at the University of Vienna. She obtained her PhD in English Linguistics from the Vrije Universiteit Amsterdam in ; since then she has held positions at the University of Tilburg, University College London, and the University of Amsterdam. She has published widely on the noun phrase in English (e.g. The English Noun Phrase: The Nature of Linguistic Categorization, , CUP) and Dutch (Syntax of Dutch: The Noun Phrase, Vol. , ,
Amsterdam University Press). She is also the author of A Functional Discourse Grammar for English (, OUP) and co-editor of several edited volumes and special issues. Ekkehard König was educated at the Universities of Kiel, Edinburgh, and Stuttgart. After holding professorial positions in Germany and other European countries, he retired in from the Freie Universität Berlin and is now Adjunct Professor at the Albert-Ludwigs-Universität Freiburg. In addition to directing a variety of research projects in Germany, he was Programme Director of the ESF-funded project ‘Typology of Languages in Europe’. His current duties include the editorial responsibility of the international journal Studies in Language (together with Lindsay Whaley). Bernd Kortmann is Full Professor of English Language and Linguistics at the University of Freiburg, Germany, and since October Executive Director of FRIAS, the Freiburg Institute for Advanced Studies. He has widely and extensively published in English linguistics, notably on the grammar of standard and non-standard varieties of English, and is co-editor of the international journal English Language and Linguistics as well as of the book series Topics in English Linguistics and Dialects of English. Terje Lohndal is Professor of English Linguistics at NTNU The Norwegian University of Science and Technology in Trondheim and holds an Adjunct Professorship at UiT The Arctic University of Norway. Together with Marit Westergaard, he directs the AcqVA (Acquisition, Variation, Attrition) research group. Lohndal works on syntax and its interfaces from a comparative perspective, drawing on data from both monolingual and multilingual individuals. He has published articles in journals such as Linguistic Inquiry, Journal of Linguistics, Journal of Semantics, and several books, among others, Phrase Structure and Argument Structure with Oxford University Press and Formal Grammar with Routledge. J. Lachlan Mackenzie is Emeritus Professor of Functional Linguistics at VU Amsterdam, having previously been Full Professor of English Language there. With a PhD from the University of Edinburgh (), his career was in the Netherlands, working closely with Simon Dik, Kees Hengeveld, and many others on the development of Functional Grammar and Functional Discourse Grammar. He is an Editor of the journal Functions of Language and his research interests range from functional linguistics to pragmatics, discourse analysis, and the expression of emotion. Key publications include Functional Discourse Grammar (, OUP) and Pragmatics: Cognition, Context and Culture (, McGraw Hill). See www.lachlanmackenzie.info Gergana Popova works at Goldsmiths, University of London. She obtained her MA from the University of Sofia and her PhD from the University of Essex. Her interests are in theoretical linguistics and linguistic theory, morphology, and the interface between morphology and syntax and morphology and lexical semantics. She is currently working on periphrasis (with Andrew Spencer). A further interest is how
corpora and corpus-analytic techniques can be used in the study of linguistic phenomena, including the study of language use and discourse. Geoffrey K. Pullum is Professor of General Linguistics in the School of Philosophy, Psychology, and Language Sciences at the University of Edinburgh. He is a Fellow of the British Academy, the Linguistic Society of America, and the American Academy of Arts and Sciences. He previously taught at University College London; the University of Washington; Stanford University; the University of California, Santa Cruz; and Brown University. In addition to more than scholarly publications on linguistics, he has written hundreds of popular articles and blog posts. He co-authored The Cambridge Grammar of the English Language () with Rodney Huddleston. Doris Schönefeld is a Professor of Linguistics at the Institute of British Studies at the University of Leipzig (Germany). She works in the field of usage-based (cognitive) linguistics with a special focus on Construction Grammar. In addition to research into particular constructions of English (such as copular constructions), she is interested in more general linguistic issues, such as the relationship between lexicon and syntax (Where Lexicon and Syntax Meet, , Mouton de Gruyter) and methodologies in empirical linguistic research (co-authored articles (, ), and an edited book on Converging Evidence: Methodological and Theoretical Issues for Linguistic Research, , John Benjamins). Carson T. Schütze is Professor of Linguistics at the University of California, Los Angeles, where he has taught since . His research spans topics in syntax, morphology, first language acquisition, language processing, and linguistic methodology, often focusing on Germanic languages. His monograph The Empirical Base of Linguistics (, reprinted ) is often cited as a catalyst for the recent eruption of empirical and philosophical work on acceptability judgements. Peter Siemund has been Professor of English Linguistics at the University of Hamburg since . He pursues a crosslinguistic typological approach in his work on reflexivity and self-intensifiers, pronominal gender, interrogative constructions, speech acts and clause types, argument structure, tense and aspect, varieties of English, language contact, and multilingual development. His publications include, as author, Pronominal Gender in English: A Study of English Varieties from a Cross-Linguistic Perspective (, Routledge), Varieties of English: A Typological Approach (, CUP), and Speech Acts and Clause Types: English in a Cross-Linguistic Context (, OUP), and, as Editor, Linguistic Universals and Language Variation (, Mouton de Gruyter) and Foreign Language Education in Multilingual Classrooms (with Andreas Bonnet; , John Benjamins). Andrew Spencer retired from the Department of Language and Linguistics, University of Essex, in , where he had taught for twenty-five years. He is the author of Morphological Theory (), Phonology (), Clitics (, with A. Luís), Lexical Relatedness (), and Mixed Categories (in press, with I. Nikolaeva). His interests are
in theoretical morphology and in morphology and its interfaces with syntax and the lexicon. He is currently working on periphrasis (with G. Popova) and on a crosslinguistic study of deverbal participles. Jon Sprouse is an associate professor in the Department of Linguistics at the University of Connecticut. His research focuses on experimental syntax—the use of formal experimental measures to explore questions in theoretical syntax—with a particular focus on acceptability judgements. He is the Editor of the forthcoming Oxford Handbook of Experimental Syntax, and Co-Editor of Experimental Syntax and Island Effects (with Norbert Hornstein, , Cambridge University Press). John R. Taylor is the author of Linguistic Categorization (rd edn, , OUP); Possessives in English: An Exploration in Cognitive Grammar (, OUP); Cognitive Grammar (, OUP); and The Mental Corpus: How Language is Represented in the Mind (, OUP). He edited The Oxford Handbook of the Word () and co-edited Language and the Cognitive Construal of the World (, Mouton de Gruyter). He is a member of the editorial board of Cognitive Linguistics Research series (Mouton de Gruyter) and is an Associate Editor of the journal Cognitive Linguistics. Margaret Thomas is Professor in the Program of Linguistics at Boston College. Most of her current research is in the history of linguistics, especially in the United States, with additional interests in second language acquisition and in Japanese psycholinguistics. She is the author of Fifty Key Thinkers on Language and Linguistics (, Routledge), and Formalism and Functionalism in Linguistics: The Engineer and the Collector (, Routledge). She serves on the editorial boards of several journals, and as the reviews editor for Second Language Research and for Language and History. Anastasios Tsangalidis is Professor of Syntax and Semantics at the School of English, Aristotle University of Thessaloniki. His main research interests are in the area of syntactic and semantic description and the relevance of grammar to language teaching, focusing on the description of the verb in English and Modern Greek—and especially the interaction of tense, aspect, mood, and modality. Most recently he has co-edited (with Agnès Celle) a special issue of the Review of Cognitive Linguistics on The Linguistic Expression of Mirativity. Sean Wallis is Principal Research Fellow and Deputy Director of the Survey of English Usage at University College London. His publications include Exploring Natural Language (, with G. Nelson and Bas Aarts, John Benjamins), The English Verb Phrase (, edited with J. Close, G. Leech, and Bas Aarts, CUP), as well as book chapters and articles in journals across a range of topics from artificial intelligence and computing to statistics and corpus research methodology. He runs a blog on statistics in corpus linguistics, corp.ling.stats (http://corplingstats.wordpress.com). Anja Wanner is Professor of English Linguistics at the University of Wisconsin–Madison, where she teaches syntax and grammar in use and directs the ‘Grammar
Badgers’ outreach project. Trained as a generative linguist, she became interested in syntactic variation and genre after studying the representation of implicit agents and changing attitudes towards the use of the passive voice in scientific writing. Additionally, she has published on the relationship between verb meaning and syntactic behaviour, the role of prescriptive grammar in language change, and the grammar of persons diagnosed with Alzheimer’s Disease. Debra Ziegeler attained her PhD from Monash University, Melbourne, in : Aspects of the Grammaticalisation of Hypothetical Modality, published in as Hypothetical Modality: Grammaticalisation in an L Dialect (John Benjamins, SILC series). A second study, Interfaces with English Aspect (, John Benjamins, SILC series), looked at the relationship between modality and aspect in English. In other publications, she has focused on the semantics of modality associated with proximative meaning (in Journal of Pragmatics , , Journal of Historical Pragmatics ) as well as the diachronic grammaticalization of the semi-modals, e.g. be supposed to, be able to, and have to (e.g. Journal of Historical Pragmatics , ).
......................................................................................................................
INTRODUCTION
......................................................................................................................
BAS AARTS, JILL BOWIE, AND GERGANA POPOVA
..................................................................................................................................
This volume aims to provide an authoritative, critical survey of current research and knowledge in the field of English grammar, where ‘grammar’ is used in the sense which encompasses morphology (the principles of word formation) and syntax (the system for combining words into phrases, clauses, and sentences). This handbook is not, however, intended to be a grammar of English. While it includes descriptive coverage of core topics in English grammar, it differs from a typical grammar in several ways, and casts its net much wider. First, it devotes considerable attention to rival analyses of particular areas of grammar, and the evidence and arguments for these analyses. Second, it addresses foundational areas of research methodology and different theoretical approaches to grammar, enabling readers to take a more informed and critical approach to grammatical descriptions and current research. Third, it covers important areas of extension beyond ‘core’ grammar: the relationship of grammar to other areas of the language (lexis, phonology, meaning, and discourse); and grammatical variation over time, across genres, and among regional dialects and World Englishes. We discuss the rationale for our approach in more detail in the next section.
RATIONALE
.................................................................................................................................. Ever since William Bullokar’s Pamphlet for Grammar was published in , countless grammars have been produced, with eighteenth-century authors being particularly productive (Linn ). Each of these grammars is in many ways unique. Thus while Bullokar (: ) opined that English did not have much grammar (‘As English hath few and short rules for declining of words, so it hath few rules for joining of words in
sentence or in construction’; cited in Michael : ), other grammarians struggled with the question of how many word classes to recognize. This led to accounts in which there were only two or three word classes, and others in which there were many, such that by there were fifty-six different systems of word classes (Michael ). What we find, then, from early times, is that grammars present very particular, often idiosyncratic, analytical views of English. This situation has continued right up to the present. One needs only to compare the grammars of Jespersen (–), Quirk et al. (), McCawley (), and Huddleston and Pullum (), to name but four, to see how the syntax of the same language can be analysed in very different ways. For example, in Jespersen we find very distinctive terminology and a unique analytical framework, many elements of which were later adopted by theoretical linguists. We also find novel analyses in Quirk et al. (, ), and this framework is perhaps the most influential in the field of language teaching. McCawley’s grammar is different again: here it is obvious that the author was a Generative Semanticist, which resulted in analyses that are often surprising, idiosyncratic, and highly original. Finally, Huddleston and Pullum, like Quirk et al., build on the tradition, but often base their analyses on recent theoretical work in linguistics, especially Chomskyan grammar and phrase structure grammar. Their work is characterized by numerous new analyses in many areas of grammar, such as the treatment of conjunctions and prepositions. Anyone who reads or consults these grammars (and others) could be forgiven for sometimes feeling somewhat bewildered by the different analyses that can be found in the literature for one and the same phenomenon. As an example, consider the structure of English noun phrases. An ostensibly simple NP like the cats receives quite different analyses in terms of which element is the head. Traditional grammars regard the noun as the head, whereas modern generative work assumes that the determiner is the head (resulting in a DP, rather than an NP). As another example, the treatment of auxiliary verbs in the two most widely used grammars of English, namely Quirk et al. () and Huddleston and Pullum (), is radically different. The former regard auxiliaries as ‘helping verbs’ which are dependents of a lexical verb (the so-called dependent-auxiliary analysis), whereas the latter, influenced by early generative work (e.g. Ross , Pullum and Wilson ), regard auxiliaries as ‘catenative verbs’ with their own complement-taking properties (the so-called catenative-auxiliary analysis). Analytical differences of this kind are quite widespread in these two grammars, despite the fact that they are rather close to each other in conceptual outlook. Differences in approach and analysis are amplified when two different (theoretical) frameworks are involved, for instance Generative Grammar and Cognitive Grammar. The former takes structure as primary (‘autonomous syntax’), whereas the latter takes meaning to be fundamental. Recent work in the various versions of Construction Grammar has resulted in further developments in English grammar. In this handbook we have asked authors to show how approaches to the same set of data depend on the theoretical framework an analyst chooses to deploy. We believe that this book will help readers to come to grips with different treatments of the core areas of English grammar, and thus to develop a more refined and informed knowledge of
the language. It will help to create a generation of linguists who can look beyond the frameworks that they are familiar with, so as to become more open-minded scholars who are conversant with the rich analytical tradition of English.
OVERVIEW
.................................................................................................................................. The handbook is divided into five parts. Parts I and II survey different methodological and theoretical approaches to grammar respectively, providing essential foundations to enable readers to take a more informed and critical view of grammatical analyses and research. Part I begins with a chapter on the history of English grammar writing since the eighteenth century, providing a broader context for later chapters. This is followed by three contributions on methodological issues pertaining to the contemporary study of grammar. It is important—in a book that aims to offer accounts of rival analyses of particular areas of grammar—to help readers understand which argumentative techniques and research methods can be used to arrive at particular analyses. Part II of the book (‘Approaches to English grammar’) offers readers a concise introduction to different linguistic frameworks. The chapters in this part provide coverage of the major types of theoretical approach that have been taken to English grammar, and that readers are likely to encounter in the literature. To make Part II more approachable and helpful for readers, we have grouped related frameworks together under one heading. For example, the chapter on dependency approaches covers Valency Grammar and Hudson’s Word Grammar, and the chapter on functional approaches covers e.g. Dik’s Functional Grammar, as well as Hallidayan grammar and its derivatives. Each chapter illustrates how the approach in question is applied to particular areas of English grammar. Included in this part is a chapter which specifically covers different theoretical approaches to morphology. Part III of the book complements Part II in covering all the core areas of English grammar, including morphology. Contributors were asked to pay particular attention to major areas of controversy between rival analyses in different frameworks, and to the evidence that has been used to support them. Part IV is about the relationship between grammar and other areas of linguistics, specifically lexis, phonology, meaning, and discourse. Recent decades have seen innovative and fruitful research in these fields, which deserve coverage in the handbook. This part of the book is intended to benefit readers who would like to find out more about domains of linguistics other than the field they work in. These areas of relationship are sometimes treated in contrasting ways in different theoretical approaches and, in keeping with the overall aims of the handbook, contributors to this part were asked to discuss some of these differences. Part V covers grammatical variation and change, another area that has received increasing attention over recent years. Most of the book is concerned with the common
core of standard English, as used in countries where English is an official language. However, this final part of the book looks at variation among different regional dialects and global varieties of English, as well as variation in different genres of English and in literature. In addition, this part includes a chapter on grammatical change over shorter and longer time periods. This topic is closely related to that of variation, as varieties arise by processes of change, and genres also undergo change over time.
.............................................................................................................
PART I
GRAMMAR WRITING AND METHODOLOGY
.............................................................................................................
......................................................................................................................
CONCEPTUALIZATIONS OF GRAMMAR IN THE HISTORY OF ENGLISH GRAMMATICOLOGY
......................................................................................................................
MARGARET THOMAS
INTRODUCTION
..................................................................................................................................
This chapter aims to provide readers with insight into the history of English-language grammar writing. It is impossible to survey here the full scope of the topic: English grammars have been published since the late s (Alston ), and vary widely in purpose, reception, intended readership, socio-cultural context, and level of detail—among other factors. Instead, I will illustrate some of the range of that variation by focusing on five figures from the last years whose work is important to English grammaticology, that is, to the study of the principles of grammar writing (Graustein and Leitner : –): Lindley Murray (1745–1826); Henry Sweet (1845–1912); Otto Jespersen (1860–1943); Randolph Quirk (1920–2017); and Noam Chomsky (b. 1928). These five form a heterogeneous, non-exhaustive, sample of scholars who have made significant contributions. Not all can be identified primarily as grammarians, and not all may now have sufficient currency to be cited elsewhere in this handbook. I have selected them not as the top five names in the recent history of English grammar, but because they exemplify a range of positions on a fundamental, two-pronged issue, namely, conceptualization of what counts as the data a grammar of English should analyse, and the source of those data. The issue is germane since, as Graustein and Leitner (: –) point out, the totality of a language is too complex and multidimensional for any grammar to treat comprehensively. Rather, grammarians necessarily choose particular domains to address, ‘according to linguistic and user-related criteria’ (Graustein and Leitner : ), with certain criteria privileging certain sources for
those data. For example, some grammarians have been concerned exclusively with an educated, prestige-laden, style of English attested in the writings of especially admired authors. Other grammarians have assumed that the evolution of English over time is fundamental to its current structure, so that historical facts guide their selection of material and references to earlier stages of the language suffuse their analysis. Some grammarians have focused on contrasts between features of English and those of other languages, or those asserted to be shared by all languages. The data on which grammarians base their analyses may derive from written records, selected according to the grammarian’s judgement of what counts as appropriate models; they may be carried over from earlier grammars; or they may be constructed by the grammarians themselves, out of their own intuitions. Another approach is to collect samples of speakers’ or writers’ output, then to induce linguistic generalizations by inspecting the corpus. To sharpen the characterization of how these five scholars conceptualize the data their grammars address, I will compare their treatments of a single construction in English, the so-called ‘double negative’. Although other chapters in this handbook may not converge on how to analyse double negatives, they exhibit a shared twenty-first-century understanding of what counts as the data of an English grammar, and they derive those data from a common array of sources. Most modern grammarians prioritize spoken registers of English (more or less idealized) without excluding written texts, and admit many kinds of dialectal and stylistic variation. For the most part, their working methods are inductive, based on data from heterogeneous sources: corpora; constructed examples; historical evidence; experimental results; speakers’ unguarded slips of the tongue; and the language of children, second-language learners, and people living with aphasia and other language disorders. Twenty-first-century grammarians of English might incorporate data derived from any of these sources. That was not necessarily true for their predecessors. However, it is worthwhile keeping in mind that, as Andrew Linn wrote, ‘grammar-writing in the modern age carries its past with it’ (: ).
LINDLEY MURRAY (1745–1826)
.................................................................................................................................. Little in Lindley Murray’s background anticipates that he would become the nineteenth century’s most widely read grammarian of English. The oldest son of an enterprising merchant of Quaker and Scotch-Irish immigrant background, Murray’s family achieved social and commercial prominence in New York City (Monaghan ). His father expected him to join the family business, but under the influence of an education informed by Enlightenment scientific-literary values, Murray resisted. In the end, he prepared for a career in law, although managing his family’s commercial ventures eventually became his major lifework. Just as Murray launched his legal career in , increasing friction between the British government and American colonies disrupted life in New York. Murray’s
family temporarily removed to England, then Murray and his father returned to weather the effects of the run-up to the American Revolution on their investments in shipping, industry, and banking. According to Monaghan (), Lindley’s Memoirs (; Allott ) obfuscate whether the family sided with the loyalists or patriots. There is evidence that under cover of Quakerly neutrality, the Murrays profited by sometimes aligning themselves with one side, sometimes the other. Public opinion, however, considered the Murrays loyalists. When the British were driven from New York in , Lindley and his wife fled back to England. Although Murray was probably more sympathetic to the patriots than the loyalists, his voluntary exile helped protect the family’s wealth from confiscation.1 His relatives—including his father, who had more openly supported the British—eventually re-established their social positions in New York and re-assumed their commercial activities. Murray and his wife settled reluctantly in Yorkshire, supported by their investments in America. On the basis of Murray’s reputation as a local man of education, teachers at a nearby Quaker school recruited him to tutor them in English grammar (Fens-de Zeeuw : ). Eventually he put the essentials into writing for students. Murray intended his English Grammar for the neighbourhood Quaker schools, but it achieved immediate and widespread success in England, with fifty-two editions published by . It was even more successful in America. Murray energetically revised and promoted the text without either pursuing or realizing significant financial gain. From onward, he published a companion book of exercises, a key to the exercises, and an abridgement of the Grammar. Next came Murray’s English Reader and a French reader (each with ancillary texts), an English spelling book, and a two-volume edition combining the Grammar with the exercises and key (Murray ). Murray’s books were translated and multiply reprinted for teaching English in India, Germany, Ireland, France, and Portugal (Tieken-Boon van Ostade : ). Monaghan (: ) estimates that his textbooks sold . million copies altogether, making him ‘the largest-selling author in the world in the first four decades of the nineteenth century’. The Grammar was widely reviewed, criticized, and imitated, dominating the market until around (Fens-de Zeeuw : ). In this way, exiled entrepreneur and lawyer Lindley Murray became the renowned ‘Quaker grammarian’ (Fens-de Zeeuw : –). His qualifications for the role came from his education and social standing, not from protracted experience as a teacher or scholar. In this he was not unusual for a grammarian of his time, nor was his work innovative. Murray’s Introduction makes it clear that innovation was not the goal: he described the text as ‘a new compilation . . . a careful selection of the most useful matter’ which he hoped would achieve the ideal balance between over- and underexplicitness in a book designed ‘for the instruction of youth’ (: iii). Murray represented his work as that of ‘selecting and forming [definitions and rules]’ (nd ed., : v), rather than offering a novel analysis. From the third edition onward,
1 Fens-de Zeeuw () argues that Murray’s move to Britain was instead motivated by his health problems.
Murray named the major figures his text drew on, emphasizing that the grammar is ‘a compilation . . . consist[ing] chiefly of materials selected from the writings of others’ (rd ed., : ; th ed., : ; etc.)—which most prominently included Robert Lowth (–), Joseph Priestley (–), and Charles Coote (–). But by modern standards, Murray was lax in identifying his sources. Vorlat () gives a point-by-point exegesis of many definitions Murray took (near-)verbatim from previous grammars, in particular from Lowth’s Short Introduction to English Grammar (nd ed., ). Moreover, Murray copied many of his examples from previous grammars, sometimes with a trivial alteration, as where he recycled Lowth’s illustrations of proper nouns (‘George, London’) versus common nouns (‘Animal, Man’; Lowth : ) as ‘George, London, Thames’ versus ‘animal, man, tree’ (Murray : ). Murray sometimes updated the diction, revising Lowth’s ‘unless he wash his flesh’ to ‘unless he wash himself ’ (Vorlat : ). But as with his definitions, Murray’s examples were mostly carried over from earlier published works. Moreover, Murray’s English Grammar shows little innovation in its organization. The order and substance of its four major sections cleaves to the tradition that goes back to the fourth-century Roman grammarian Aelius Donatus: ‘Of Orthography’, on letters, sounds, and syllables; ‘Etymology’, separately addressing each of the classic parts of speech; ‘Syntax’, which became progressively more elaborate in subsequent editions; and ‘Prosody’, treating inter alia accent, emphasis, and punctuation (with the latter becoming an independent section by the third edition). Sections are sub-divided into units that typically begin with a definition of the feature under discussion, followed by discussion and examples in smaller type, sometimes with examples of incorrect use. This sketch provides a basis for inferring what Murray conceived to be the relevant data a grammar of English should analyse, and the sources of those data. In concert with the works from which he compiled his grammar, Murray’s target was the formal, literary, version of late eighteenth-century English, privileging its written form, while ostensively taking speech into account as well. Lowth’s Preface adverts to ‘the English language as it is spoken by the politest part of the nation, and as it stands in the writings of our most approved authors’ (nd ed., : vii). Murray likewise incorporates speech, writing, and social class into his conceptualization of grammar as ‘the art of speaking and writing the English language with propriety’ (: ). He rarely distinguishes speech from writing, but remarks on the muddled vowels of ‘a person of a poor education’ compared to the ‘distinct, open, specific’ vowels of well-educated people (p. ). An additional criterion for inclusion in Murray’s grammar is that he rejected material that ‘might have an improper effect on the minds of youth’ so as to ‘advance the best interests of society, by cherishing the innocence and virtue of the rising generation’ (p. v). The English that Murray’s Grammar analyses is thus an educated, primarily written style (which still claims to address speech), expunged of offensive material. Murray’s data are mostly taken from earlier grammars, with minor retouching. However, on one point he departed from dependence on his major source, Lowth (). One of Lowth’s trademarks was to cite in footnotes what he perceived as grammatical errors in texts by
well-known writers, giving full attributions to each. His purpose in ‘ransacking the finest literature for “barbarisms”’ (Pullum : ) was to counter-exemplify principles of grammar he articulated.2 Murray, however, did not copy this feature of Lowth’s text strictly. The Grammar contains abundant examples of what Murray considered proper usage, alongside illustrations of what he called ‘false grammar’ (: iv). But Murray rarely labelled counter-examples with their authors’ names, whereas Lowth pointedly called attention to the discrepant usage of admired stylists such as Shakespeare, Jonathan Swift, and Samuel Johnson. Moreover, in Murray’s Exercises (published together with the Grammar in ), he provided more than pages of counterexamples, keyed sub-section by sub-section to the Grammar. The intention was for students to turn from the Grammar to the Exercises for practice identifying errors in sentences like ‘You and us enjoy many privileges’; students then applied the relevant grammatical rules to amend the sentence to ‘You and we . . . ’, checking their output against the key to the exercises. Murray does not indicate the provenance of his counter-examples. He simply states that he ‘collect[ed] and arrange[d]’ them (, Vol. : ), again emphasizing exclusion of content that might injure the ‘sensibility of young persons’ (p. ). Compared to Lowth’s counter-examples culled from classic authors, Murray’s Exercises lack variety of voice and content. Their homogeneous tone and diction—judicious, didactic, sententious; many adopting the oracular voice of proverbs—give them the air of having come from the same pen, probably Murray’s.3 The point comes to a head in Murray’s treatment of double negatives (, Vol. : –), a common but disparaged construction. Tieken-Boon van Ostade (: –; : ) has analysed double negation in eighteenth-century letters, prose, and plays, exemplifying both frequent and rare patterns. Among Murray’s twenty-four examples, only one conforms to a frequent contemporary pattern (‘not . . . neither’), while twelve of twenty-four are unattested even among rare patterns. This strengthens the inference that Murray constructed (here, mis-constructed), rather than compiled or observed, his examples of ‘false grammar’. Murray’s Grammar addresses a prestigious style of eighteenth-century English, groomed to maintain a high moral tone. Its models of grammatical propriety as well as most definitions and discussion, derive from other published grammars, augmented by counter-examples in the Exercises that Murray probably constructed. In its conceptualization of the language to be analysed, and in the sources of his data, Murray’s Grammar was unexceptional in his time. Its spectacular reception signals Murray’s success at meeting the needs of a specific readership, at a specific moment in the history of English grammar writing.
2 Tieken-Boon van Ostade (: ) counts this feature of Lowth’s grammar as evidence of its essential descriptivism, despite Lowth’s modern reputation as ‘an icon of prescriptivism’. 3 Additional examples (as corrected in the Key) include: ‘A good and well-cultivated mind, is greatly preferable to rank or riches’ (p. ); ‘The garment was decently formed, and sewed very neatly’ (p. ); ‘A hermit is austere in his life; a judge, rigorous in his sentences’ (p. ).
Henry Sweet (1845–1912)
.................................................................................................................................. The bold and fractious British scholar Henry Sweet is best known as a phonetician and philologist. But in the s Sweet’s attention turned to grammar, and he produced the two volumes of A New English Grammar, Logical and Historical (, ). Unlike Murray’s grammar, Sweet’s did not find a large commercial market. Although at least one contemporary reviewer admired it as a work of ‘solid scholarship . . . contain[ing] many acute observations and luminous comments’ (Smith : ), Hudson (b) aptly characterized it as a ‘neglected masterpiece’. Also unlike Murray, Sweet analysed English for an advanced readership. His scientific, rather than normative, goals were ‘founded on an independent critical survey of the latest results of linguistic investigation’ (Sweet : v). Sweet’s starkly different nineteenth-century conceptualization of English grammar provides a revealing parallax to Murray’s eighteenth-century one. Sweet was born and raised in London. From adolescence, when he first started teaching himself Old English and Old Icelandic, he was preoccupied by study of the Germanic languages. Aged eighteen, he spent a year at the University of Heidelberg absorbing German philology first-hand. When he was twenty-four, the same year he belatedly matriculated at Oxford’s Balliol College, Sweet was publishing linguistic-historical research in the Transactions of the Philological Society. His relationship with Oxford proved to be a lifelong, highly-charged, struggle. Sweet rebelled against the institution’s traditions and curriculum, which he claimed had retarded his self-defined real education (Sweet : vi). Oxford returned his disfavour, branding him with a fourth-class degree on his graduation. As his star nevertheless rose in the world of Germanic philology, Sweet set his sights on raising the level of language scholarship in Britain, especially at Oxford; but his independent mindset and contentious disposition led him to be repeatedly denied professorships (once at University College London, twice at Oxford) commensurate with his erudition and originality. He eventually accepted a readership in Phonetics at Oxford, a position that little reflected the esteem in which his contributions were held by his peers outside Britain, especially in Germany. Despite Sweet’s alienation and unfavourable institutional status, he produced a steady stream of publications marked by creativity, depth of learning, and intellectual self-confidence. He was adept at writing for both academic and non-specialist readerships, sometimes producing parallel texts on the same topic, one for language scholars, one for the public. Above all, Sweet championed phonetics as the foundation of language study. He invented a two-tiered transcription system that presaged the notion of the phoneme; proposed a new system of shorthand; wrote authoritatively about the history of English (especially phonetics); contributed to debate about language teaching; and produced a stream of translations, lexicons, and exegeses of Old and Middle English texts. Sweet’s philological interests informed his New English Grammar (, ). The Introduction to Part I comprises a -page tutorial in late nineteenth-century linguistics. It begins by distinguishing descriptive from explanatory grammar, with the former establishing a foundation for the latter. Explanatory grammars, like his own, rely on
evidence from either historical, comparative, or general (philosophical) grammar, all of which entail analysis of older versions of the language: whether we try to understand a construction by examining its ancestor forms, by reconstructing its cognates in related languages, or by surveying the histories of unrelated languages for general patterns— every approach requires study of how English developed over time. Thus Sweet’s New Grammar starts with a strong statement of the value of historical knowledge to any method of language analysis. Having adopted this perspective, Sweet introduces a basic vocabulary for language analysis, defining and exemplifying ‘qualifier’/‘attribute’; ‘subject’/‘predicate’; ‘derivation’/ ‘inflection’; ‘logical’/‘grammatical’ (categories); and ‘ellipsis’, ‘intonation’, ‘conversion’, etc. Sweet then breaks into about one hundred pages on the features and sub-categories of the classical parts of speech, with almost all examples taken from contemporary English, followed by twenty-five pages on the basic structure of sentences, clauses, and ‘wordgroups’ (avoiding the term ‘phrase’). The remainder of the Introduction returns to general historical issues: how languages evolve over time, emphasizing sound change; social control over language change; historical relationships among languages; and finally a sketch of how Modern English descended from its ‘Arian’ roots. The remainder of Part I comprises two major sections, ‘Phonology’ and ‘Accidence’. Both are dominated by discussion of how sounds and word-forms developed from Old to Modern English. Sweet abundantly illustrated his analysis with cogent examples from everyday speech (e.g. contrasts between ‘gold chain’/‘golden chain’; ‘silky hair’/‘silken hair’ [: ])—unattributed and therefore likely of his own invention. Rarely, he adds quotations from literary sources. In Part II of New English Grammar, ‘Syntax’ (), Sweet addresses word order, first as a general linguistic phenomenon, then as it bears on each of the major parts of speech, taking up issues such as the position of adverbs relative to verbs (‘He never is ready on time’/‘He is never ready on time’, p. ) and ’s versus of genitives (‘the king’s son’/‘the son of the king’, p. ). Throughout, Sweet distinguishes written and spoken English, and sometimes adverts to particular registers (narrative historical present tense [: ]), regional dialectal diversity (high front vowels in southern versus northern England [: ]; ‘Scotch’ syntax [: ]) or stylistic variation (effects of rapid speech on stress [: ]). A striking feature of Sweet’s presentation is that he embeds all facets of the language in historical context. That is, Sweet characteristically indicates the antecedents of the grammatical features of modern English, usually by providing equivalents in Old or Middle English or, as relevant, by referring to Latin, Anglo-Frisian, or Early Modern English. He quotes numerous examples of earlier versions of English, usually supplying modern translations but without citing his sources. Sweet’s implicit stance is that a grammar of English at any specific point in time is a snapshot of an object in motion, the essence of which cannot be captured solely with reference to its present state. The treatment of negation in Part I (pp. –) is representative. Historical facts pervade the exposition: Sweet first introduces the Old English negative preverbal particle ne, which spread by prefixation onto pronouns and adverbs in the same sentence. 
Where a negated verb lacked words eligible for ne-prefixation, negation
was instead strengthened by preverbal nā or naht. In Middle English, nat (a prefixed form of ‘nothing’) supplanted ne as a general negative marker, coming down to us as modern not. However, Old English nā has survived, especially before comparatives, in Modern English ‘He is no better’ and ‘No more of this!’. Then Sweet adds: . . . the influence of Latin grammar led to the adoption of the logical principle that ‘two negatives contradict each other and make an affirmative,’ which is now strictly carried out in the Standard language, spoken as well as written, though the old pleonastic negatives are still kept up in vulgar speech, as in I don’t know nothing about it = the educated I do not know anything about it or I know nothing about it. (Sweet : )
From this we can infer how Sweet conceptualized the database of a grammar. An earlier passage (: –) discussed the position and meaning of negative particles in Modern English. But to Sweet the historical context of those data is essential to scientific understanding. ‘Modern English’ is only one point in a developmental stream; a grammar should indicate, for example, that the negators in ‘He is no better’ and ‘I do not know anything about it’ descended from separate sources. Moreover, a grammar should point out that Old and Middle English multiple negators survive in ‘vulgar’ Modern English, even if they have been expunged from educated speech under the influence of an extra-linguistic logical notion. To Lindley Murray, that logical notion (which Murray expressed—borrowing Lowth’s words—as ‘Two negatives, in English, destroy one another’ [: ; Lowth : ]) legitimates excluding ‘I don’t know nothing about it’ from the grammar of English. Sweet’s historical approach accounts both for the origin of double negatives and for their suppression. It also demonstrates that he conceived of the data for which a grammar is responsible as extending beyond the present state of the language, to its development over time.4 On the other hand, Sweet and Murray converge in how they illustrated their grammars, in that both employed examples they apparently constructed themselves, alongside material they extracted from written records. The written records Murray used were texts constructed by earlier grammarians, or culled by them from literature, whereas Sweet drew on Old and Middle English texts in his discussion of earlier forms of the language. To Murray and to Sweet, these were probably the obvious sources for exemplifying the structure of English. But they do not exhaust all possibilities.
4 Sweet () conceded limits to this orientation, questioning whether heavy-handed historical content is advantageous to second-language learners.
Otto Jespersen (1860–1943)
.................................................................................................................................. The Dane Otto Jespersen led a long, productive, and intellectually engaged life. Born in rural Jutland into a family of lawyers and public servants but orphaned by age thirteen, Jespersen worked his way through the University of Copenhagen as a language tutor,
and as a stenographer for the Danish Parliament. He would maintain an affiliation with the university for the rest of his life, first as a student of Law before switching to French and comparative-historical linguistics, then as Denmark’s first regular professor of English language and literature, as Dean of the Faculty of Arts, and finally as Rector of the university. Despite Jespersen’s institutional stability, and despite the facts that he enjoyed strong collegial relationships, closely followed trends in language study throughout Europe (and, to some extent, North America), and contributed to several international collaborative projects, he was a strikingly independent scholar. His work resists categorization into any of the schools of language scholarship that proliferated during his lifetime (the Neogrammarians; groups of scholars inspired by Ferdinand de Saussure [–]; the Prague Circle; and even the Copenhagen Circle). Jespersen wrote principally about phonetics and Indo-European historical and descriptive linguistics, especially of the Germanic and Romance branches. He worked to reform language teaching on a more linguistically sound basis, and participated in attempts to create an auxiliary language. Jespersen was a committed pacifist who hoped, in vain, that a shared international language would ease interwar tensions across Europe. In much of his work, Jespersen championed the belief that language change constitutes neither deterioration from a past golden age, nor either cyclic or random replacement of one construction or word by another. Rather, to Jespersen change represents a language’s increasingly efficient adaptation over time to meeting the communicative purposes of its speakers. Among Jespersen’s greatest achievements—and the one most relevant to our concerns here—is his seven-volume Modern English Grammar on Historical Principles (–). The research and writing of the Grammar extended over almost fifty years, with the last volume assembled posthumously by Jespersen’s former students. Part I covers the sounds and spelling of English; Parts II through V and Part VII, syntactic phenomena; Part VI, morphology. All parts of the work display Jespersen’s capacity to notice and articulate patterns in language data, his penetrating analysis of the nuances of linguistic forms and meanings, and his encyclopaedic understanding of how English has changed over time. His felicitous powers of observation, broadmindedness, originality, and erudition—all communicated in an easy and artless prose style—attract admiring readers to this day. In the Foreword to a reprint of Jespersen (/), Randolph Quirk portrayed him as ‘the most distinguished scholar of the English language who has ever lived’. Breadth and originality also characterize Jespersen’s conceptualization of the language data a grammar should analyse, and the sources of those data. Although none of the grammarians discussed in Chapter derives his data from a single source, Jespersen stands out as especially catholic. His sources are more heterogeneous than Murray’s or Sweet’s, and in this he anticipates the practices of later grammarians. The Grammar cites hundreds of examples taken from written materials. Among them are samples of the ancestors of Modern English in all periods of its recorded development, including the classic texts and authors that Murray and Sweet trafficked in. However,
Jespersen also includes in his database many examples taken from contemporary popular literature, labelled with author, truncated title, and page number. Even granted the Grammar’s orientation to ‘Historical Principles’, modern sources seemingly prevail: Jespersen cites fewer works by Jonathan Swift or Samuel Johnson than by Arthur Conan Doyle, George Bernard Shaw, Robert Louis Stevenson, Booth Tarkington, or H. G. Wells—and by other lesser-known then-current writers, social workers, historians, physicians, and so forth. In this sense Jespersen’s grammar is more grounded in the present state of the language than Murray’s or Sweet’s. All three grammarians also extensively mined secondary sources which analyse English, although naturally by the early decades of the twentieth century Jespersen had access to a larger collection of grammars, dictionaries, lexicons, books on dialectology, spelling, etc. In assembling his sources, Jespersen took a methodological cue from the compilers of the New English Dictionary on Historical Principles (Francis ). He read extensively, and culled material from his reading by marking with a dot in the margin any particularly striking use of language. He extended that same observational vigilance to speech by writing down on ten-by-eight-centimetre scraps of paper any notable word or construction he overheard in conversation. Later, he annotated and numbered those slips and drew on them, as relevant, in his writing (/: –). Jespersen’s habit of collecting first-hand attestations of English is probably responsible for much of the freshness and authenticity of the data he analyses in the Grammar. For example, in a subsection on negation in Part V Jespersen introduces an array of contracted negatives in their historical and social contexts, including cou’dn’t, han’t, wo’n’t, amn’t, shatunt, whyn’t, warn’t, mustna, bettern’t (–, Part V: –). The extent to which Jespersen constructed his grammar of English on the foundation of directly observed data departs from the practices of Murray or Sweet, in whose grammars attested data played a minor role. It is also notable that Jespersen was reluctant to employ data that he himself constructed, whereas earlier grammarians felt free to provide their own examples and counter-examples. In these ways, Jespersen’s work takes a more empirical approach. Still, it is important to acknowledge that Jespersen’s data was limited to the intersection of whatever words and constructions he happened to encounter, and his own judgement of what was worth recording. Jespersen’s treatment of double negation (–, Part V: –) demonstrates the scope of what his grammar accounts for. He establishes a taxonomy that first separates double negation into constructions that communicate weakened affirmation (‘not uncommon’; ‘I do not deny’) versus four subtypes of constructions that communicate negation: negative attraction (‘you won’t lose nothing by it’); resumptive negation (‘I cannot live without you—nor will I not’); paratactic negation (‘You may deny that you were not . . . ’); plus a residue of intractable constructions. Jespersen illustrates his analysis with data culled all the way from Old English texts up to popular literature published in the first decade of the s, alongside a few unattributed examples of vernacular language that Jespersen may first have captured, by hand, on his notecards (‘he cannot sleep, neither at night nor in the daytime’).
The exposition of double negation in the Grammar also reveals, in seed form, another conceptualization of what a grammar should address: the features of other languages, viewed comparatively. Jespersen remarks in passing on the applicability of his taxonomy to negation in Latin, Greek, Slavonic and Romance languages, Magyar, and German. He also mentions negation in varieties of English: Irish English, Cockney, and that spoken by ‘Yorkshire men and Americans’ (p. ). It is salient that during the long interval when he worked on the Grammar, Jespersen also produced an independent volume, Negation in English and Other Languages (/), the title of which announces its focus on cross-linguistic comparison. The subsection on double negation (pp. –) presents the same analysis, and cites many of the same sources, as does the Grammar. But the volume expands comparative treatment to negation in eleven languages, by way of testing the adequacy of his taxonomy and situating the specific properties of English. With this Jespersen enlarges the scope of what a particular grammar might claim responsibility for to include not only its own development over time and the usage of speakers across various registers, but also its place in the family of human languages in general.
Randolph Quirk (1920–2017)
.................................................................................................................................. Randolph Quirk was born on the Isle of Man, in a farming community where the Celtic language Manx could still be heard, and where ‘Manx values’ (Quirk : ) mixed with the overlaid cultural influences of England and Scandinavia. He left the island for University College London in to study Germanic philology, including Old High German, Gothic, Old Norse, and Anglo-Saxon. When his classes were relocated to Aberystwyth during the war, Quirk added Welsh. After a hiatus in the Royal Air Force, Quirk returned to London to study under leading post-war linguists Daniel Jones (–) and J. R. Firth (–). He obtained a bachelors and a doctoral degree, then accepted a Junior Lectureship at University College London. Quirk wrote that by this time he was ‘hooked on the idea of research’—adding ‘Ah, but in what?’ (: ). He eventually carried out many kinds of language research, in several institutional contexts, towards many ends, including inter alia old-school Germanic philology (the topic of his dissertation); Middle English lexicography; public lectures about language, broadcast on the BBC; applications of linguistics to literature and speech therapy; and renovation of English language teaching in British schools, prisons, and abroad. In , Quirk spent a year at Yale and the University of Michigan, where he encountered the key figures in American post-Bloomfieldian structuralism and acquired a taste for empirical research into the syntax of living languages. From this mix of experiences Quirk’s signature accomplishment emerged: a massive, collaborative, survey of contemporary English, and co-production of English grammars based on the resulting corpus. The project first germinated during an interval teaching at the University of Durham, but attracted more funding and personnel when he
returned to University College London in . Quirk was eventually named Quain Professor at University College London, and served a term as Vice Chancellor of the University of London. Quirk () presents a forceful and lucid case for the value of his Survey of English Usage. He argues that existing grammars lack systematicity, completeness, and authority, because grammarians rely on introspection instead of consulting speakers’ attested usage. He evinces scepticism about constructed illustrations of grammatical points, as they ‘describ[e] primarily what is grammatologically received and what [a grammarian] expects to find’ (p. ). Quirk also rejects the tradition of exemplifying usage by citing passages of high literary value: rather than particularly apt or eloquent use of language, it is ‘actual and normal’ (p. ) language that demands attention. Moreover, almost all extant grammars focus on written language, not spontaneous speech. Quirk concedes that some modern linguists were beginning to analyse authentic spoken language, but a broader and more consistent initiative was needed to uncover the features of everyday conversational English. Starting in , Quirk accepted the challenge of creating a more adequate database for a grammar of English. The project absorbed most of his attention in the s into the s, and remains an important resource (http://www.ucl.ac.uk/english-usage/index.htm). The Survey amassed a one-million-word corpus of mid-twentieth-century British English, about equally from written and spoken sources. It comprises separate texts of , words each: published and unpublished writings, including instructional manuals, legal documents, newspaper articles, recipes, fiction, letters, and prose read aloud; and spontaneous speech, sometimes conversations, in diverse social contexts. Some of the spoken data derive from interviews broadcast on radio and television. Some was surreptitiously recorded in private or semi-private settings, with the goal of capturing as closely as possible what one project participant labelled ‘the earthy dynamism of unconstrained domestic spontaneous conversation’ (Crystal : ). The design, collection, and analysis of the corpus was from the start a collaborative effort, involving many of Quirk’s former students and colleagues from Britain, continental Europe, and North America. The written Survey data were analysed and annotated by hand on thousands of individual slips of paper. The spoken data were recorded on reel-to-reel tapes, transcribed, then, like the written data, annotated by hand. Later, linguists at the University of Lund computerized the ,-word spoken corpus (adding thirteen additional files), to produce the London-Lund Corpus, descendants of which are accessible through the Survey website. Eventually the written data and their annotations were computerized as well.5
5 The Survey contributed to two additional corpora: the British component of the International Corpus of English, and the Diachronic Corpus of Present-Day Spoken English; see http://www.ucl.ac.uk/english-usage.
Quirk was not the first to undertake such a survey, but its size, range, and inclusion of varieties of spontaneous speech broke new ground. From the outset, Quirk supplemented it with elicited material which he viewed as necessary to probe speakers’ evaluation of
(for example) agreement in sentences like ‘Mr. Smith with his son leave/leaves for London’, or adverb position in ‘Did they badly treat the servant?’ (Greenbaum and Quirk : , ). But for the most part he expected that a grammar of English ‘could be formulated from the patterns . . . emerging from a corpus of natural material’ (: ). Starting in , Quirk and a rotating cohort of collaborators began publishing grammars of English for various readerships, based largely on the London-Lund Corpus but sometimes supplemented by reference to earlier American or British corpora of written English, with occasional citation of elicited data. The largest, most ambitious, product of the Survey is Quirk, Greenbaum, Leech, and Svartvik’s -page Comprehensive Grammar of the English Language (CGEL) (), published about years after Murray’s grammar and one hundred years after Sweet’s. Despite their choice of title, the authors concede that writing a ‘complete and definitive grammatical description of English’ exceeds their capacity; they aim instead for ‘breadth of coverage and depth of detail, in which observation of particularities goes hand in hand with the search for general and systematic explanations’ (p. ). CGEL is organized in a spiral of three ‘cycles’: Chapter surveys the basic units of English phrases, clauses, and sentences; Chapters through cover, one by one, the constituents of simple sentences and their combinatorial privileges; Chapters through analyse more complex structures, including ellipsis, coordination, subordination, and the grammar of discourse. Separate Appendices address word-formation, prosody, and punctuation. The approach is assiduously descriptive, and the coverage exhaustive. CGEL notes in passing the existence of double negatives, within a footnote embedded in a subsection on nonassertive items with negation. It provides a telling contrast with earlier grammars:
In nonstandard English, a negative item can be used wherever in standard English a nonassertive item follows the negative:
standard: No one ever said anything to anybody
nonstandard: No one never said nothing to nobody
Such double or multiple negatives are condemned by prescriptive grammatical tradition (Quirk et al. : )
A later subsection comments on the semantics of double negation: ‘The additional negatives in nonstandard English do not cancel out the previous negatives’ (p. ). Unlike Murray, Quirk et al. register the existence of double negation without labelling it as defective in principle: ‘No one never said nothing to nobody’ simply illustrates one option for marking negation, albeit a dispreferred one. Unlike Sweet, Quirk et al. do not set the construction in its historical context. Unlike Jespersen, they provide no cross-linguistic comparison. The London-Lund corpus allowed Quirk et al. to extract, assemble, and generalize across multiple exemplars of words and constructions, in diverse contexts. But of course this required more than simply collating patterns that emerged from the corpus: Quirk and his colleagues had to make many decisions about how to order items under a grammatical rubric; how to classify data where varieties of a construction shade
incrementally from one category into another; when to invent new terms rather than rely on conventional ones; and so forth. All such decisions are fraught; no grammar satisfies all readers. However, even as reviewers raised all manner of criticism, most, like Huddleston (: ), acknowledged the ‘pre-eminent and authoritative’ status of CGEL. CGEL and its sister texts also necessarily transcend the contents of the London-Lund corpus in a different sense. Quirk () asserted that the basis of a grammar of English resides in native speakers’ spontaneous speech. The Survey, augmented by the results of elicitation experiments, was designed to provide that basis. However, Quirk et al. do not strictly limit themselves to those data. First, they freely employ counter-examples to illustrate the limits of grammatical constructions, as in ‘Fortunately, he knows about it’ versus ‘*Does he fortunately know about it?’ (p. ). Many of these sentences were used in the elicitation experiments (e.g. Greenbaum and Quirk : ), legitimating the judgements indicated in CGEL. But not all of Quirk et al.’s data can be accounted for in this way. For example, pages – discuss the structure and meaning of (–): () We’ll buy the house, with a bank loan or without one () With or without a bank loan, we’ll buy the house () We’ll buy the house, bank loan or no bank loan () Bank loan or no, we’ll buy the house There’s no evidence that Quirk et al. elicited or observed these sentences, much less their semantics. More likely, Quirk et al. created (–) from their own intuitions, despite Quirk’s () scepticism about constructed examples. Second, the spoken data cited by Quirk et al. display a general absence of hesitation morphemes (‘um . . . uh . . . ’), fillers (‘I mean’; ‘You know’), fragments, repetitions, or restarts. The dysfluency of spontaneous speech seems to have been groomed out of the thousands of illustrative sentences in CGEL. The result is an idealized representation of natural language, albeit one far more genuine than earlier grammarians’ examples. In the final chapter, where Quirk et al. analyse conversations and unscripted discourse, their data are less retouched, leaving intact expressions like ‘well’ or ‘oh’, and self-interruptions, self-corrections, and restarts (pp. –). Still, the distance remains rather great between what appears on the page and the Survey’s goal to capture raw, truly authentic, language in all its ‘earthy dynamism’. Even if Quirk et al. did not realize their goal of inducing a grammar of English strictly from a naturalistic corpus, CGEL represents a departure in its conceptualization of what counts as the relevant data for a grammar, and in deriving those data from actual spontaneous language.
Noam Chomsky (b. 1928)
.................................................................................................................................. Noam Chomsky is a theoretical linguist, not a grammarian. Unlike the other four scholars discussed here, Chomsky has not attempted to analyse the whole of English.
Nevertheless, he developed an influential conceptualization of what counts as a grammar, and of the source of the data on which a grammar should rest. On these points Chomsky differs starkly from Murray, Sweet, Jespersen, and Quirk—and the impact of his views on grammar writing has been widespread since the mid-twentieth century. Chomsky was born and raised in Philadelphia. His father was a scholar of medieval Hebrew, and principal of a Jewish community school where Chomsky’s mother taught. Chomsky’s extended family provided a heady intellectual environment, exposing him from an early age to a range of midcentury sociopolitical issues, and encouraging his precocious libertarian social consciousness. (Chomsky is now recognized internationally as an indefatigable critic of all forms of authoritarianism, in government, academia, the media, the military, and business; McGilvray .) Chomsky attended the University of Pennsylvania, where only mathematics and linguistics captured his imagination (Barsky ). His senior thesis on Modern Hebrew morphophonemics led to a Master’s thesis completed in . Chomsky then moved to Cambridge, Massachusetts, where he spent four years as a Junior Fellow at Harvard University. During that interval he began to develop what came to be called generative theory, which challenged behaviourism in psychology and the established schools of American structuralism in linguistics. Along the way, Chomsky created a manuscript of some , pages. It would not be published for another twenty years (Chomsky ), but he submitted a chapter to the University of Pennsylvania in lieu of a dissertation and was awarded a doctorate in . That same year Chomsky moved to MIT to work on a machine translation project, a poor fit to his interests and intellectual commitments. Eventually he settled into linguistic research and teaching. His work began to attract other young, unconventional, scholars interested in language, philosophy, and the emerging field of cognitive science. From their collaboration, a graduate programme in linguistics was born. Chomsky continued lecturing worldwide on both linguistics and politics beyond his retirement around . In he accepted the position of Laureate Professor of Linguistics, Agnese Nelms Haury Chair at the University of Arizona. Arguably, the basic character of generative theory has remained intact from the s, even as Chomsky’s expositions of its components and their interaction have shifted significantly. Its technical apparatus has evolved from concrete analyses of constructions in particular languages, to increasingly abstract accounts of the general architecture of human language and increasing concern for structural differences across languages. Three consistent themes relevant to grammaticology stand out. One fundamental theme is Chomsky’s problematization of how children learn their native language as rapidly and accurately as they do, granted what he calls ‘the poverty of the stimulus’ (Berwick, Pietroski, Yankama, and Chomsky ). This term labels Chomsky’s observation that children are exposed to only limited evidence about the language (or languages) used in their environment, evidence that does not necessarily transparently associate the forms of utterances to their meanings, and which is degraded by normal dysfluency and speech errors. In this sense, input to language learners is impoverished relative to their resulting knowledge. Chomsky argues that learners can only surmount the poverty of the stimulus if they have access to a rich and
highly-articulated language faculty—a cognitive template defining what human languages (including dialectal variants) have in common and what constitutes admissible differences across languages. Chomsky calls this faculty ‘Universal Grammar’. Language data in learners’ environments lead them to select options appropriate to the surrounding language from an inventory of structural properties that Universal Grammar affords. A second relevant theme in generative theory is its emphasis on the creativity of human language. Chomsky has consistently opposed the view that grammars are comprised of patterns transmitted across generations by modelling and imitation, shaped by the exigencies of communication. Rather, Chomsky asserts that Universal Grammar provides recursive mechanisms that generate (for every human language) an infinite number of possible rule-governed combinations. Speakers of any language, including child learners, routinely exploit the recursive capacity of their language to construct entirely novel utterances that any other member of the speech community can interpret. A third fundamental theme is generative theory’s distinction between underlying ‘competence’ (what speakers tacitly know about the structure of their language), and ‘performance’ (speakers’ observable use of language). Performance imperfectly reflects competence, in several ways. Most simply, performance may include errors or lapses that misrepresent competence. In addition, performance does not indicate the full scope of competence, such as speakers’ understanding of the near-synonymy of ‘I gave the box to Marie’ and ‘I gave Marie the box’ or the divergent roles of ‘John’ in ‘John is eager to please’ versus ‘John is easy to please’. Performance also under-represents competence in not specifying constructions ruled out by Universal Grammar: observed language use does not necessarily indicate what speakers consider impossible. The goal of linguistics, according to Chomsky, is to discover the nature of speakers’ internal competence, rather than to observe and catalogue their external performance. These three themes inform generative grammar’s conceptualization of what counts as a grammar, and what are appropriate sources for the data on which a grammar is built. According to Chomsky, a grammar is not a list of the features of a language, but an analysis of how the language faculty is instantiated within a particular language environment. That analysis must respect the notion of the poverty of the stimulus, since without a satisfying explanation of how a child learner can acquire a particular linguistic feature within the constraints of Universal Grammar, the grammarian risks positing an unlearnable (hence, impossible) grammar. To write a complete grammar therefore requires full knowledge of the contents of the language faculty, in addition to full understanding of how child learners select options from those available in Universal Grammar. It is not surprising, then, that few generative grammarians have attempted to write comprehensive grammars of any language.6
6 For example, the pages of Stockwell, Schachter, and Partee () do not constitute a comprehensive grammar of English—even granted that true ‘comprehensiveness’ is unattainable. The text is not ‘comprehensive’, in that it addresses only about ten constructions in English selected for their amenability to analysis by s generative grammar. Nor is it a ‘grammar’, in that its purpose is not to depict the structure of English, but to explore the strengths and weaknesses of s generative analytic instruments. See Chapin’s () review of its ancestor text.
Instead, most have
focused on narrow domains that become tractable as the technical apparatus of the theory develops: domains such as reflexive pronouns, parasitic gaps, or the relationship of verbal inflection to word order. Chomsky’s recognition of the creativity of normal language and its basis in recursion also bear on assumptions foundational to writing a grammar. If human linguistic competence comprises knowing how to generate an infinite number of (potentially novel) utterances, then pursuit of a ‘comprehensive’ grammar of language X is misguided, because the collected output of the speakers of X does not define the grammar of X. Rather, the grammar of X resides in a state of the language faculty that differs in narrow but consequential ways from the state of the language faculty that generates the grammar of language Y. Efforts to record, label, and organize language data produced by speakers of X capture only an arbitrary portion of the output of the grammar of X, not the grammar itself. The distinction between competence and performance raises a related challenge to the validity of the data on which grammars rely. Generative grammarians seek to depict competence, but only performance can be observed. To Chomsky, probing the nature of competence requires ‘experimentation of a fairly indirect and ingenious sort’ (b: ), or introspection by speakers of the target language (see Sprouse and Schütze, this volume), because ‘it is absurd to attempt to construct a grammar that describes observed linguistic behavior directly’; in fact, linguistic analysis based on a corpus is—according to Chomsky—‘useless’ (p. ). A distinctive grammaticological style has therefore emerged from the conflation of Chomsky’s insistence that a grammar must not merely describe a language, but explain its acquisition, with his view of grammar as a generative system rather than a collection of facts, and with his scepticism about corpus-based research. Returning to the example of double negatives, grammarians influenced by Chomsky have produced a number of analyses of this feature of English, at different stages in the development of generative grammar.7
7 Labov (a) relies on Chomsky’s () transformational framework; Kallel () on Chomsky’s (b) principles and parameters model; Tubau Muntañá () and Zeijlstra () on the Minimalist Programme (Chomsky ).
They share certain features, however: acceptance of double negation as an option supported by the human language faculty (regardless of any stigma attached to it in certain varieties of certain languages); attention to the problem of learnability; and respect for the gap between what can be observed in usage versus speakers’ underlying grammatical competence. For example, Zeijlstra () investigates cross-linguistic as well as diachronic variation in double negation (as ‘negative concord’). He also explicitly addresses the question of learnability (pp. –). Zeijlstra does not adopt Chomsky’s apparently uncompromising rejection of corpus-based research, in that his data derive from collections of spontaneous speech, augmented by published and author-constructed examples. However, Zeijlstra takes seriously the gap between
competence and performance in proposing a highly abstract underlying grammar of negative concord, including multiple levels of configurational relationships, licensing of negative elements by abstract operators, and syntactic functions that act on formal features. Zeijlstra demonstrates what a generative grammar of English negative concord might look like—and in doing so illustrates the grammaticological assumptions of one school of twenty-first century linguistics.
Conclusion
.................................................................................................................................. The works of Murray, Sweet, Jespersen, Quirk, and scholars in the generativist tradition differently conceptualize the material on which an English grammar should be based and how that material should be amassed. None uniquely monopolizes a particular conceptualization. Rather, each is part of the past that grammar-writing in the modern age carries forward; each is a star in the galaxy of English grammaticology, positioned variably with respect to other stars in a multi-dimensional skyscape. The intricate and subtle analyses represented in the following chapters of this handbook cluster relatively closely together in a twenty-first century constellation in their conceptualization of the relevant material of a grammar, and in the sources of those data. Their shared and divergent traits can best be appreciated against the full backdrop of the history of English grammaticology.
......................................................................................................................
SYNTACTIC ARGUMENTATION
......................................................................................................................
Introduction
.................................................................................................................................. Describing the grammar of a language involves setting up categories such as word classes, phrases and clauses (called a taxonomy), and creating a systematic and internally consistent explanatory account of how these categories relate to each other and how they combine to form larger units. But how many categories are needed, and how do we establish these larger units? This is a matter that naturally requires careful thought, and grammarians need to motivate the choices they make. Put differently, they need to make extensive use of argumentation to establish the framework of grammar that they are describing. Argumentation is a general notion which can be defined as an evidence-based step-by-step procedure for taking decisions. We all use various kinds of arguments all the time to take decisions in very different situations in our daily lives, for example in deciding whether to study linguistics, whether to buy a car or not, or where to go on holiday. In establishing grammatical descriptions, for example when deciding whether the word my should be analysed as a pronoun, rather than as a possessive adjective, we also need to make use of reliable arguments. We refer to this as syntactic argumentation. The aim of this chapter is to show how it plays a role in describing the grammar of English in many different ways. I will focus on grammar in the narrow sense, i.e. as referring to syntax, though some attention will also be paid to morphology (see also Chapters , , and ). My approach to grammar in this chapter is a relatively theory-neutral one, in which the distribution of the elements of language motivates grammatical description. In the next section I will look at some general principles that are used in syntactic argumentation, namely economy and elegance, which can be seen as two dimensions of simplicity. In sections . and . these principles will then be discussed in greater
detail using case studies. Section . looks at how argumentation plays a role in establishing constituency in clauses. Section . is the conclusion.
General principles
.................................................................................................................................. In investigating the world around us scientists broadly speaking adopt one of two methodologies. One involves induction, the other deduction. The former is a procedure whereby you start from the particular (e.g. observable facts) and create a theory of some phenomenon, based on those observations, whereas the latter is defined as a kind of reasoning that proceeds from the general (a hypothesis) to the particular. (See also Wallis, this volume.) The Scottish philosopher David Hume argued against the use of induction by formulating what has become known as the problem of induction. The Austrian-British philosopher of science Karl Popper (–) articulated the problem as follows (/): It is usual to call an inference ‘inductive’ if it passes from singular statements (sometimes also called ‘particular’ statements), such as accounts of the results of observations or experiments, to universal statements, such as hypotheses or theories. Now it is far from obvious, from a logical point of view, that we are justified in inferring universal statements from singular ones, no matter how numerous; for any conclusion drawn in this way may always turn out to be false: no matter how many instances of white swans we may have observed, this does not justify the conclusion that all swans are white. The question whether inductive inferences are justified, or under what conditions, is known as the problem of induction. The problem of induction may also be formulated as the question of the validity or the truth of universal statements which are based on experience, such as the hypotheses and theoretical systems of the empirical sciences. For many people believe that the truth of these universal statements is ‘known by experience’; yet it is clear that an account of an experience—of an observation or the result of an experiment—can in the first place be only a singular statement and not a universal one. Accordingly, people who say of a universal statement that we know its truth from experience usually mean that the truth of this universal statement can somehow be reduced to the truth of singular ones, and that these singular ones are known by experience to be true; which amounts to saying that the universal statement is based on inductive inference. Thus to ask whether there are natural laws known to be true appears to be only another way of asking whether inductive inferences are logically justified.
Although induction can play a role in science (see Wallis, this volume), modern science prefers the deductive approach, advocated by Popper. The idea here is
that in thinking about a particular phenomenon that requires explanation you start out with a hypothesis about its nature. You then try to falsify the hypothesis (i.e. undermine it) by looking incrementally at more evidence that pertains to the case at hand. This is the reason why this approach is also called the hypothesis-falsification approach. According to Popper, a hypothesis is only worth investigating if it is falsifiable in principle. In investigating a particular hypothesis by looking at more and more evidence you end up with an optimally refined hypothesis which offers the best possible account of the facts, but which may in principle still be wrong. How do you go about establishing which hypothesis about some phenomenon might be correct? Naturally, any account of that phenomenon needs to cover the empirical facts. But what do you do if you end up with two or more accounts which cover the empirical facts? The answer is that you can then assess those accounts using the notion of simplicity. What this means is that, other things being equal, a simple account of the grammatical facts is to be preferred over a more complex account. Culicover and Jackendoff (: ) have encapsulated this idea in their Simpler Syntax Hypothesis: The most explanatory syntactic theory is one that imputes the minimum structure necessary to mediate between phonology and meaning.
We can distinguish at least two closely related dimensions of simplicity, hinted at by Chomsky in various publications, namely economy (also called ontological simplicity or ontological parsimony) and elegance (sometimes called syntactic simplicity). Economy refers to the strategy of using as few categories as possible in a description of a phenomenon. As such it is a quantitative concept. Aristotle was one of the first philosophers to value it, but the idea became known much later as the Principle of Occam’s Razor, after the English philosopher William of Occam (c. –) who held that Entia non sunt multiplicanda praeter necessitatem (‘Entities should not be multiplied beyond necessity’). It is also called the Principle of Parsimony and, more informally, the KISS Principle (Keep it Short and Simple). Scientists have always advocated some version of it, as in the following quotation from Einstein (where he is discussing the theory of relativity in his autobiography): The theory of relativity is a fine example of the fundamental character of the modern development of theoretical science. The initial hypotheses become steadily more abstract and remote from experience. On the other hand, it gets nearer to the grand aim of all science, which is to cover the greatest possible number of empirical facts by logical deduction from the smallest possible number of hypotheses or axioms. (: )
In the field of linguistics Chomsky stressed the importance of economy in one of his earliest works: In careful descriptive work, we almost always find that one of the considerations involved in choosing among alternative analyses is the simplicity of the resulting grammar. If we can set up elements in such a way that very few rules need to be
given about their distribution, or that these rules are very similar to the rules for other elements, this fact certainly seems to be a valid support for the analysis in question. (Chomsky /: )
And more recently: I think we can also perceive at least the outlines of certain still more general principles, which we might think of as “guidelines,” in the sense that they are too vaguely formulated to merit the term “principles of U[niversal] G[rammar].” Some of these guidelines have a kind of “least effort” flavor to them, in the sense that they legislate against “superfluous elements” in representations and derivations. (Chomsky : )
The other guiding principle for grammar writing is elegance. This refers to how a grammar is optimally organized or streamlined, for example what kind of assumptions are made, whether it is internally consistent, etc. The concept of elegance is distinguished from economy in being qualitative, rather than quantitative, and as such is perhaps somewhat numinous and subjective, as Chomsky indicates in an interview from : In physics, one might ask the same question: why look for elegant answers? Everybody does, but you might ask: why do it? The reason they do it is an almost mystical belief that there is something about our concept of elegance that relates to truth, and that is certainly not logically necessary. Our brains might have been devised in such a way that what looks elegant to them is totally off base. But you really have no choice but to try to use the resources of your mind to find conceptual unification and explanation and elegance, hoping that by some miracle that is the way the world works. (Chomsky : )
In sections . and . we will see how recent developments in the study of syntax demonstrate how economy and elegance can play a role in grammatical descriptions.
Economy: less is more
.................................................................................................................................. The dictum ‘less is more’ reflects the idea that a description of any kind of system, in our case the system of grammar, is more highly valued if the description is achieved by positing as few concepts as possible, everything else being equal. This idea is encapsulated in the Principle of Occam’s Razor, mentioned in the previous section. In what follows I will look at some examples of categorial economy to illustrate this principle at work, specifically the word classes in English. But before I do so, I will first discuss how word classes can be established in the grammar of English. We can establish which word class a particular element belongs to by looking at its morphological make-up and how it behaves in relation to other words when it is used
in a sentence. The latter is referred to as the word’s distribution (see Chapter ). For example, we can assign a word like investigation to the class of nouns because, among other things, it has a suffix -ion, it allows a plural -s inflection, and it can be preceded by a determinative such as the.1 If this word is followed by a complement it must be in the form of a prepositional phrase, e.g. an investigation of his tax affairs. By contrast, the word investigate does not allow a plural inflection, and cannot be preceded by the. We instead assign this word to the class of verbs on the grounds that it can take other kinds of inflections, such as -ed for the past tense and past participle forms, and -ing for the present participle form. We can furthermore say that investigate can combine with a subject, which has the role of ‘doer’ or ‘instigator’ of the act of investigating, and an object with the role of ‘patient’, i.e. the person, group, etc., that is being investigated. We will not look at each of the English word classes in detail here (see Hollmann, this volume). Suffice it to say that a general principle that applies to setting up categories—not just word classes—is that you should not have too many of them, and you should not have too few. You will have too many word classes (called splitting) if you have missed some generalizations regarding the way certain words behave in a language; conversely, you will have too few word classes (called lumping) if it turns out that you have ignored important differences between them.
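The distributional reasoning sketched above can be made concrete with a small, purely illustrative sketch. The mini-corpus, the two diagnostic contexts, and the regular expressions below are invented for this illustration rather than taken from the chapter; the point is only that ‘looking at the company a word keeps’ can be operationalized as counting occurrences in noun-typical environments.

```python
import re

# A tiny invented corpus, used purely for illustration.
corpus = """
The investigation was thorough. Two investigations were opened last year.
They will investigate the claims. The police investigated the incident.
"""

def noun_like_evidence(stem: str, text: str) -> dict:
    """Count two noun-typical environments for a stem:
    (i) directly preceded by 'the', and (ii) bearing a plural -s."""
    text = text.lower()
    after_the = len(re.findall(rf"\bthe\s+{stem}\w*", text))
    plural = len(re.findall(rf"\b{stem}\w*s\b", text))
    return {"after 'the'": after_the, "plural -s": plural}

for stem in ("investigation", "investigate"):
    print(stem, noun_like_evidence(stem, corpus))
# In this toy corpus, investigation scores in both noun-typical contexts,
# while investigate scores in neither.
```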
.. Split or lump? Nouns and pronouns
The issue of whether to split or to lump needs to be faced by anyone who wants to describe nouns and pronouns in English. The question is: ‘Do nouns and pronouns form separate word classes in English, or is there just one class of nouns?’ Interestingly, the two most influential reference grammars of English, namely Quirk et al. () and Huddleston and Pullum (), take opposing views on this issue: the former recognizes two word classes, whereas the latter subsumes pronouns under nouns (see Chapter ). Let’s look at some arguments that could be put forward for arguing that there are two word classes (adapted from Aarts ), and then discuss if they are valid.
• Pronouns show nominative and accusative case distinctions (she/her, we/us, etc.); common nouns do not.
• Pronouns show person and gender distinctions; common nouns do not.
• Pronouns do not have regular inflectional plurals in Standard English.
• Pronouns are more constrained than common nouns in taking dependents.
• Noun phrases with common or proper nouns as head can have independent reference, i.e. they can uniquely pick out an individual or entity in the discourse context, whereas the reference of pronouns must be established contextually.
1 In this chapter I follow Huddleston and Pullum () in using determinative as a form label, and determiner as a function label.
Those who argue that we need recognize only a word class of nouns might respond to the points above as follows:
• Although common nouns indeed do not have nominative and accusative case inflections they do have genitive inflections, as in the doctor’s garden, the mayor’s expenses, etc., so having case inflections is not a property that is exclusive to pronouns.
• Indisputably, only pronouns show person and arguably also gender distinctions, but this is not a sufficient reason to assign them to a different word class. After all, among the verbs in English we distinguish between transitive and intransitive verbs, but we would not want to establish two distinct word classes of ‘transitive verbs’ and ‘intransitive verbs’. Instead, it would make more sense to have two subcategories of one and the same word class. If we follow this reasoning we would say that pronouns form a subcategory of nouns that show person and gender distinctions.
• It’s not entirely true that pronouns do not have regular inflectional plurals in Standard English, because the pronoun one can be pluralized, as in Which ones did you buy? Another consideration here is that there are regional varieties of English that pluralize pronouns (for example, Tyneside English has youse as the plural of you, which is used to address more than one person; see Chapter on varieties of English). And we also have I vs. we, mine vs. ours, etc.
• Pronouns do seem to be more constrained in taking dependents. We cannot say e.g. *The she left early or *Crazy they/them jumped off the wall. However, dependents are not excluded altogether. We can say, for example, I’m not the me that I used to be or Stupid me: I forgot to take a coat. As for PP dependents: some pronouns can be followed by prepositional phrases in the same way as nouns can. Compare: the shop on the corner and one of the students.
• Although it’s true that noun phrases headed by common or proper nouns can have independent reference, while pronouns cannot, this is a semantic difference between nouns and pronouns, not a grammatical one.
The conclusion that we can draw from the considerations above is that although pronouns are not prototypical nouns, i.e. they don’t necessarily share all the properties of typical nouns, a good case can be made that they belong to that word class nonetheless. There is one further argument that clinches the matter, and this is the fact that pronouns as heads of phrases distribute like noun phrases, e.g. in subject position, direct object position, and as complement of prepositions, as in () and ():
()
The kids love carrot cake. > They love it.
()
Tim sent it to his sister. > He sent it to her.
This is a strong reason for lumping pronouns and nouns into one category and recognizing only noun phrases in the grammar of English, not noun phrases and
pronoun phrases.2 So we see that we can argue successfully on grounds of economy in favour of recognizing a single word class of nouns, instead of having two word classes, nouns and pronouns. Figure . schematically shows lumping for nouns and pronouns.
[Figure . Lumping of nouns and pronouns: the separate classes ‘nouns’ (armchair, bicycle, door, extremity, garden, happiness, love; Bill, London, Susan, etc.) and ‘pronouns’ (I, me, you, he, him, she, her, we, us, they, them; my, mine, your, yours, her, hers, his, its, our, ours, their, theirs; some, someone, somebody, everyone, etc.) are merged into a single class ‘nouns’ containing all of these items.]
.. Economy of linguistic concepts: the so-called ‘gerund’
Let’s look at a further example of economy by discussing the notion of ‘gerund’ in English grammar, and the question of whether or not we need this concept. Consider the sentence below:
()
[This meeting of minds] will revitalize research into a cure for this illness.
2 We would be forced to recognize ‘pronoun phrases’, unless we allow for phrases that have a head belonging to a word class other than the one that they are named after, as do Quirk et al. () where pronouns function as head in noun phrases, despite the fact that these authors regard pronouns as a separate word class.
In this sentence we have the bracketed string this meeting of minds which functions as the subject of the sentence. Most linguists will agree that this is a noun phrase whose head is the noun meeting. How do we know that meeting is a noun? If we look at the company this word keeps—in other words if we look at its distribution—we find that it is preceded by the determinative this and is followed by the prepositional phrase of minds. This is a typical environment for nouns to occur in. Notice also the potential to change this string by adding, for example, an adjective in front of meeting, e.g. this wonderful meeting of minds, which is another property typical of nouns. Let’s now look at another sentence: ()
The committee will be meeting the candidates at p.m. today.
This case is very different from (), because here meeting is not a noun, but a verb. How can we tell? Again this is a matter of distribution. Notice that in () meeting has a noun phrase after it which functions as its direct object. Taking a direct object is a verbal property. The verb meet also has a subject, namely the committee, again a property of verbs (though not exclusively so). Furthermore, as with () we need to look at the potential for adding elements to test whether meeting is indeed a verb. In this connection, consider ():
() The committee will be meeting the candidates briefly at p.m. today.
Here we have added the manner adverb briefly, which can only modify a verb. (Substituting the adjective brief here would render the sentence ungrammatical.) Consider next ():
() [Meeting the candidates briefly] will not give us enough time to assess them.
Which word class do we assign meeting to in this example? This case is less obvious. You may well be thinking ‘this word is clearly also a verb here, for the same reasons that meeting is a verb in ()’, i.e. the verb has a direct object and is modified by a manner adverb. However, some grammarians would say that meeting in () is a gerund which they define as the -ing form of a verb when it (and any dependents it may have) are used in typical noun phrase positions (e.g. subject, object, etc.). Under this definition the word meeting in () is also a ‘gerund’, because the -ing form here is a noun functioning as the head of a noun phrase in subject position. Although these linguists would recognize that there is a difference between () and () by saying that the ‘gerund’ in () is not an ordinary noun, but a special kind of noun, often called a verbal noun, because it is derived from the verb meet, whereas meeting in () is verblike by virtue of taking a direct object, they would nevertheless insist that because of their typical position (as the head of a constituent in a noun phrase position), we should use the label ‘gerund’ for both () and (), as distinct from the participle form of the verb in (). There is a ‘cost’ to this reasoning because these authors need an additional item in their repertoire of grammatical concepts. What arguments do they
use to support their view? The rationale for labelling meeting in () and () as ‘gerunds’ is that historically both instances of this word derive from a different source than the present participle in (): ‘gerunds’ derive from nouns ending in -ing or -ung in Old English, whereas the present participle derives from a form that contains -nd (namely -end(e)/-and(e). For some recent accounts, see Denison : , De Smet : , Los : f.). However, in Present-Day English these different historical roots are no longer visible because both the ‘gerund’ and the present participle have exactly the same form.3 While it is true that the string meeting the candidates briefly in () occurs in a noun phrase position, and hence displays a noun-like property, the verbal properties of meeting outweigh the nominal properties. The question we should now ask is whether we need the concept of ‘gerund’ for an economical description of English. You will have realized where this discussion is heading, given that I have consistently used inverted commas around the word ‘gerund’, thus signalling that I do not give much credence to this concept. The reason for this is that the simplest account of examples ()–() is surely one in which only the word meeting in () is a noun (to be sure, a verbal noun), whereas the other instances of this word in ()–() are verbs (participles), despite the fact that examples like () do display some nominal properties.
. Elegance
.................................................................................................................................. In the previous section we looked at ways in which we can achieve economy in grammatical analyses. In this section I will exemplify how we can achieve elegance in grammatical descriptions by looking at some traditional treatments of the word classes and how they can be improved by using syntactic argumentation.
.. Reconceptualizing word class membership: prepositions and subordinating conjunctions
All grammars distinguish separate word classes of prepositions and subordinating conjunctions (or subordinators for short). In many frameworks the former class includes words such as in, on, through, and under, whereas the latter includes that, whether, if,4 and for, as well as a large number of items used to indicate a range of semantic notions, e.g. after, before, since, when, while, whilst, etc. (temporal subordinators); as, because, since (reason subordinators); although, (even) though, while, whilst (concessive subordinators); so (that), in order (that) (purposive subordinators), and so on.
3 Huddleston and Pullum () use the hybrid label gerund-participle to reflect the identity in form of the ‘gerund’ and the present participle, as well as the different historical roots of ‘gerunds’ on the one hand and the present participle on the other.
4 Both interrogative if (if inter), as in I wondered if he would call me, and conditional if (if cond), as in She will make a lasagna if you buy the ingredients.
In the linguistics literature a number of issues concerning both prepositions and subordinating conjunctions have been discussed extensively (e.g. Huddleston and Pullum ), two of which I will now look at in more detail (see also Chapter ).
Starting with prepositions, the traditional definition stipulates that they are typically followed by a noun phrase or pronoun to form a prepositional phrase, e.g. up the stairs, on the floor, with us, etc.5 Because under this view prepositions must be followed by NPs, elements such as up in She ran up or on in He moved on, as part of so-called phrasal verbs (both the literal and non-literal types, as in these examples), have not been analysed as prepositions, but as belonging to a different class, typically particles (or sometimes adverbs). We might ask if this is warranted. First of all, recognizing a new word class should be avoided—if we take Occam’s Razor seriously—unless there are good independent reasons for doing so. In this particular case, you will no doubt have noticed that the words up in look up and on in move on are used as prepositions in my earlier examples up the stairs and on the floor. The simplest account of up in up the stairs and look up, and of on in the combinations on the floor and move on, is surely one in which these words are assigned to the same word class, by virtue of the fact that in the constructions concerned up and on have exactly the same form.6 One way of doing so is by saying that up and on always belong to the class of prepositions, but that these prepositions can be transitive or intransitive. What this means is that they can occur with or without a complement following them, just like transitive and intransitive verbs. Consider the following contrast:
()
He ran [PP up [NP the stairs]], and put his coat [PP on [NP the floor]]. Here up and on are transitive prepositions functioning as the head of a PP and taking a noun phrase as complement.
()
She looked [PP up] and then moved [PP on]. Here up and on are intransitive prepositions functioning as the head of a PP that is licensed by the verbs look and move.
This account of prepositions not only achieves an economy, since we can now do away with a redundant word class of particles, but also makes the grammatical description more streamlined, because we do not need to make multiple statements about which word classes elements like on and up belong to in different grammatical contexts. These words look like prepositions, and in their transitive and intransitive
5 In the traditional account prepositions can also take prepositional phrases as complements, e.g. [PP from [PP inside the building]], but not finite clauses (see the discussion of example () below).
6 The simplest hypothesis about the data that you want to investigate is called the null hypothesis.
uses occur in similar positions, so it is reasonable to say that they belong to the same word class.7
Turning now to subordinators, these are defined as elements which function as markers to indicate that a particular clause is subordinate to another. In theoretical frameworks subordinators that introduce clauses that function as complement (e.g. of a verb or adjective) are called complementizers. This set includes only four items, namely that, whether, if inter., and for. Here are some examples:
()
I think [that this poem justifies his point]. (WA- )8
() Some may wonder [whether all this is the true stuff of citizenship]. (SA- ) ()
She asked [if she might see a hand-mirror], please, and when it was placed in her hand she started fussing, quite urgently, with her hair. (WF- )
()
I’m perfectly happy [for you to clap and sing and be as loud as you want], and if it makes you feel good then I’m really happy. (SA- )
The set of complementizers does not include the other elements listed above: after, before, since, when, while, whilst, as, because, although, (even) though, if cond., so (that), etc. The reason for this is that the latter typically introduce clauses that function as adjunct. In Aarts (: ) I call these items adjunctizers. Given this difference between complementizers and adjunctizers there have been recent proposals to reconceptualize the class of subordinators to include only the complementizers, and to re-assign the adjunctizers to another class. Which class would that be? Looking at the list of adjunctizers, the word class that comes to mind is ‘preposition’. If we take this step, then we have one more reason to allow prepositions to occur intransitively or transitively, as in the sentences below, using before as an example: ()
Have we met before? [used intransitively, without a complement]
() We met before the party. [used transitively, with an NP as complement]
()
We met before the party began. [used transitively, with a finite clause as complement]
7 Huddleston and Pullum (: ) recognize ‘particles’ as elements of what are often called ‘transitive phrasal verbs’, e.g. look (NP) up, work (NP) out. But confusingly they also say that ‘[t]he most central particles are prepositions – intransitive prepositions, of course, since they are one-word phrases’. For an account in which these ‘particles’ are exclusively analysed as intransitive prepositions, see Aarts ().
8 Examples with ID codes are taken from the British component of the International Corpus of English (‘S’ for spoken examples; ‘W’ for written examples).
[Figure . Reassignment of adjunctizers to the preposition class. The class of subordinators, previously comprising complementizers (that, if interr., whether, for) and adjunctizers (although, as, because, if cond., since, while, etc.), is reduced to the complementizers alone; the adjunctizers are reassigned to the class of prepositions, which already contains at, by, in, on, through, under, with, etc.]
Allowing for prepositions to be transitive or intransitive does not complicate the grammar because the notion of transitivity is needed in any case to describe verbs. We can visualize the reassignment of the adjunctizers to the class of prepositions as in Figure .. This may look like splitting, but it is not, because we already had an existing category of prepositions. What we have done is reconceptualize the class of subordinators, such that some of its supposed members now belong to the class of prepositions. This results in a simpler, more streamlined, and hence more elegant grammatical system, because it obviates the need to assign the same word (e.g. before in ()–()) to different word classes.
.. Avoiding meretricious lumping: adjectives and determinatives
Consider the following examples:
() the stories
() my stories
() untrue stories
As these examples show, the italicized words share the property of being placed before a noun, and for this reason they have often been lumped together into the adjective word class. The American grammarian George Curme (: ) writes as follows about adjectives:
There are two classes, descriptive and limiting. A descriptive adjective expresses either the kind or condition or state of the living being or lifeless thing spoken of: a good boy, a bright dog, a tall tree; a sick boy, a lame dog. The participles of verbs in adjective function are all descriptive adjectives, since they indicate either an active or passive state: running water, a dying soldier, a broken chair. A limiting adjective, without expressing any idea of kind or condition, limits the application of the idea expressed by the noun to one or more individuals of the class, or to one or more parts of a whole: this boy, this book, these books; this part of the country. In all the examples given above, the adjective stands before the noun. The adjective in this position is called an adherent adjective.
Curme, like Bloomfield () before him, recognizes nine classes of limiting adjectives (: –):9
• possessive adjectives, e.g. my, his, her, its, our.
• intensifying adjectives, e.g. myself, yourself, himself (as in e.g. Father himself).
• demonstrative adjectives, e.g. this, these, that, those, the same, such, the, both, each, every, all.
• numeral adjectives, e.g. four, fifty, first, second, third, last, twofold.
• relative adjectives, e.g. which, what, whichever, whatever.
• indefinite adjectives, e.g. a(n), all, every, some, many, little, few, enough.
• interrogative adjectives, e.g. what, which.
• proper adjectives, e.g. Harvard, as in Harvard student.
• exclamatory adjectives, e.g. what and what a, as in What nonsense and What a beautiful day!
Is Curme justified in lumping all these elements into one class of adjectives? To answer this question, let’s contrast his class of ‘descriptive adjectives’ with his ‘demonstrative adjectives’ and ‘possessive adjectives’. First, let’s look at the function of words such as the and my on the one hand, and untrue on the other. As Curme acknowledges, in this regard they are different. The former have what he calls a ‘limiting’ function, i.e. a specificational function, whereas the latter have a ‘descriptive’ function. Put differently, and using current terminology, the and my function as determiners inside the noun phrase, whereas untrue functions
9 These are discussed in a section headed ‘Limiting Adjectives Used as Pronouns’. I’ve used the version of the book, published online (see References).
as a modifier (or adjunct).10 From the point of view of meaning, the and my mark the NP in which they occur as definite, whereas untrue ascribes a property to the head noun. What about the distribution of these elements inside noun phrases? If we follow the principle that we assign elements to word classes on the basis of the company they keep, i.e. their distribution,11 then the following examples should shed some light on the question we asked: ()
*[the my] stories
() *[my the] stories
() untrue, vengeful, damaging stories
() *[completely/very the] stories
() *[completely/very my] stories
() completely/very untrue stories
() *(the) stories are the
() *(the) stories are my
() (the) stories are untrue
() *unthe
() *unmy
10 See Footnote on the determinative/determiner distinction, as well as Chapter .
11 See Chapter for alternative views.
From () and () we see that words like the and my cannot be used together, whereas descriptive adjectives can easily be stacked, as in (). Examples ()–() show that the and my cannot be preceded by an intensifying adverb like completely or very, whereas this is unproblematic for untrue. Examples () and () show that the and my cannot occur after a linking verb such as be, whereas this is perfectly fine for untrue, as () demonstrates. Notice also from () and () that the prefix un- cannot be added to the and my, whereas we can analyse untrue morphologically as consisting of the root true with the prefix un- added to it. All these observations lead us to conclude that the and my must belong to a different word class than untrue.
However, although you will agree that () is ungrammatical, you may have wondered about (), in which we have mine.
() The stories are mine.
Most grammarians (including Curme) regard this independently occurring word as a possessive pronoun, but if we compare () with (), we should ask whether it is possible to conclude that mine is an adjective. The answer is: no, not solely on the basis of comparing these two sentences, because the position after verbs like be is not
exclusively a position where adjectives occur; noun phrases can also be placed here, as in (): ()
These are my stories.
What’s more, mine can occur in typical noun phrase positions, such as subject, object and complement of a preposition:
() Mine are true, yours are a lie.
()
You have your stories, I have mine.
()
Your stories are similar to mine.
Finally, () shows that mine cannot be modified by an intensifying adverb: ()
*The stories are very mine.
These data all strongly suggest that the standard analysis of mine as a pronoun heading an NP is correct. However, consider next (): ()
It’s my favorite time to be alive, because the city feels like it’s completely mine. (Men’s Health, ., , from the COCA Corpus12)
This sentence is interesting, because here mine seems to be modified by completely, and (pro)nouns cannot normally be modified by an adverb.13 This is troublesome, because it now seems that we have some prima facie evidence in () that mine can behave like an adjective after all. However, that conclusion would be too hasty. We would need to examine further data to see how completely behaves in other constructions. If we do so, using attested data, we find that () is not all that exceptional, and that pronouns more generally can be preceded by completely. I found the following examples in the NOW Corpus:14
() I can’t change it and I’ve been completely me and I’m really happy with how I’ve put my best foot forward the whole time. (New Zealand Herald, April )
In between performing emotional ballads, Adele is completely herself on stage and tells some hilarious anecdotes. (Irish Times, June )
()
Rugby isn’t completely everything for me. (Irish Independent, March )
12 See https://www.english-corpora.org/coca/.
13 For some exceptions, see Payne, Huddleston, and Pullum ().
14 See https://www.english-corpora.org/coca/.
Now, even Curme would not wish to draw the conclusion that me, herself, and everything are adjectives in these examples. For him and most other linguists these words are pronouns (cf. : f.).15 But that leaves us with the problem of how to account for ()–(). If we insist that mine, me, herself, and everything are pronouns (and hence a kind of noun), how is it possible that these words seem to be modified by completely? A possible answer would be to say that mine is indeed a pronoun that functions as the head of a noun phrase, but that it is the noun phrase as a whole that is being modified by completely, as follows: ()
[NP completely [NP mine]]
At first sight, this looks suspiciously like an ad hoc solution, i.e. an analysis cooked up to provide a convenient, but ultimately implausible, solution to a problem. Why would it be implausible? Because there is no way of distinguishing between (), in which completely modifies the entire NP headed by mine, and (), in which completely modifies the adjective mine functioning as the head of an adjective phrase. ()
[ADJP completely [ADJ mine]]
However, a look at further data suggests that there may be some independent justification for an analysis along the lines of (), namely examples such as the following:
() That would be completely the wrong decision. (Daily Mail, May , from the NOW Corpus)
()
That would be the wrong decision completely.
In these examples we have completely modifying a noun phrase as a whole, and it can do so before or after the NP, as the bracketings below show: ()
[NP completely [NP the wrong decision]]
()
[NP [NP the wrong decision] completely]
In () and () the adverb completely does not modify the head of a phrase, as is the case in () (completely/very untrue), but an entire phrase. In Huddleston and Pullum (: ) such elements are regarded as peripheral modifiers which occur at the edges of NPs ‘mainly in initial position (before any predeterminer) but in a few cases in final position’. How are () and () relevant for the analysis of a string like completely mine? The reasoning goes as follows: if we need this kind of analysis for () and (), then we have independent evidence that an analysis like () is not as far-fetched as it at first
15 As noted above, Curme would label herself, as in e.g. the neighbour herself, as an ‘intensifying adjective’.
seemed, and that what initially looked like support for analysing mine as an adjective is in fact invalid. I hope that from the preceding discussion it is clear that Curme’s lumping strategy is specious: at first sight it looks as though an economy is achieved, since Curme does not need a category of determinative, but if we look carefully at the facts of English it cannot be defended. An account which does away with the class of limiting adjectives is more elegant, for robust semantic and syntactic reasons.
. Constituency
.................................................................................................................................. Syntactic argumentation also plays a role in grammars of English that recognize constituents, i.e. strings of words that behave like units (see Lohndal and Haegeman, and Borsley, this volume, and Aarts ). In the two subsections that follow I will discuss the two main tests for establishing constituency: movement and substitution.
.. Movement
The movement test for constituency relies on the idea that if a group of words in a clause is displaced, then that group of words must be a constituent. To illustrate, let’s assume that we wish to know whether the string of words the business dealings of Mr Frump forms a constituent in () below.
() The court will investigate the business dealings of Mr Frump.
A simple way in which we can test this is by moving the italicized string to the front of the clause, as is shown in () (the underscore symbol ‘_’ indicates the position from which the unit has been displaced):
()
The business dealings of Mr Frump, the court will investigate _ .
This process is called topicalization because the fronted unit becomes the topic of the clause.16 Another way in which to transform () is to make it passive. If we do so the direct object in () becomes the subject in (): ()
The business dealings of Mr Frump will be investigated by the court.
The data in () and () show that the business dealings of Mr Frump indeed behaves like a unit. As this unit has a noun as its most prominent element, we call it a noun phrase.
16 On topicalization, see also Kaltenböck (this volume).
Notice that we can also show that the court is a noun phrase by changing () into an interrogative structure: ()
Will [the court] investigate the business dealings of Mr Frump?
We have inverted the position of the subject the court and the auxiliary verb will here, and this shows that the words the and court belong together. Consider next ():
() The politicians said that they will change the world.
We now wish to investigate how to carve up the subordinate clause that they will change the world. Specifically, we might ask: do the two verbs will and change together form a unit as some kind of complex verb, or do we group will, change, and the world together? Or perhaps only change and the world? The way to test this is to try out a few movements (as before the symbol ‘_’ indicates the position from which the unit has been displaced):
()
*They said that they will change the world, and [will change] they _ the world.
()
*They said that they will change the world, and [will change the world] they _ .
()
They said that they will change the world, and [change the world] they will _ .
The data show that change the world behaves like a unit, but will change and will change the world do not. As its most prominent element is a verb, we call this unit a verb phrase.
.. Substitution
The substitution test for constituency relies on the idea that if a group of words in a clause is replaced by a so-called pro-form, then that group of words must be a constituent. Consider again sentence () above. Notice that we can replace the subject the court by the pronoun it and the italicized direct object string by them:
()
It will investigate them.
In this case we have a specific kind of pro-form, namely pronouns, replacing noun phrases (see also () and () above). Verb phrases too can be replaced by pro-forms, as the following example illustrates: ()
The court will investigate the business dealings of Mr Frump, and the FBI will do so too.
Here the pro-form do so in the second clause replaces the verb phrase investigate the business dealings of Mr Frump, again showing that these words together form a unit. Apart from movement and substitution, there are a number of further ways in which we can use syntactic argumentation to test constituency. These are discussed extensively in Aarts ().
. Conclusion
.................................................................................................................................. In this chapter I began with a discussion of some of the principles of syntactic argumentation. After a discussion of induction and deduction we looked at a number of case studies which demonstrated how the notion of simplicity, which encompasses economy and elegance, helps us to decide between competing accounts of a particular linguistic phenomenon or issue, for example how to set up optimally defined word class categories which can be independently justified, without having to resort to ad hoc solutions. In the second part of the chapter we saw how syntactic argumentation can be used to establish the constituent structure of sentences. By manipulating possible and impossible structures, as we did in section ., argumentation can be a useful heuristic for establishing the principles of syntactic structure in a language.
Acknowledgements
I’m grateful to my fellow editors, internal referee Jon Sprouse, and an anonymous referee for comments on this chapter. Thanks also to Bob Borsley for some useful references.
......................................................................................................................
......................................................................................................................
. ̈
. Introduction
.................................................................................................................................. Over approximately the past twenty years, linguists have taken a renewed interest in the data that underlies grammatical theories. This has taken the form of large scale reviews of data in syntax (e.g., Schütze ), textbooks (e.g., Cowart ), proposals for new experimental techniques (e.g., Bard et al. ), enticements to widespread adoption of formal experimental methods (e.g., Featherston ), proposals for new grammatical models that can accommodate different data types (e.g., Keller , Featherston ), criticisms of informally collected data (e.g., Gibson and Fedorenko ), and finally chapters like this, which attempt to provide a useful summary of our current state of knowledge about data in syntax. Here we will discuss five types of data: corpus data, acceptability judgements, reading times (self-paced reading and eye-tracking), electrophysiological methods (EEG and MEG), and haemodynamic methods (specifically fMRI). It is important to note that in principle, there is no privileged data type in linguistics. Any data type that can bear on the nature of the grammar is a potential source of data for theorists. The wide spectrum of possible data types arises because many linguists assume that there is a single grammatical system that plays a substantive role in both language comprehension and language production (Marantz () calls this the Single Competence Hypothesis). Given this assumption, all language behaviours, and all neurobiological responses related to language behaviours, are potential sources of information about the grammar. We will focus on these five data types because they play the largest role in modern studies of grammar. In the sections that follow, we will discuss each data type in turn. The approach that we take for each data type will be slightly different, because each data type has historically played a different role in grammatical theory construction.
. Corpus data
.................................................................................................................................. Chapter by Sean Wallis in this volume provides an excellent discussion of the use of corpus data in linguistics. We won’t double that effort here (and couldn’t do nearly as good a job). Instead, we will briefly discuss some potential disadvantages of corpus data that lead some linguists to explore other data types. For example, two fundamental questions in studies of grammar are (i) Is a given sentence grammatical or ungrammatical (i.e., possible or impossible in the language)?, and (ii) For ungrammatical sentences, what is the property that causes the ungrammaticality? Because corpus data is fundamentally observational, it cannot be used to definitively answer these two questions. For the first question, one potential strategy would be to use the presence or absence of a sentence in the corpus as a proxy for grammaticality or ungrammaticality, respectively. The problem with this approach is that presence and absence are influenced by factors other than the grammar: it is possible for grammatical sentences to be absent from a corpus simply due to sampling (the relevant sentence was accidentally not uttered during corpus generation), and it is possible for ungrammatical sentences to be present in a corpus due to speech errors or the inclusion of nonnative speakers in the sample. This leads linguists interested in the first question to seek out data types that more directly test the impact of the grammar while controlling for other factors that might influence the outcome—in other words, a controlled experimental setting. For the second question (what properties cause the ungrammaticality), the concern among some linguists is that corpus data can only reveal correlations between grammatical properties and presence/absence in the corpus; corpus data cannot directly reveal causal relationships. Again, this leads linguists who are interested in the mechanisms underlying ungrammaticality to seek out controlled experimental methods that can be used to directly manipulate grammatical properties to reveal causal relationships with grammaticality. As Chapter demonstrates, there are plenty of interesting research questions one can investigate using corpus data, but for linguists interested in ungrammaticality and its mechanisms, corpus data tends to be less useful than experimental methods.
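As a minimal sketch of the ‘presence as a proxy’ strategy just described (the two-sentence sample and the search patterns below are invented for illustration, not drawn from any real corpus), note that a zero count is ambiguous: it arises both for a string that is ungrammatical and for one that is perfectly grammatical but simply was not sampled.

```python
import re

# An invented two-sentence 'corpus' standing in for a real sample.
sample = [
    "What do you think that Jack bought?",
    "I wonder whether all this is true.",
]

def count_matches(pattern: str, sentences: list[str]) -> int:
    """Count how many sentences contain a regular-expression pattern."""
    return sum(bool(re.search(pattern, s, re.IGNORECASE)) for s in sentences)

# Both searches return 0: the first string is (by hypothesis) ungrammatical,
# the second is grammatical but happens not to occur in this tiny sample.
print(count_matches(r"wonder whether \w+ bought", sample))    # 0
print(count_matches(r"think that \w+ bought a car", sample))  # 0
```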
. Acceptability judgements
..................................................................................................................................
.. What are acceptability judgements?
An acceptability judgement is simply the act of judging whether a sentence is ‘acceptable’ in a given language. But this simple definition belies the complexity of what one means by ‘acceptable’. One common assumption in linguistics is that the act of comprehending a sentence automatically (in the sense of an automatic cognitive process) gives rise
to an evaluation of that sentence along multiple dimensions: the grammaticality of the sentence, the plausibility of the meaning, the processing difficulty associated with comprehending the sentence, etc. Another common assumption is that those multiple dimensions tend to be (automatically, and therefore subconsciously) combined into a single percept. It is that multi-dimensional percept that linguists call ‘acceptability’. Acceptability judgements can then be defined as a conscious report of the automatic evaluation of the acceptability of a sentence, elicited for experimentally designed sentence types (either in an informal setting, as has been typical in linguistics, or in a formal experiment, as has become more common over the last two decades). In short, acceptability judgements are a behavioural response that can reveal information about the grammaticality of a sentence, if the experimenter controls for other factors that influence acceptability (plausibility, processing, etc.). Therefore, the goal of an acceptability judgement experiment is to isolate a potential difference in grammaticality between two (or more) conditions, while holding differences in other properties constant.
There are at least three pieces of information that acceptability judgements can potentially provide that are relevant for constructing grammatical theories. The first is the presence/absence of an effect—whether there is a difference in acceptability between two (or more) conditions. In many ways this is the minimum piece of information that may be relevant for a grammatical theory. Assuming that the two (or more) conditions were well controlled, such that the only difference between them was the grammatical property of interest, the presence of a difference tells us that the grammatical property has an effect on acceptability. The second potential piece of information is the effect size—the size of the difference between conditions. Again, assuming that the conditions were well controlled, the effect size tells us how big an impact the grammatical manipulation has. The third potential piece of information is the location of the conditions on the scale of acceptability.1 These three pieces of information can be used in different combinations by linguists to construct and evaluate grammatical theories.
Figure . highlights these three pieces of information. The raw data for Figure . comes from sentence types (conditions) that form pairwise phenomena. For example, one of the phenomena is exemplified by these two sentences (from Sobin ):
() a. Some frogs and a fish is in the pond.
    b. Some frogs and a fish are in the pond.
By hypothesis, the sentence in (a) is ungrammatical because it does not respect the subject-verb agreement properties of English (whereas (b) does). Sprouse et al. ()
1 Location on the scale is a complex topic in its own right, as the way that participants use a scale will be influenced by the instructions that they are given (How are the points on the scale labelled? Are example items given for the points on the scale?) and the content of the experiment itself. See Schütze and Sprouse for a discussion of the details of acceptability judgement tasks.
[Figure . Three illustrations of the types of information available from acceptability judgements. Raw data is pairwise phenomena randomly sampled from Linguistic Inquiry and tested by Sprouse et al. (). The leftmost panel (mean judgement, z-transformed, plotted by conditions ordered by judgement) shows location information. The middle panel (effect size, Cohen’s d, plotted by phenomena ordered by effect size) shows effect size information. The rightmost panel (mean judgement, z-transformed, plotted by phenomena ordered by effect size) combines the two. The middle and rightmost panels also convey the presence/absence of an effect.]
tested this reported contrast and others that were randomly sampled from the journal Linguistic Inquiry in formal judgement experiments using a seven-point Likertlike scale (see also section .. below). In Figure ., we use the phenomena that showed statistically significant effects to illustrate the three pieces of information that judgement experiments make available to linguists. The leftmost panel of Figure . takes all sentence types, and orders them in ascending order according to their mean judgement (after a z-score transformation, which removes some types of scale bias; see Schütze and Sprouse for more discussion). This panel therefore highlights location information: some sentences are very clearly on the low end of the scale, others at the high end, and still others in the middle. The middle panel of Figure . highlights effect sizes. Each vertical bar represents one pairwise phenomenon, with the height of the bar representing the size of the effect, which reflects the size of the difference between the mean ratings of the two conditions in each phenomenon (reported here as Cohen’s d, which standardizes the raw effect sizes by dividing them by the standard deviation). The rightmost panel combines these two pieces of information by plotting the ratings of the two conditions in each phenomenon in a vertical pair (thus showing location information), and connecting them with a line (thus showing raw effect size information). Both the middle panel and the rightmost panel also highlight the presence/absence of an effect in a way that the leftmost panel does not. In practice, which of the three pieces of information the linguist decides to use depends upon the specific grammatical theory being investigated, and the way it is used depends on the specific argument that the linguist wishes to make (see Chapter by Bas Aarts on linguistic argumentation). That said, some basic patterns of use do emerge. First, all studies using acceptability judgements report the presence/absence of an effect, as this is the minimum required to determine whether a grammatical manipulation is potentially relevant to a theory. Second, grammatical theories that rigidly divide sentences into two types (grammatical and ungrammatical) tend to also use location on
the scale as a potential piece of information about whether a sentence should be classified as grammatical (high on the scale) or ungrammatical (low on the scale). The use of scale location information is complex in its own right because there is no necessary connection between location on the scale and grammaticality. Even if extra-grammatical properties are controlled across the conditions in the experiment as discussed above, participants can rate an individual sentence high or low because of the extra-grammatical properties of that individual sentence. This is particularly salient in classic examples of mismatches between acceptability and grammaticality, such as the acceptable-but-ungrammatical comparative construction in () (from Montalbetti ), which appears acceptable but has no coherent meaning (Wellwood et al. ), and the unacceptable-but-(by-hypothesis)-grammatical doubly centre-embedded relative clause in () (from Miller and Chomsky ), which appears unacceptable, but is most likely unacceptable due to the difficulty of processing the centre-embedded relative clauses.
() More people have been to Russia than I have. (acceptable-but-ungrammatical)
() The food the dog the cat scratched ate spoiled. (unacceptable-but-grammatical)
Despite the possibility of mismatches between location information and grammaticality, it is not uncommon for location information to play a role in linguistic argumentation about grammars that divide sentences into two sets (e.g., an author might assume a transparent mapping between the two halves of the scale of acceptability and ungrammaticality/grammaticality). The third piece of information, effect size information, tends to play a role in studies that seek to compare the constraints that make up the grammar. This can either be because the grammatical theory under investigation distinguishes more than two levels of grammaticality (e.g., Keller , Featherston ), or because the investigation is exploring properties that might distinguish two constraints from one another (e.g., Chomsky a).
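To make the two quantities plotted in Figure . concrete, here is a small numerical sketch (the ratings are invented, and the pooled standard deviation used for Cohen’s d is only one of several possible choices): raw ratings are first z-transformed to remove differences in how participants use the scale, and the difference between condition means is then divided by a standard deviation to give a standardized effect size.

```python
from statistics import mean, pstdev

def z_transform(ratings):
    """Convert one participant's raw ratings to z-scores (mean 0, sd 1),
    removing differences in how individuals use the response scale."""
    m, sd = mean(ratings), pstdev(ratings)
    return [(r - m) / sd for r in ratings]

def cohens_d(cond_a, cond_b):
    """Difference between condition means divided by the pooled
    (population) standard deviation of the two conditions."""
    pooled_sd = ((pstdev(cond_a) ** 2 + pstdev(cond_b) ** 2) / 2) ** 0.5
    return (mean(cond_a) - mean(cond_b)) / pooled_sd

# Invented 7-point ratings from one participant: four tokens of a
# (by hypothesis) grammatical condition, then four of an ungrammatical one.
raw = [6, 7, 5, 6, 2, 3, 1, 2]
z = z_transform(raw)
grammatical, ungrammatical = z[:4], z[4:]
print(round(cohens_d(grammatical, ungrammatical), 2))  # large d for this toy contrast
```

In a real experiment the z-transformation would typically be applied to each participant’s full set of ratings (fillers included) before condition means are computed; the toy numbers here are deliberately extreme, so the resulting d is large.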
.. Are the acceptability judgements published in the literature valid?
Perhaps the most important debate in the acceptability judgement literature concerns the validity of the judgements that have been published in the literature so far. Since the earliest days of acceptability-based linguistic theories there has been a concern that the relatively informal methods that linguists tend to use to collect acceptability judgements might lead to invalid data, and therefore incorrect theorizing (e.g., Hill , Spencer ). Over the past twenty years, as formal experimental methods for judgement collection have gained in popularity, this question has arisen more and more often, with many linguists asking whether the field should shift entirely to formal acceptability judgement collection methods (e.g., Bard et al. , Schütze , Cowart , Ferreira , Featherston , Gibson and Fedorenko , Gibson and Fedorenko , among many others). Nearly every linguist who
has written about this issue agrees that there are benefits to formal acceptability judgement experiments, making them an important tool in the syntactician’s toolkit. What is less clear is whether informal judgement collection methods should co-exist with formal experimental methods in that toolkit, and whether the informally collected judgements that have been published in the literature to date should be considered valid.
The central concern (e.g., from Gibson and Fedorenko ) is that linguists tend to solicit informal judgements from other linguists. Because professional linguists are aware of the theoretical issues at play for a given judgement, it is possible that their judgements might be (subconsciously) biased. A related concern is the fact that linguists tend to collect small numbers of judgements, leading to the possibility that the results they obtain are not representative of the population. Similarly, linguists tend to use a small number of example items, leading to the possibility that the results they obtain are not representative of all of the possible tokens of a given construction. If any of these potential problems were actual problems, the published judgement literature would not be a solid foundation for constructing grammatical theories.
Sprouse et al. () and Sprouse and Almeida () took a first step toward investigating these concerns by directly comparing published informal judgements with formal judgements that they collected using the best practices of formal experimental work (e.g., naïve participants, large samples, multiple items per condition, frequentist and Bayesian statistical analyses, etc.). Their argument is that, although this does not settle the question of which method is best, it does begin to address the question of how much impact these design choices might have on the data. If the results overlap substantially, then either both methods are valid or both are invalid. If the results diverge substantially, then either one method is valid and one is invalid, or both are invalid in different ways. Sprouse et al. () found a per cent ( per cent) overlap when they re-tested a random sample of two-condition phenomena ( sentence types) from the journal articles published in Linguistic Inquiry (LI) between and . Sprouse and Almeida () found a per cent overlap when they re-tested all of the data points published in Adger’s () Core Syntax textbook. Taken together, these results suggest that data from informal and formal methods overlap to a very high degree. This suggests that replacing informal judgement data with formally-collected judgement data would have little impact, at least for work on English syntax. Of course, studies such as these are just the beginning of these kinds of investigations. Future work should explore other languages, other judgement types (e.g., semantic judgements), and other facets of the data (e.g., gradience).
.. Using formal judgement methods to explore the nature of the grammar
Formal judgement experiments are not limited to investigating the validity of informal judgements; they can also be used to push the boundaries of the theory of grammar
itself. One way formal judgement experiments can do this is by using higher-order experimental designs to isolate and quantify putative grammatical effects while controlling for many of the other factors that potentially influence acceptability judgements. As a concrete example, let us consider the whether-island violation in (), which we have labelled with a star to indicate that it is generally judged to have low acceptability in US English:2 ()
*What do you wonder whether Jack bought __ ?
One common analysis in the generative literature is to postulate a grammatical constraint that prevents movement of a wh-phrase like what out of an embedded question—the embedded question is metaphorically an ‘island’ for this kind of movement (Chomsky a, Ross , Chomsky ), so this is called a whether-island violation. If we wanted to study the necessity of this constraint, we would first want to isolate the effect of moving out of the embedded whether-clause over and above other possible effects that are in this sentence, such as the effect of having a long-distance movement in a sentence, and the effect of having an embedded question in a sentence (both of which might lower acceptability independently of the potential island effect). A factorial design can solve this problem by creating a sequence of subtractions of ratings that eliminate the extra confounds. Take the four sentences in ():
() a. Who __ thinks that Jack bought a car? [declarative | short]
   b. What do you think that Jack bought __ ? [declarative | long]
   c. Who __ wonders whether Jack bought a car? [interrogative | short]
   d. What do you wonder whether Jack bought __ ? [interrogative | long]
If we make the subtraction (a – b), we isolate the effect of a long-distance as opposed to a short-distance (local) movement. The difference (a – c) isolates the effect of the embedded question without any movement out of it. The difference (a – d) combines three effects: the effect of a long-distance movement, the effect of an embedded question, and crucially, the effect of moving out of an embedded question (the island effect). By subtracting the first two from the third, we can isolate the island effect, as in (): () island effect = (a – d) – (a – b) – (a – c) In this way, a series of three subtractions can isolate island effects even though there are two other effects present in the critical sentence. This design is called a factorial
2
In this and subsequent examples, the underscore indicates the position from which a wh-phrase has moved, according to movement-based theories of generative syntax.
OUP CORRECTED PROOF – FINAL, 23/10/2019, SPi
design because there are two factors, and , and each factor has two levels (declarative vs. interrogative and short vs. long). There are a number of benefits to factorial designs. The most important, of course, is allowing us to isolate the effect of interest. They also allow us to isolate two other effects, one for each factor in the design. Further, they lend themselves to a perspicuous visual interpretation, as illustrated in the two contrasting hypothetical outcomes in the two panels of Figure .. If there is no effect unique to the island violation in (d) over and above the effects of the two other factors, plotting the four conditions in a pattern known as an interaction plot will yield parallel lines, as in the left panel of Figure .. However, if there is a further effect over and above those two factors, such as an island effect in the design in (), the two lines will not be parallel, as in the right panel of Figure .. Another benefit of factorial designs is that they allow for the use of relatively standard statistical tests such as two-way ANOVAs and omnibus linear mixed effects models. The effect of interest (e.g., island effect) in these designs will show up as the interaction term. One final advantage of factorial designs is that, though they only quantify three effects (one for each factor plus the interaction effect), they theoretically allow us to control for an infinite number of potential confounds as long as the confounds are distributed across conditions such that they subtract out at the end of the subtraction steps. The theoretical value of the factorial design really becomes apparent when one considers analyses that attempt to explain acceptability judgement phenomena as being caused by factors outside the grammar. For example, Kluender and Kutas (b) proposed that island effects (the interaction in the right panel of Figure .) may not be due to syntactic constraints in the grammar, but rather to the demands of the two independent processing challenges of a long-distance dependency and an embedded question interacting with each other, perhaps because they both draw on the same limited pool of working memory resources, such that the parser’s attempt to process both simultaneously leads to a larger cost than one would expect from the
Figure . The visual logic of factorial designs, illustrated using the whether-island effect design from (). The left panel (‘no island effect’) shows main effects for the two factors (structure and distance), but no island effect over and above those two factors. The right panel (‘island effect’) shows an island effect (interaction) over and above those two main effects. Both panels plot mean judgement against distance (short vs. long), with separate lines for the declarative and interrogative conditions.
With a factorial design, we can isolate that effect (the superadditive interaction), and attempt to study its properties to see if it is more likely to come from the grammar or more likely to come from the working memory system. For example, Sprouse et al. () looked to see if the superadditive effects for four island effects in English correlate with two working memory measures, which is one possible prediction of the Kluender and Kutas (b) proposal.

Factorial designs also have the potential to reveal novel data for the construction of grammatical theories—such as the existence of superadditive interaction patterns without any single sentence being in the lower half of the acceptability scale (e.g., Featherston , Sprouse et al. , Villata et al. ). What is interesting about these effects is that the superadditivity suggests the possibility of a grammatical constraint at work, but the fact that all of the sentences are in the top half of the acceptability scale suggests that this potential constraint is not causing the extreme unacceptability that is typically associated with constraint violations. The question facing the field is how to capture these types of effects in grammatical theories. This is one of the current areas of debate in the acceptability judgement literature.
. Reading times: self-paced reading and eye-tracking
.................................................................................................................................. The next three data types that we will discuss (reading times, electrophysiological measures, and haemodynamic measures) are far more prevalent in psycholinguistics and neurolinguistics, two domains that focus on language processing, than theoretical linguistics, which focuses on grammatical theories. This means that the value of these data types is primarily tied to the relationship between theories of language processing and theories of grammar. Nonetheless, we believe it is valuable to provide brief reviews of these three data types, not only because readers may encounter them in the literature, but also because we believe that the future of linguistics is one where there is a closer integration between theories of grammar and theories of language processing. There are two primary reading time measures used in psycholinguistics: self-paced reading and eye-tracking. The most prevalent version of self-paced reading uses what is called a moving window to more accurately mimic natural reading (Just et al. ). In a moving window self-paced reading task, the words of a sentence are replaced with underscores, one per character. The underscores representing the sentence are presented in their entirety on the screen. The participant can then reveal each word in succession by pressing a key on a keyboard or a button on a response box. The computer measures the amount of time between button presses, yielding a measure of the amount of time (typically in milliseconds) that it takes to read each word. In reading-based eye-tracking, the entire sentence is presented on screen (without masking by underscores). A sophisticated camera then records the movements of the participant’s pupils as they read the sentence naturally. Analysis algorithms can then
be used to determine how long the participant fixated on each word (or multi-word region) of the sentence, as well as several secondary measures such as if (and for how long) the participant looked back to previously read material.

Though the method of collection of reading times differs between self-paced reading and eye-tracking, the logic applied to these two data types is the same: one can compare the reading times between two sentences that differ by a specific property of interest at one or more critical words; if the critical word(s) differ in reading times between the two sentences, then one can infer that the processes deployed to understand those sentences at the critical word(s) differed in either quality or quantity. Reading time measures such as self-paced reading and eye-tracking have been used in this way to build complex theories of sentence processing.

As the previous paragraph makes clear, reading times are primarily used to investigate sentence processing, not grammatical theory. But under the assumption that there is a predictable relationship between grammar and sentence processing (setting aside the exact nature of that relationship), it is possible to look for the consequences of proposals from grammatical theories in real-time sentence processing. If found, these effects can increase our confidence in such proposals, as there would now be convergent evidence from multiple sources. One concrete example of this is a series of findings showing that the human sentence processor appears to respect syntactic island constraints (briefly discussed in section .) in real time during incremental sentence processing.

These studies are predicated upon the Active Filler Strategy (Frazier and Flores d’Arcais ), a parsing strategy that appears to be operative in most, if not all, human languages. In effect, the Active Filler Strategy says that the parser attempts to complete long-distance dependencies at the first viable location. For the wh-dependencies that characterize questions in English, this means that the parser will attempt to associate the wh-word (called the filler in the sentence processing literature) with the first location that could potentially host the wh-word (this location is often called the gap, with the full dependencies called filler-gap dependencies)—typically following a verb or preposition.

The Active Filler Strategy leads to several useful effects in sentence processing. For space reasons, we will use only one as an example: the filled-gap effect. In the filled-gap effect, participants experience a reading time slowdown when presented with filler-gap dependencies in which the first potential gap location is occupied (or filled) by another argument (Crain and Fodor , Stowe ). For example, Stowe () found a reading time slow-down using self-paced reading at the critical word us in sentence (a), which contains a filler-gap dependency, relative to sentence (b), which contains no dependency. Under the Active Filler Strategy, the parser initially associates the filler who with the object of the verb bring, but when us is encountered, the parser must reanalyse the structure (and eventually associate the filler with the object of the preposition to), leading to a reading time slow-down, which is called the filled-gap effect.

()  a. My brother wanted to know who Ruth will bring us home to __ at Christmas.
    b. My brother wanted to know if Ruth will bring us home to Mom at Christmas.
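The following sketch illustrates the reading-time logic just described. The key-press timestamps, per-participant reading times, and the comparison at the critical word us in (a) versus (b) are all invented; this is a schematic paired comparison, not a reproduction of Stowe’s analysis.

    # Self-paced reading: each key press reveals the next word, so per-word
    # reading times are differences between successive key-press timestamps.
    import numpy as np
    from scipy import stats

    timestamps_ms = [0, 350, 710, 1090, 1480, 1900]   # invented presses for one trial
    word_rts = np.diff(timestamps_ms)                 # -> [350, 360, 380, 390, 420]

    # Filled-gap comparison: reading time at the critical word ("us") for each
    # participant in the dependency condition (a) vs. the control condition (b).
    rt_dependency = np.array([520, 480, 610, 555, 590])   # invented
    rt_control    = np.array([450, 430, 540, 500, 520])   # invented

    t_value, p_value = stats.ttest_rel(rt_dependency, rt_control)   # paired test
    print(word_rts, rt_dependency.mean() - rt_control.mean(), t_value, p_value)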
In this way, the filled-gap effect is evidence that the Active Filler Strategy is operative, and that the location of us is considered a viable gap location by the parser. Having established the filled-gap effect as a consequence of the Active Filler Strategy, psycholinguists are able to probe the processing consequences of syntactic island constraints. Syntactic island constraints prohibit gaps from occurring inside of certain structures. The Active Filler Strategy attempts to associate a filler with the first gap location that the parser encounters. This leads to the following question: Will the parser attempt to posit a gap inside of an island structure? Stowe () investigated this question for the Subject Island Constraint using the filled-gap paradigm. The Subject Island Constraint (Huang ) prohibits gaps inside of complex subjects in English as in (): ()
*What did [the joke about __ ] offend the audience?
Stowe () looked for a filled-gap effect inside of complex subjects using (a), which contains a filler-gap dependency with a filled-gap location in the complex subject, and (b), which contains no filler-gap dependency (thus acting as a control condition). If the Active Filler Strategy attempts to posit gaps inside of complex subjects (contrary to the Subject Island Constraint), there should be a filled-gap effect (a reading time slowdown) at the noun Greg’s. If the Active Filler Strategy respects the Subject Island Constraint, there should be no filled-gap effect.

()  a. The teacher asked what the silly story about Greg’s older brother was supposed to mean.
    b. The teacher asked if the silly story about Greg’s older brother was supposed to mean anything.

Stowe () found no filled-gap effect in this paradigm (but did find an effect with similar complex noun phrases in object position), suggesting that the parser respects the Subject Island Constraint in real time by suppressing the Active Filler Strategy when gap locations are prohibited by the grammar. This basic finding has been replicated using both the filled-gap effect and other consequences of the Active Filler Strategy, such as the plausibility effect, for a number of islands and a number of languages (see Phillips for a comprehensive list as of that time). It has also led to a number of more sophisticated explorations of the interaction of syntactic island effects and sentence processing, including the effect of parasitic gaps (Phillips ), and differences between parasitic gaps and across-the-board movement (Wagers and Phillips ).

The literature on the Active Filler Strategy has also inspired similar investigations in the domain of pronoun coreference, where a similar active search strategy has been found with respect to the search for antecedents for pronouns (van Gompel and Liversedge ), using a reading time slow-down effect similar to the filled-gap effect (called the gender mismatch effect). This literature has also explored the effect of grammatical constraints on coreference
dependencies, called Binding Constraints (Chomsky ), on this active search for an antecedent (Sturt , Kazanina et al. ). There is even recent work exploring a similar notion of an active search for the antecedent in ellipsis constructions (Yoshida et al. ). For space reasons we can’t summarize them all here, but the general logic is the same in each—one can use a well-established reading time effect to probe the interaction of a parsing strategy with a constraint from the grammatical literature.
. Electrophysiology: EEG and MEG
..................................................................................................................................

Much like reading times, electrophysiological responses are primarily used in the psycholinguistics literature to investigate sentence processing. They are rarely used to investigate the grammar directly. That said, similar to reading times, the assumption of a predictable relationship between grammar and parser can allow some limited forms of inference (or at least corroboration) to flow between the two fields. To be fair, there has been much less of this in the electrophysiological literature than the reading time literature. This doesn’t seem like a necessary fact, so in the spirit of making this chapter maximally useful to future researchers, we will review some of the findings in the EEG and MEG literature that show the most promise for connections with grammatical theory.

Electroencephalography (EEG) and magnetoencephalography (MEG) are two sides of the same coin: EEG measures (some of) the electrical activity generated by the brain (typically through electrodes placed on the scalp; though also potentially through electrodes placed directly in the cortex), and MEG measures the magnetic fields generated by (some of) the electrical activity of the brain (through sensors positioned around the head). The analysis of EEG and MEG results is relatively complicated, but the basics are as follows. First, the electrical activity of the brain is an alternating current, which means that both EEG and MEG data can be characterized as time-varying waves (either the oscillations of electrical voltage in EEG, or of magnetic fields in MEG). This means that all M/EEG data analysis can take advantage of the mathematics of waves. Second, nearly all research in neurolinguistics is event-related. This means that the measurements are time-locked to a specific event (such as the presentation of a word), so that neurolinguists can explore the effects of that event on cognition. Third, M/EEG activity can be divided into two types: evoked activity and induced activity. Both evoked and induced activity are event-related, which means that the electrophysiological response is recorded relative to specific (experimentally controlled) events such as the presentation of a word. The difference between the two types of activity lies entirely in the amount of phase-locking across tokens of the event. Evoked activity is activity that is phase-locked to the event: the peaks of the waves from one token of the event line up with the peaks of the waves of another token (and the same holds for the troughs of the waves). Induced activity is not phase-locked across tokens of the event. This means that it is possible for the peaks of waves from one token of the event to line up with the troughs of waves from a second token of the event. Evoked and induced activity require different analysis techniques (because of the potential for destructive interference with induced activity). Evoked and induced activity may also represent distinct neurophysiological events, though their precise interpretation is an active area of research.
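The distinction between evoked and induced activity can be illustrated with a small simulation: only activity whose phase is constant across tokens of an event survives averaging over trials. The code below uses synthetic signals and invented parameters; it is a toy illustration, not an analysis of real EEG or MEG data.

    # Averaging time-locked epochs isolates the evoked (phase-locked) component;
    # the induced (random-phase) component and the noise largely cancel out.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(0, 0.6, 0.001)                   # 600 ms sampled at 1 kHz
    epochs = []
    for _ in range(100):                           # 100 tokens of the event
        evoked  = np.sin(2 * np.pi * 10 * t)       # same phase on every token
        induced = np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi))
        noise   = rng.normal(0, 1, t.size)
        epochs.append(evoked + induced + noise)

    erp = np.mean(epochs, axis=0)                  # ERP-like average waveform
    # The average closely tracks the evoked component (correlation near 1).
    print(np.corrcoef(erp, np.sin(2 * np.pi * 10 * t))[0, 1])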
The vast majority of research using M/EEG in neurolinguistics has focused on evoked activity in the form of event-related potentials (ERPs), which are simply changes in the amplitude of electrical potentials over time in evoked EEG activity, and event-related fields (ERFs), which are simply changes in the amplitude of magnetic fields over time in evoked MEG activity (see Luck for a complete introduction to the ERP technique, and see Cohen for a complete introduction to analysis techniques for induced activity). There have been some recent explorations of induced activity in the M/EEG literature, but that area is still relatively new inside neurolinguistics, so we won’t review it here (but see Bastiaansen et al. for an excellent overview of that literature to date).

The majority of the EEG literature in neurolinguistics has focused on ERPs. Though many ERPs have been identified in the broader EEG literature, the sentence-level neurolinguistics literature tends to focus on four ERPs: the early left anterior negativity (ELAN), the left anterior negativity (LAN), the N400, and the P600. The ERP literature typically describes two facets of ERPs when defining them: their eliciting conditions, and their functional interpretation. Both facets are potentially useful for linking the EEG literature to the grammatical literature. The eliciting conditions can be used to link grammatical theories and ERPs at the level of specific phenomena. For example, the grammatical theory may predict a certain kind of violation in a sentence, and thus predict a specific type of ERP at a specific location in the sentence. The functional interpretation can potentially license the kind of logic we saw in the previous section on reading times. The functional interpretation can indicate what aspect of sentence processing the ERP indexes, and then one can ask whether known sentence processing strategies predict that those aspects of sentence processing should be engaged at the relevant positions in the sentence. As mentioned above, there aren’t many great examples of this being used to connect with the grammatical literature, but it is possible in theory. So here we will briefly review the eliciting conditions and potential functional interpretation of the four sentence-level ERPs that readers are most likely to encounter in the literature: ELAN, LAN, N400, and P600.

As the name suggests, the ELAN (early left anterior negativity) is a negative-going deflection that peaks in a relatively early processing window (–ms post-stimulus onset) and is greatest over left anterior electrode sites. The ELAN was first reported by Neville et al. () to a specific phrase structure violation in which a preposition appears in an ungrammatical position, as in (b) (note that the critical position may contain a number of categories, but not a preposition):

()  a. The boys heard Joe’s stories about Africa.
    b. *The boys heard Joe’s about stories Africa.
A similar effect was reported by Friederici et al. () in German, in this case when a participle appears in an ungrammatical position:

()  *Das  Baby  wurde  im      gefüttert
     the   baby  was    in-the  fed
     ‘*The baby was in the fed’
The ELAN has since been elicited to very similar phrase structure violations in Spanish (Hinojosa et al. ) and French (Isel et al. ), and further replicated in English (Lau et al. , Dikker et al. ) and German (e.g., Hahne and Friederici , Hahne and Friederici , Rossi et al. ). The ELAN is not affected by task (Hahne and Friederici ) or by the probability of the violation in the experiment (Hahne and Friederici ), and it is not elicited by the less frequent resolution of a syntactic category ambiguity (Ainsworth-Darnell et al. , Friederici et al. ). These results suggest that the ELAN is a very specific response to phrase structure violations, and not simply a response to difficult or unlikely structures. The functional interpretation of the ELAN is an area of much active debate. Here are four proposals that exist in the literature: (i) Friederici (), among others, interprets the ELAN as a marker of syntactic-category-based violations; (ii) Lau et al. () interpret the ELAN as a marker of failed syntactic predictions more generally; (iii) Dikker et al. () interpret the ELAN as indexing processing in the sensory cortices that occurs prior to lexical access; and (iv) Steinhauer and Drury () argue that the ELAN is an artifact of specific data analysis properties of ELAN-generating experimental paradigms.

The LAN (left anterior negativity) is a negative-going deflection that is generally largest over left-anterior electrode sites (similar to the ELAN), and tends to occur in the –ms time window (later than the ELAN). The LAN has been elicited by a broad array of (morpho-)syntactic violations, such as agreement violations (Coulson et al. , Gunter et al. , Münte et al. , Kaan , Osterhout and Mobley ), case violations (Münte and Heinze ), phrase structure violations (Friederici et al. , Hagoort et al. ), island constraint violations (Kluender and Kutas b), and even garden-path sentences (Kaan and Swaab ). The LAN has also been elicited during the processing of long-distance dependencies such as wh-movement, at both the filler and the gap location (Kluender and Kutas a, Phillips et al. ).

The functional interpretation of the LAN is even less clear than that of the ELAN. One issue is that the LAN results are often relatively fragile, and do not always replicate from study to study. Another issue is that the LAN arises for very different phenomena (e.g., morphosyntactic agreement and dependency processing), suggesting either a high-level interpretation that links these two phenomena, or two distinct sources for the LAN. A final issue is that LAN effects often co-occur with P600 effects, raising the possibility that the LAN effect is really the result of averaging N400s and P600s from different sub-populations, with the two canceling out in the LAN time window, except for where their distributions fail to overlap (i.e., if the N400 is bilateral, and the P600 is
right-lateralized, the combination may yield a left negativity; see Tanner and van Hell , Tanner ).

The N400 is a negative-going deflection that is generally largest over centro-parietal electrode sites, and tends to occur –ms post-stimulus onset (with a peak amplitude occurring at 400 ms, hence the name). The N400 was first observed by Kutas and Hillyard () when they presented participants with sentences that ended with unexpected words. They compared baseline sentences with semantically congruent endings (a) to sentences with semantically incongruent endings (b) and sentences with endings that were incongruent due to the physical properties of the stimulus, such as words written in all capital letters (c):
()  a. I spread the warm bread with butter.
    b. I spread the warm bread with socks.
    c. I spread the warm bread with BUTTER.
Kutas and Hillyard () observed a larger N400 for (b) compared to (a), and a larger P300 (also known as a P3b) to (c) compared to (a). This qualitative difference in the responses to (b) versus (c) suggests that the N400 is specifically related to semantic processes rather than general error detection. In the decades since its discovery, the N400 has been elicited by a broad array of linguistic and non-linguistic stimuli, with the common pattern being that they are all meaningful in some way: spoken words, written words, signed words, pseudowords, acronyms, environmental sounds, faces, and gestures (for a review see Kutas et al. ). There are two leading functional interpretations of the N400: Hagoort (), among others, interprets the N400 as an index of the increased difficulty of integrating incongruent words into the preceding context, while Kutas and Federmeier (), among others, interpret the N400 as an index of processes related to the activation of semantic features in the lexicon (or semantic memory).

The P600 (alternatively the ‘syntactic positive shift’) is a positive-going deflection that is generally largest over centro-parietal electrode sites and tends to occur –ms post-stimulus onset (although there is a good deal of variability in its latency in the ERP literature). Like the LAN, the P600 has been reported for a broad array of syntactic violations, in many cases co-occurring with a preceding LAN. For example, P600s have been elicited to phrase structure violations (Hagoort et al. , Friederici et al. , Hahne and Friederici , Friederici and Frisch , Osterhout and Holcomb ), agreement violations (Hagoort et al. , Kaan ), syntactic garden-paths (Friederici et al. , Kaan and Swaab , Osterhout et al. ), and island violations (McKinnon and Osterhout ). The sheer number of violation types that elicit a P600 has led some researchers to suggest that the P600 may be a (slightly delayed) version of the P300 (or P3b), which is a general response to unexpected stimuli (Coulson et al. ; see Osterhout and Hagoort for a response). P600s have also been elicited by the processing of grammatical sentences with particularly complex syntactic properties, such as ambiguous structures (Frisch et al. ) and wh-movement (Fiebach et al.
, Kaan et al. , Phillips et al. ). Recent research has even found P600s to sentences that appear to contain one very specific type of semantic violation (Kim and Osterhout , Kuperberg et al. , van Herten et al. , Kuperberg , Bornkessel-Schlesewsky and Schlesewsky , Stroud and Phillips ). As for functional interpretation, Friederici (), among others, interprets the P600 as indexing syntactic revision during a stage of processing requiring the integration of syntactic and semantic information, whereas Hagoort (), among others, interprets the P600 as indexing the difficulty of unifying syntactic and semantic information (a subtly different interpretation from the revision approach).

MEG data can, in principle, be used the same way as EEG data, yielding ERFs to various grammatical violations. And much of the early MEG literature has that profile. However, because MEG lends itself to better spatial resolution of the source of activity than EEG, the current trend in the MEG literature is to focus on the localization of event-related activity to specific areas of the brain. For example, using a series of experiments that investigate activity to two-word units that undergo semantic composition such as ‘red boat’, Bemis and Pylkkänen () found activation in the left anterior temporal lobe and the ventro-medial pre-frontal cortex (with much subsequent work directed at determining to what extent this activation reflects purely semantic composition versus other types of conceptual combinatorics). Similarly, using a naturalistic story-listening task, Brennan and Pylkkänen () found that activation in the left anterior temporal lobe correlates with the number of parse steps in a predictive left-corner (syntactic) parser. Studies such as these suggest a potential future in which MEG is used to localize the fundamental grammatical operations that are postulated by grammatical theories, at least those that translate into distinct operations at the level of sentence processing.
. Haemodynamic responses: fMRI
..................................................................................................................................

One potentially useful property of the human circulatory system is the fact that depletion of local energy stores due to neural firing in the cortex triggers a haemodynamic response wherein oxygenated blood is sent to that area to replenish those stores. Functional magnetic resonance imaging (fMRI) leverages this property of the circulatory system to localize cognitive function. Though the details are quite complex, the basic idea is as follows. Oxygenated and deoxygenated hemoglobin have different magnetic properties. The fMRI method can detect these differences (the signal is called the BOLD signal: blood oxygen-level dependent). By comparing two conditions, one with a specific cognitive process and another without, it is possible to use the BOLD signal to localize the specific area(s) of the brain that depleted local energy stores due to that cognitive process.
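The subtraction logic can be sketched as follows. The ‘BOLD responses’ here are invented random numbers with an effect added to a handful of voxels, and the analysis is reduced to a bare voxel-wise contrast; a real fMRI study involves many preprocessing and modelling steps that are omitted here.

    # Compare BOLD responses for a condition that engages the process of interest
    # against a control condition, voxel by voxel, across participants.
    import numpy as np

    rng = np.random.default_rng(1)
    n_participants, n_voxels = 20, 5000
    bold_condition = rng.normal(0.0, 1.0, (n_participants, n_voxels))
    bold_control   = rng.normal(0.0, 1.0, (n_participants, n_voxels))
    bold_condition[:, :50] += 0.8      # pretend 50 voxels respond to the process

    contrast = bold_condition - bold_control          # per-participant subtraction
    t_map = contrast.mean(axis=0) / (contrast.std(axis=0, ddof=1) / np.sqrt(n_participants))
    print((t_map > 3.0).sum())         # crude, uncorrected count of 'active' voxels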
The fMRI method has better spatial resolution, and requires fewer controversial assumptions, than source-localization using MEG (and EEG). However, fMRI also has worse temporal resolution (because blood travels much slower than electricity and magnetic fields). As such, fMRI can have difficulty isolating the differential effects of sequences of processes; but if the researcher can isolate the relevant process in an experimental design, fMRI is unparalleled among current non-invasive technologies for spatial localization.

The localization of cognitive processes is not typically a component of grammatical theories. But as briefly discussed in the previous section, the localization of grammatical operations (by way of sentence processing theories) is a natural extension of grammatical theories as theories of cognition, and can help to integrate linguistics with the rest of cognitive neuroscience. To that end, in this section we will briefly review research that has sought to identify brain areas that underlie syntactic processing using fMRI. There are two areas that have been the focus of most neuroimaging research on syntax: Broca’s area and the anterior temporal lobe.

Broca’s area is probably the most famous brain region to be correlated with structural properties of sentences. The term Broca’s area usually refers to a portion of the left inferior frontal gyrus (LIFG) composed of the more anterior pars triangularis (Brodmann area 45) and the more posterior pars opercularis (Brodmann area 44). Paul Broca originally identified this area as central to speech processing based on the post-mortem inspection of the brains of two patients that exhibited severe aphasia: one patient could only produce the word ‘tan’, the other only a handful of basic words. With the advent of non-invasive neuroimaging techniques such as fMRI, Broca’s area has taken centre stage in the investigation of the neural substrates of syntactic processing. At least two thirds of the neuroimaging studies of the brain areas involved in sentence processing (in health) over the past fifteen years reveal an increased activation in (at least part of) Broca’s area for at least one of the reported contrasts, suggesting that this area indeed plays a significant role in some aspects of sentence processing.

Although there has been great debate about the key property that modulates activity in Broca’s area in sentence processing, perhaps the most theory-neutral description of the central data is that Broca’s area tends to respond more to sentences with non-canonical word order than sentences with canonical word order. For example, relative to controls with canonical word order, Broca’s area shows increased BOLD signal for relative clauses (e.g., Just et al. , Ben-Shachar et al. ), wh-movement (e.g., Ben-Shachar et al. , Santi and Grodzinsky ), topicalization (e.g., Ben-Shachar et al. ), clefting (e.g., Caplan et al. ), and scrambling (e.g., Friederici et al. , Bornkessel et al. , Bornkessel-Schlesewsky et al. ). The question then is which cognitive processes these syntactic phenomena have in common. There is quite a bit of debate in the literature about this. For example, Grodzinsky and colleagues have argued that Broca’s area seems to be more active for non-canonical word orders because
Broca’s area supports the syntactic mechanism of movement that is familiar from generative syntactic theory (Grodzinsky ; see Grodzinsky and Santi for a recent review). Bornkessel-Schlesewsky and colleagues have proposed that the same effects can be explained by assuming a parsing stage in which the argument relations of the sentence are computed according to several prominence hierarchies that are familiar from typological research (e.g., the animacy hierarchy, the case hierarchy, the definiteness hierarchy; Comrie , Bornkessel and Schlesewsky , Wolff et al. ). This parsing stage would require a ‘linearization’ process that maps word order to argument structure according to these prominence hierarchies. For Bornkessel-Schlesewsky and colleagues, Broca’s area supports this linearization process (Grewe et al. , Chen et al. , Bornkessel-Schlesewsky et al. ). In principle, if one assumes a strong relationship between cognitive theories and neurobiology (such as the strong reductionism examined critically in Fodor ), it is possible that the resolution of this debate about the functional interpretation of these effects in Broca’s area could be used as evidence to adjudicate between these competing theories of syntax.

Although Broca’s area has been the focus of many neuroimaging studies of syntax, there is a growing literature implicating portions of the temporal lobe in syntactic processing. One of the most robust neuroimaging findings about sentence-level processing is that lateral anterior portions of the superior and middle temporal cortex show greater activation bilaterally for reading or listening to sentences than word lists (Mazoyer et al. , Stowe et al. , Friederici et al. , Vandenberghe et al. , Humphries et al. , , and Brennan and Pylkkänen , mentioned in the previous section). Furthermore, lesion mapping has associated damage to the left lateral anterior temporal lobe with comprehension impairment for most sentences more complex than simple declaratives (Dronkers et al. , although cf. Kho et al. ). These findings suggest that anterior portions of the temporal lobe support sentence-level computations that do not rely on lexical semantics, but this leaves open a number of possible candidate processes: syntactic processes, argument structure processes, discourse processes, and even prosodic processes. If there were a brain region dedicated to basic syntactic phrase structure computation in comprehension, one would expect it to show a profile similar to that of the anterior temporal lobe, showing more activity for processing word strings with syntactic structure than those without. However, demonstrating that this area is specifically involved in syntax as opposed to other phrase-level computations has proved challenging (but see Brennan et al. , Brennan and Pylkkänen , and Rogalsky and Hickok for interesting attempts to distinguish syntax and semantics).

As our confidence in the functional localization of the brain increases, it may become possible to use that information to test and refine grammatical theories. If a grammatical theory (as suitably integrated into a theory of sentence processing) predicts that a specific process should be deployed in a sentence, a definitive theory of functional localization in the brain would allow us to use methods such as MEG and fMRI to test whether that process is indeed deployed. Though we are still far from a definitive theory of functional localization in the brain, studies such as the ones
reviewed in this section begin to demonstrate the value of integrating neuroscience and linguistics for both fields.
. Conclusion
.................................................................................................................................. In principle, any data type that is related to language behaviour is a potential source of information for grammatical theories. In practice, acceptability judgements form the majority of data used to construct grammatical theories, primarily due to their ability to reveal causal relationships, and the assumption that grammatical properties are one of several factors that directly impact acceptability judgements. Because of the prevalence of acceptability judgements in the literature, the past two decades have seen a number of advancements in the use of formal experimental methods (including factorial designs) for the collection of acceptability judgements. However, for many linguists, the future of grammatical theory lies in integrating grammatical theories with theories of sentence processing, both at a behavioural level and at a neurophysiological level. One potential step toward this integration is an exploration of the data types that are used to construct theories of sentence processing (e.g., reading times, M/EEG data, and fMRI data). Though the relationship between these data types and grammatical theories is less direct than that of acceptability judgements, we hope that the discussion in this chapter makes it clear that such an integration is a worthy goal for twenty-first-century linguistics.
......................................................................................................................
......................................................................................................................
. Introduction
..................................................................................................................................

This chapter explores the potential of natural language corpora, databases of text samples, for grammatical research. These samples may be entire texts or excerpts, and may be composed of written text, transcribed speech, handwriting, or even sign language. Texts are sampled using a set of criteria termed the sampling frame, which specifies the genres, contexts, meta-data, and participants sampled. These ‘texts’ contain more than words and punctuation. A plain text corpus with minimum annotation identifies sentences and, where relevant, speaker turns; it may mark other phenomena, such as headlines, pauses, and overlapping speech.

Grammatical research requires either grammatically annotated corpora or ‘on-the-fly’ grammatical interpretation of plain text corpora. In practice, due to the availability of effective algorithms, most corpora are tagged so that every word is grammatically categorized by its word class. In corpus linguistics, tagged corpora predominate. Such corpora can be exceedingly large (hundreds of millions of words upwards), or may be drawn from specialized data sources. Tagged corpora are an improvement on plain text, but they are rather limited. Grammar is fundamentally structural (see Chapter ). Word and word class sequences are meaningful in the context of structural grammatical relationships. It follows that corpora with the greatest potential benefit for grammarians are likely to be those where every sentence has been given a full grammatical tree analysis.

Creating this kind of parsed corpus (also known as a ‘treebank’) is difficult and time-consuming, and consequently resources tend to be smaller: typically, a million words or so. Opting to parse a corpus means selecting a particular framework and applying it consistently across diverse naturally-occurring data. Which scheme should we choose?
Once selected, are future researchers constrained by our parsing decisions? How do we apply the scheme consistently? We discuss these questions in this chapter. Corpora have been used to study, inter alia, semantics, syntax, morphology, and pragmatics. The distinction between applying categorical labels (‘tagging’) and specifying structural relationships (‘parsing’) is extensible to other linguistic levels. Indeed, a richly annotated corpus might allow frameworks to be related. For example, pragmatic analysis might apply speech act categories (such as ‘request’ or ‘assertion’) to main clauses independently categorized by structural type (such as ‘interrogative’ or ‘declarative’: see Chapter ). Researchers have approached natural language text data at multiple levels, in different ways, with various software tools. This combination of purposes, approaches, and tools falls within the scope of the methodology of corpus linguistics. This chapter does not attempt a complete review of tools, or to enumerate the broad range of fields of enquiry, from stylistics to social geography and pedagogy, to which corpus linguistics methods have been applied. For such a review, see McEnery and Hardie (), O’Keefe and McCarthy (), or Biber and Reppen (). This chapter has a particular focus: which corpus linguistic methods are likely to be of the greatest benefit for the study of grammar? Corpus linguistics has grown in popularity over the years, but not all linguists agree about the relevance of corpus data for their subject. Some, like John Sinclair (), argue that all linguistic knowledge resides in the text. Others, notably Noam Chomsky, have argued (see e.g., Aarts ) that corpus data represents at most a collection of performances and epiphenomena, the study of which tells us little about the internal linguistic processes that give rise to grammar. Corpus linguistic data has important strengths and weaknesses for the grammarian. Although corpora can be drawn from a wide range of sources, most corpora are constructed from daily life rather than artificial contexts, and responses are not cued artificially by a researcher. Corpus data is raw primary data, unselected by linguistic introspection. The task of the corpus linguist is to use linguistic insight to interpret this data after it has been collected. In this chapter, I argue that the optimum position for grammatical researchers is at the intersection of linguistic theory and corpus data analysis. This means working with grammatically annotated corpora, while recognizing that the annotation is based on a necessarily partial knowledge of grammar. This chapter is organized as follows. Section . considers what a corpus could potentially tell us about language—the classes of evidence that corpus linguistics can offer a grammatical researcher. Armed with these distinctions, the next section returns to the Chomsky–Sinclair dichotomy outlined above. In section . the proposed solution is further subdivided into levels of knowledge and process. This set of distinctions allows us to relate different grammatical research programmes with corpora, including top-down corpus parsing, cyclic treebank exploration, and bottom-up clustering. To conclude, I discuss how corpus research raises practical problems of experimental design and analysis.
. What can a corpus tell us?
.................................................................................................................................. Corpus linguistics arrived late to the grammar party. Corpus linguists tend to date the advent of their field with the compilation of the Standard Corpus of Present-Day Edited American English (popularly known as the ‘Brown Corpus’; Kučera and Francis ). This was followed by the Survey of English Usage Corpus of spoken and written British English (the ‘Quirk Corpus’), begun on paper in , the spoken part of which was published electronically in as the London-Lund Corpus. What made these corpora different from simple collections of texts was a focus on scale and sampling. Firstly, these corpora were substantial collections of naturally occurring text samples. They were of the order of a million words—small by today’s standards, but larger than contemporaneous resources. Secondly, they were consciously collected with the aim of creating a representative sample of the relevant English language variety. Generalizations from such a sample might be said to be ‘representative of the language’, or more precisely, a well-defined subset of it. The novel contribution of corpora lay not in their being a source of natural-language examples. Indeed, linguists had long drawn insight from, and accounted for, real-world examples. Thus in , Otto Jespersen wrote ‘I have tried . . . to go to the sources themselves, and have taken as few facts and as few theories as possible at second hand’ (Jespersen –, I: VI). Pre-corpus sources tended to be limited to the library of a particular grammarian, the Bible, or the works of Chaucer or Shakespeare. Moreover, from a grammatical perspective, representativeness has one major drawback. Test cases that expose theoretical distinctions between analyses may be too infrequent to be found in a typical corpus. Constructing artificial conditions to elicit examples from speakers may be a more effective approach (see Chapter ). The benefits of a corpus for grammatical theory lie elsewhere. There are three types of evidence that may be obtained from a corpus (Wallis ; see Figure .), which can apply to many different linguistic phenomena. These phenomena, instances of which we will term a ‘linguistic event’, x, might be (for example) an individual lexeme, a group of words, a prosodic pattern, a grammatical construction, a speech act, or any configuration of these, such as a question speech act employing a rising tone. The three types of evidence are factual evidence (event x took place, written ‘Exists(x)’), frequency evidence (x occurs ‘f(x)’ times), and interaction evidence (x tends (not) to co-occur with another event y). Factual evidence is simply evidence that an event occurred in the corpus, i.e., it was expressed at some point in the past. One application of this is the identification of novel events not predicted by a framework, discussed in .. below. Corpus linguistics is most strongly associated with frequency evidence. A single observed frequency, f(x), is an observation that a linguistic event appears in a corpus a certain number of times. More useful is a frequency distribution: a set of frequencies of
Figure . Three types of evidence: left, factual evidence, Exists(x): event x is found in a corpus; middle, frequency evidence, f(x): x is distributed in the corpus by frequency; and right, interaction evidence, p(x ∧ y): y co-occurs with x more/less often than chance would predict.
related events. Thus the first application of the Brown Corpus was a word frequency list and set of distributional analyses, Kučera and Francis (). Distributions allow researchers to compare related frequencies. For example: the word pretty is conventionally considered first as an adjective in dictionaries.1 However, Nelson et al. (: ) report that in the British Component of the International Corpus of English (ICE-GB), pretty is found per cent of the time ( times out of ) as an adverb.2 Another type of frequency distribution might reveal variation in the frequency or rate of the same phenomenon over a sociolinguistic contrast, such as genre, speaker gender, or time (see e.g., Chapter ). The third type of evidence is interaction evidence. These are observations that two events tend (or tend not) to be found together, sometimes referred to as ‘association’, ‘attraction’, or ‘co-location’ statistics. This evidence is interesting because if two apparently independent events coincide more frequently than expected, there may be a deeper connection between them. It is also possible to find evidence of negative association or ‘horror aequi’ (Rohdenburg : ). We discuss interaction evidence in ...
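The three types of evidence can be made concrete with a small sketch. The toy ‘corpus’ below, the tag labels, and the helper logic are all invented for illustration; the counts are far too small to mean anything, and a real study would query a resource such as ICE-GB or the BNC.

    # Toy illustration of the three types of corpus evidence.
    from collections import Counter

    corpus = [("the", "ART"), ("work", "NOUN"), ("was", "VERB"), ("pretty", "ADV"),
              ("hard", "ADJ"), ("and", "CONJ"), ("the", "ART"), ("team", "NOUN"),
              ("looked", "VERB"), ("askance", "ADV"), ("at", "PREP"), ("it", "PRON")]

    # 1. Factual evidence: Exists(x) -- does the event occur at all?
    exists_askance = any(w == "askance" for w, _ in corpus)

    # 2. Frequency evidence: f(x) -- a frequency distribution over word classes.
    pretty_tags = Counter(tag for w, tag in corpus if w == "pretty")

    # 3. Interaction evidence: does y follow x more often than chance predicts?
    n = len(corpus)
    p_look = sum(w.startswith("look") for w, _ in corpus) / n
    p_askance = sum(w == "askance" for w, _ in corpus) / n
    p_joint = sum(corpus[i][0].startswith("look") and corpus[i + 1][0] == "askance"
                  for i in range(n - 1)) / (n - 1)
    # Compare observed co-occurrence with the expected product p(x) p(y).
    print(exists_askance, pretty_tags, p_joint, p_look * p_askance)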
.. Factual evidence and the validation of frameworks

When corpora are constructed, one of the first tasks of the compilers is to choose a framework and completely annotate the corpus with it. In a tagged corpus, every word must be classified, including novel words, i.e. words that have not previously been given a word class. Thus if we applied a scheme developed with a standard language source to a corpus of regional dialect speech we might find words with no agreed classification, which a human linguist would need to determine. The same principle extends to parsing or any other framework.

1 In the OED, pretty (adjective) has between seven and eight times the space afforded for the description of the adverb form (OED, : VIII –).
2 Since corpora are of different sizes, many corpus linguists cite frequencies ‘normalized’ by scaling per thousand or million words before reporting. However, the number of words per text is often not the optimum basis for comparison. A better approach is to pick a meaningful baseline. See section ..
A corpus can be used to validate a framework through a process of identifying ‘gaps’ and encouraging reappraisal of the scheme. First, the corpus is subjected to an annotation process, e.g. the texts are tagged and parsed. Second, a search process is applied to the entire corpus to identify unannotated cases. Let us take a real example. Quirk et al.’s (: ) transitivity framework describes the following complementation patterns:

TRANSITIVE VERBS →   MONOTRANSITIVE VERBS occur in type SVO
                     DITRANSITIVE VERBS occur in type SVOO
                     COMPLEX TRANSITIVE VERBS occur in type SVOC and SVOA
Quirk et al. offer My mother enjoys parties (where parties is a direct object) as a monotransitive pattern. The vast majority of SVO patterns (, in ICE-GB) are of this type. Yet in parsing ICE-GB, the compilers found some clauses where the single object was an indirect object, like she told me (cf. ditransitive she told me a story, with indirect object + direct object). They had two options: • Create a new category. The ICE-GB team decided to create a new ‘dimonotransitive’ type for these patterns. The distinction between direct and indirect objects was considered sufficiently important to be recorded as a transitivity feature. • Modify an existing category to include the novel pattern. A literal reading of Quirk et al. might treat ‘monotransitive’ as encompassing all single-object complementation patterns: she told me is monotransitive. Alternatively, perhaps she told me is considered to have an ellipted direct object, in which case we might argue it is ditransitive. Irrespective of which approach is taken, the distinction is not recorded as a transitivity feature. The process of incorporating novel events into a framework is not merely one of applying a label, but one of grammatical argumentation (see Chapter ). Contrast this process with the ad hoc classification of novel examples without a corpus. The non-corpus linguist, initially at least, finds a single unexplained example. However, it can be difficult to determine a single differentiating factor from just one case, or to decide if the instance is simply anomalous. Obtaining further attested examples of infrequent phenomena requires exhaustive manual reading. It is commonly said that a corpus cannot tell us what is impossible in a language. Corpora do tell us what is possible—occasionally, contrary to our expectations. Validating frameworks against data makes them empirically more robust, so the probability of finding new novel events in the future tends to decline.
.. Interaction evidence

Interaction evidence is rarely discussed in corpus linguistics textbooks. It is empirical evidence that theoretically independent events actually tend (or tend not) to co-occur.
This evidence is relevant to grammarians at both abstraction and analysis levels (see .). Interaction evidence may be associative (bi-directional) or directional. The simplest type is associative. Consider two events, x and y, assumed to arise independently, each with probabilities p(x) and p(y). The expected joint probability (the chance that they will co-occur independently) is simply the product, p(x) p(y). We measure the observed rate of co-occurrence, p(x ∧ y), and compare figures with a statistical test. The result is a statistical correlation that can occur for a variety of reasons. This principle is used in many computer algorithms, from part of speech tagging and probabilistic parsing to collocation analysis. It can also be used in interaction experiments (see section ..).

Collocation analysis (see also section .. below) finds word pairs tending to co-occur (co-locate) next to each other. In the British National Corpus (BNC), askance tends to be immediately preceded by the lemma look. The chance of a word being look, looks, looked, or looking jumps up dramatically if the next word is askance. This example is directional. The probability that askance is preceded by look is around per cent in the BNC (look askance is idiomatic). This is far greater than the probability that look is followed by askance, which is around . per cent (look is much more likely followed by up, forward, etc).3

Directionality is taken further in word class tagging algorithms. These use transition probabilities, p(y | x), ‘the probability of y given x’. A training algorithm analyses a previously tagged corpus and creates a database, which is then used by a tagging algorithm to tag new texts. Let us take a simple example. In ICE-GB the word work appears times: times as a verb and times ( per cent) as a noun. This is a frequency distribution. However, if work is immediately preceded by an article, the chance of work being a noun jumps to per cent. (In all cases of article + work in ICE-GB, work is a noun.) We can collect an interaction distribution of transition probabilities from the corpus, and use it to make tagging decisions in the future, thus:
article        the work           out of    ( per cent)
preposition    to work            out of    ( per cent)
adjective      recent work        out of    ( per cent)
adverb         presently work     out of    ( per cent)
...
Note how adverbs predict the opposite outcome, i.e. twenty-six times out of thirty, work is a verb.
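Tables of this kind can be built automatically from any tagged corpus. The sketch below estimates the overall probability that work is a noun and the probability that it is a noun when the preceding word is an article; the toy corpus and tag labels are invented, so the proportions are merely illustrative.

    # Transition-probability (interaction) evidence from a toy tagged corpus.
    from collections import Counter

    tagged = [
        ("the", "ART"), ("work", "NOUN"), ("was", "VERB"), ("hard", "ADJ"),
        ("they", "PRON"), ("work", "VERB"), ("at", "PREP"), ("the", "ART"),
        ("office", "NOUN"), ("we", "PRON"), ("presently", "ADV"), ("work", "VERB"),
    ]

    work_tags = Counter(tag for w, tag in tagged if w == "work")
    p_noun = work_tags["NOUN"] / sum(work_tags.values())        # baseline p(noun)

    after_article = [tagged[i + 1][1] for i in range(len(tagged) - 1)
                     if tagged[i][1] == "ART" and tagged[i + 1][0] == "work"]
    p_noun_given_article = after_article.count("NOUN") / len(after_article)

    print(p_noun, p_noun_given_article)   # the conditional probability is higher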
3 Despite the highly directional evidence, the bi-directional ‘mutual information’ score places askance first in collocates of look. See also https://corplingstats.wordpress.com////direction.
Events x and y need not be adjacent for interaction evidence to be derived, merely found in the same text. The exact relationship depends on the research question. The same principle can be used to study grammatical priming using a corpus (Gries ). In this model, events x and y are instances of the same grammatical structure (Speaker A says x, and ‘primes’ Speaker B to say y later on), potentially separated by many utterances. Patterns of interaction may arise for several reasons. Some co-occurrence tendencies are due to a grammatical relationship, such as the known relationships between articles, adjectives, and nouns illustrated above. Others may be due to psycholinguistic processes of attention and memory operating on grammatical rules (Wallis ). However, interaction can also arise for more trivial reasons: a text concerns a particular topic (Church ), or patterns reflect semantic associations and idioms. What at first sight appears to be grammatical ‘priming’ may be mere lexical repetition. Detecting patterns is only a starting point. We may need multiple experiments to distinguish plausible causes. Frequency and interaction evidence are open to statistical inference. If the corpus is representative of the language from which it is drawn, and instances are drawn from many participants and texts, it becomes possible to argue that detected preferences are not due to an individual writer or speaker but are typical of the language community. We can make sound statistical claims about the population of sentences from which the corpus is drawn.
. Approaches to corpus data
.................................................................................................................................. Commonly, a contrast is drawn between ‘corpus-based’ and ‘corpus-driven’ linguists (Tognini-Bonelli ). A corpus-based linguist is one whose research is primarily theoretical, who uses a corpus for exemplification and hypothesis-testing. For them, a corpus is a source of knowledge about grammar that must be interpreted by theory. While widely used, the term ‘corpus-based’ is far too general. It includes any linguist who uses a corpus but does not claim to rely on corpus data exclusively, i.e. a corpus linguist who is not ‘corpus-driven’. This seems unsatisfactory. There are many possible approaches to research, ranging from an extreme theory-driven approach that avoids corpus data (typified by Chomsky) to an extreme corpus-driven one that sees all theory as inevitably incorrect (typified by Sinclair). Instead of ‘corpus-based’ versus ‘corpus-driven’, it is clearer to say there is a continuum of corpus research perspectives between ‘theory-driven’ and ‘corpus-driven’ poles.
.. Corpus-driven linguistics

‘Corpus-driven’ linguists argue against what they perceive as a necessarily selective approach to the corpus. John Sinclair () objected that the grammatical tradition of
top-down research (with or without a corpus) resulted in a plurality of grammatical frameworks with no agreed way to select between them. Corpus-based linguists are vulnerable to research bias. They discover what they expect to find and explain away counterevidence as ‘performance errors’. ‘Research’ is reduced to categorizing data under a pre-existing theory, rather than an attempt to critically engage with theories. Consider the ‘dimonotransitive’ verb category we discussed earlier. The compilers of ICE-GB did not overturn the category of transitivity, but extended it to account for problematic examples. Sinclair’s point is that corpus-based researchers tend to ‘patch’ their framework rather than reconstruct it from first principles—and risk reappraisal of their previous research. Corpus-driven linguists adopt a different starting point. Research should start from the plain text, and researchers should derive theoretical generalizations from the corpus itself. Let us make a minimum set of assumptions and see where this takes us. We can harness computational power to process many millions of words and identify new generalizations.4 Sinclair’s achievement was the construction of the proprietary Bank of EnglishTM corpus of over million words, and a grammatical framework (published as the Collins COBUILD English Language Dictionary () and Collins COBUILD English Grammar ()) that his team reportedly compiled ‘bottom-up’ from the corpus. However, the COBUILD project raises an obvious objection. If it is possible to obtain a grammatical framework from a corpus, how should we refine or extend this grammar? Are we obliged to start again (as true corpus-driven linguists), or may we take this work as a starting point, thereby incorporating generalizations found in the first stage as theoretical assumptions for the next? But if we adopt the latter course, are we not becoming corpus-based?
.. Theory-driven linguistics

Probably the most famous example of a theory-driven position is found in Noam Chomsky’s comments on corpus linguistics (Aarts , Chomsky ). His explanation of corpus evidence as ultimately constituting evidence of performance, rather than indirect evidence of some kind of internalized language faculty, might appear to place him outside of corpus linguistics altogether.5 Linguistics is then principally an exercise in deduction (see Chapter ). Nonetheless, many linguists strongly influenced by Chomsky’s theories have engaged with corpus evidence (e.g., Wasow ). Instead of starting with a corpus and generalizing upwards, they have attempted to test hypotheses against corpus data.

The principal point of reference for a top-down, theory-driven researcher is their theoretical framework. Corpus data is interpreted by the theory, so contrary evidence
See http://corplingstats.wordpress.com////pos-tagging. See http://corplingstats.wordpress.com////why-chomsky.
could be interpreted as epiphenomena and exceptions, or even ignored.6 It is ultimately not possible to formally disprove a theory by this method. Yet a key goal of all science lies in attempting to improve theories. We have already seen how a corpus can identify phenomena unanticipated by a grammatical framework. Using a corpus, is it possible to identify and test the kinds of theoretical predictions that might cause us to choose between competing frameworks?
.. Transcending the dichotomy The resolution of the two positions—‘theory-driven’ and ‘corpus-driven’—starts with the following observation: neither extreme is necessary. Instead, we can agree that any current linguistic theory is almost certainly incorrect, but a theory is necessary to make progress—including identifying where it fails. In other sciences, theories are understood to be necessarily partial—indeed, potentially untrue and misleading—but also a necessary part of the scientific process (Putnam ). It is not possible to avoid ‘theory’: even compiling a word list requires us to define a ‘word’. If theories are unavoidable, we must state their assumptions. See also Chapter . All theories include auxiliary assumptions, i.e. assumptions outside of the theory proper that are necessary to obtain data. Thus, in astrophysics, one cannot directly measure the chemical composition of stars. Instead researchers collect spectrographs, where each light ‘spike’ matches the characteristic wavelength of a fluorescing atom. But this raw data is further distorted by Doppler shifts (due to the star moving relative to the viewer), bent by gravity, etc. Auxiliary assumptions are found in calculations to calibrate equipment and interpret measurements. How are observations ultimately interpreted? By comparing spectrographs from distant and local stars, and by relating observations to an overall theory of stellar decay: in short, by comparing expected and observed results and making sense of what remains. Moreover, a systematic difference between expectation and observation can yield indirect observations, i.e. observations that cannot be perceived directly but may be inferred from another observed effect. Famously, Pluto was first detected by perturbations (wobbles) in the orbit of Uranus.7 Naïve falsification (Lakatos ), where hypotheses are rejected on a single piece of counterevidence, is unusual in science (the exception being fields obliged to have a low tolerance for error, such as clinical trials). Counterevidence may even be revelatory. Rather, progress is often made by triangulation, that is, support for a position builds up when multiple independent approaches tend to converge on the same conclusion, and
6 Labov (: ) writes: ‘[I]n listening to everyday speech, we tend to hear only those linguistic features that have already been described, and it takes a major effort to hear the new variables that are being generated in the speech community.’
7 See www.discoveryofpluto.com for an introduction to this topic.
by competition between alternative theories to explain observations. In short, Lakatos () and Kuhn () argue, parallel research programmes co-exist and progress to a certain point, whereupon the dominant theory fails to explain new phenomena or is supplanted by a more effective theory. What does this mean for linguistics? Since neither theory nor data can be dispensed with, extreme top-down and bottom-up positions are untenable. Instead, linguists should adopt a position where corpus data and linguistic theory are engaged in a critical dialogue or dialectic: testing the theory against the corpus while enriching the corpus theoretically to make sense of data. Researchers must commit to a particular theoretical framework (albeit temporarily) to make progress. Finally, if all theories are necessarily partial, then we must transcend the corpus-driven/theory-driven dichotomy by a cyclic methodology.8 Knowledge ultimately derives from the attempt to explain data by theory. But researchers can work from the data up, or from the theory down, as the need arises.
. Tools
.................................................................................................................................. Modern corpus linguistics could not exist without computation, and a mini-industry of tools and algorithms, ‘toolkits’ and platforms, has grown up alongside corpora. Some tools help build and annotate corpora; others explore existing corpora. Some tools, such as automatic taggers and parsers, perform different tasks. Others, like automatic parsers and manual tree editors, perform comparable tasks with different methods. To make sense of diverse algorithms and approaches we need finer distinctions than ‘top-down’ and ‘bottom-up’. Wallis and Nelson () propose a 3A perspective in corpus linguistics. This identifies three processes that generalize from raw text to linguistic hypothesis: ‘annotation’, ‘abstraction’, and ‘analysis’. Figure . summarizes the idea. The 3A model is fundamentally cyclic. Each process is capable of upward and downward application. Thus ‘annotation’ is usually considered top-down (applying a framework to text), but, as we observed in .., detecting novel elements in a text may generate new terms in a scheme, bottom-up. The model also identifies three more general knowledge layers above the source text. These are the ‘corpus’ (text enriched with annotation); ‘dataset’ (example set extracted from the corpus); and ‘hypothesis space’ (set of hypotheses testable on the dataset). Abstraction (and its reverse, ‘concretization’) maps abstract terms to particular examples in the annotated corpus. Consider a linguist concerned with investigating verb phrase complexity. She uses a concept of a ‘complex verb phrase’, possibly graded by complexity. Her notion of complexity is theory-laden and is not represented in the annotation.
8 The idea that complex systems of knowledge undergo cyclic development is found in computer science (Boehm ) and other fields.
. The 3A perspective in corpus linguistics (after Wallis and Nelson ). [Diagram: the knowledge layers Text, Corpus, Dataset, and Hypotheses, linked by the processes of Annotation, Abstraction, and Analysis.]
However, she can translate it into queries (Wallis ; see also section ..) applicable to the annotated corpus. Abstraction performs a crucial task: mapping specific, concrete example structures, patterns or elements in observed sentences to general conceptual terms. Like annotation, these concepts form part of a systematic framework, but this framework belongs to the researcher, not the annotator. Usually researchers wish to extract examples of a set of linguistic events, rather than a single one. Multiple related queries are typically required, so abstraction should also collect data together in a dataset amenable to analysis, for example to determine if variables X and Y interact. The final stage, Analysis, is a process of evaluating this abstracted dataset for generalizations (‘hypotheses’ in the experimental paradigm).9 A single hypothesis, such as ‘spoken data contains a lower proportion of complex VPs than written data’, can be tested against the dataset, or multiple hypotheses may be evaluated together. Analysis can also be cyclic. The results of one experiment may trigger further refinements of the research design. The 3A model contains six process arcs—three up, three down—and four levels of linguistic knowledge (see Figure .). Different algorithms (taggers, collocation analysers, search tools, etc.) operate on one or more of these arcs.10 The same principle applies to manual processes under a linguist’s direct control, such as browsing examples and intuiting new concepts and queries.
9 To take a radically different example, an ‘analysis’ stage in an automated telephone answering service might classify input patterns and trigger a response.
10 To take an example that might at first sight appear contrary to this scheme, Pintzuk () describes using the University of Pennsylvania CorpusSearch tool search-and-replace function to modify
Conceiving of corpus linguistics in this way has two advantages. First, it allows us to integrate different tools in a single platform and identify tools performing parallel tasks. Second, it offers options to linguists who might otherwise fail to see a way forward. For example, analysis software may generate impressive visualizations, but if the linguist has no access to the underlying sentences they cannot verify patterns. Which set of sentences does a datapoint reflect? Are results genuine or an artefact? What follow-up experiments are necessary to further distinguish hypotheses? Only by returning to the corpus and the original sentences is it possible to find the underpinning of analytical results. Returning to the source data should not be a mere afterthought. The following sections exemplify this perspective. We discuss concordancing in section .., lexical search in .., parsing in .., software for exploring parsed corpora in .., and bottom-up exploratory methods in ...
.. Concordancing tools As corpora have grown, the need for specialized software has also grown. Although some early corpora, such as the Quirk Corpus, were built without computers, the benefits of computerization soon became obvious. Initially, corpus developers tended to use existing computer programmes, such as standard text editing software and databases, to construct early corpora. The first set of software tools developed specifically for corpus linguistics were concordancing tools. Popular current examples include AntConc (Anthony ) and WordSmith (Scott ). ‘Concordancing’ originated in Bible studies, where citations of words in context were deployed in theological discussions. Key Word In Context (KWIC) concordances display the results of a corpus search—a particular word, morpheme, part-of-speech tag, etc.—in their immediate source sentence context. A simple search for a word can generate a large set of results for a researcher to browse. Concordancing is a bottom-up exploratory technique par excellence. A simple search can uncover patterns of use ‘emerging’ from the text by exposing contrasts in a set of examples. Compare the concordance view in Figure . to the conventional ‘literary’ way we consider language in a narrative context where contrasts with other examples are unavailable. Figure . reveals that in the ICE-GB text ‘SA-’, school is often prefigured by boy’s, girl’s, local or boarding—a fact that is immediately apparent, but might not be guessed by simply reading the text.
the annotation in a corpus in order to make future searches (i.e. abstraction) more straightforward. Here CorpusSearch is used in two distinct ways, crucially, with distinct outputs. Annotation obtains a revised corpus, whereas abstraction obtains linguistic evidence.
. Example of a Key Word in Context concordance for the lexical item school in ICE-GB, showing adjacent word class labels
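The mechanics behind such a display are simple. The following minimal Python sketch is illustrative only: the three toy sentences and the node word are invented, and real concordancers read tokenized corpus files and offer far richer search options. It prints each hit of a search word centred in a fixed-width window of co-text, which is what exposes recurring neighbours such as the premodifiers of school noted above.

```python
# Minimal KWIC concordance sketch. The toy sentences stand in for a real
# corpus; a genuine study would read tokenized corpus texts instead.

def kwic(tokens, node, width=4):
    """Yield (left context, node word, right context) for every hit of node."""
    for i, token in enumerate(tokens):
        if token.lower() == node:
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            yield left, token, right

texts = [
    "She went to the local school before moving away",
    "The boarding school was closed for the summer",
    "After school we walked home past the church",
]

for text in texts:
    for left, hit, right in kwic(text.split(), "school"):
        # Right-align the left context so the node word lines up in a column.
        print(f"{left:>35} | {hit} | {right}")
```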
.. Lexical-grammatical search in tagged corpora To perform deeper research, we must exploit linguistic distinctions. In the introduction I suggested that a parsed corpus (or ‘treebank’) offered the greatest opportunities for grammatical research. The dominant movement in corpus linguistics has been to compile ever-larger wordclass-tagged corpora. The million-word Bank of English is a fraction of the commercial Collins Corpus of . billion words. Mark Davies has compiled and published several multi-million word corpora totalling . billion words and growing (see https://www.english-corpora.org). Although these corpora are not parsed, it is possible to perform searches for small phrases and clauses using sequences of word class tags and words. This method combines the benefits of large corpora with a ‘quasi-parsing’ approach using search strings and careful review. But we cannot guarantee to accurately find all examples. Thus, using the tagged Corpus of Historical American English (COHA), Bowie and Wallis () employed a search string to find examples of the to-infinitival perfect (as in to have forgotten) occurring as complement of a preceding governing verb (as in to have forgotten). The search pattern ‘, to, have, ’ finds cases like seems to have forgotten and [was] considered to have begun, but excludes others. For instance, noun phrases of varying length can appear between some governing verbs and the particle to, as in considers [the space race] to have begun. Without parsing, extensive manual analysis is the only way to reliably identify such examples: additional search strings can be devised to allow for varying numbers of intervening words, but these retrieve numerous ‘false positives’ (instances of irrelevant structures).
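This kind of ‘quasi-parsing’ can be approximated by matching a fixed sequence of words and tags over a tagged text. The sketch below is not the Bowie and Wallis query itself; it is an illustrative reimplementation assuming a generic Penn-style tagset (VB* for verbs, TO for infinitival to, VBN for past participles), and it inherits exactly the limitation just described: a rigid four-slot pattern finds seems to have forgotten but misses considers the space race to have begun, and loosening it retrieves false positives that must be weeded out by hand.

```python
# Illustrative tag-sequence matcher: find "<verb> to have <past participle>"
# in a list of (word, tag) pairs. The tag names follow a generic Penn-style
# convention; a real study would use the corpus's own tagset.

tagged = [
    ("He", "PRP"), ("seems", "VBZ"), ("to", "TO"), ("have", "VB"),
    ("forgotten", "VBN"), ("the", "DT"), ("key", "NN"),
]

def find_infinitival_perfects(sentence):
    hits = []
    for i in range(len(sentence) - 3):
        (w1, t1), (w2, t2), (w3, t3), (w4, t4) = sentence[i:i + 4]
        if (t1.startswith("VB") and w2.lower() == "to"
                and w3.lower() == "have" and t4 == "VBN"):
            hits.append(" ".join([w1, w2, w3, w4]))
    return hits

print(find_infinitival_perfects(tagged))  # ['seems to have forgotten']
```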
.. Corpus annotation and parsing Parsing is a considerably more complex problem than tagging. Briscoe () estimated that automatic parsing algorithms correctly parse English sentences approximately per cent of the time. Performance has improved since then, however. Chen and Manning () cite a per cent success rate for certain types of parse decisions for the Stanford Parser. Word class tagging algorithms achieve a per cent error rate for English, which we might accept, but even a per cent parsing error rate is not tolerable for linguistic purposes. The errors are likely to include interesting problems, like the dimonotransitive (see section ..), and to vary depending on scheme complexity and variation of input sentences. Even with improved parser performance, we should expect to perform considerable linguistic review and hand-annotation. Since their goal is to obtain a better parsing algorithm, natural language processing (NLP) researchers treat their algorithm and database as primary. The corpus is simply test data. Improvement occurs by modifying the algorithm and re-testing it against the corpus. Correcting the corpus seems irrelevant to improving algorithms. When developing a treebank corpus, the cost of hand-correcting parsing is substantial. However, the exercise has benefits. We have already seen how completing the annotation of a corpus can validate and extend a framework. ‘Corpus parsing’ is not merely an exercise in applying a theoretical description to language data, but a cyclic exercise causing the framework to be validated and revised (Leech and Garside ; Wallis and Nelson ). The parsing-and-correction process obtains a corrected treebank, where each tree has been comprehensively reviewed by multiple linguists.11 These trees contain a set of ‘situated parsing decisions’, i.e. decisions about how best to analyse this sentence in this context. Errors will exist, but they are less likely to be systematic than errors obtained by deterministic parsing algorithms. This is an important criterion when it comes to statistical analysis (see section .): we want to know about the grammar of sentences, not merely the performance of the parser. Once we have decided to parse a corpus, we next must decide on the annotation scheme and how it should be applied to sentences. • Which framework should we choose? Whereas there is a certain consensus in word class definitions (see Chapter ), linguists have adopted many different frameworks to describe grammatical relationships. • Can we avoid methodological ‘over-commitment’, i.e. that a corpus parsed with Framework X compromises research in Framework Y? The answer must involve abstraction, mapping terms in the researcher’s Framework Y to terms in Framework X. We discuss a solution in section .. below. • What should be done with missing elements, such as ellipted subjects, verbless clauses, and incomplete sentences?
11 This exercise requires supervisory ‘knowledge management’ protocols (Wallis and Nelson ).
The approach taken to the last question depends in part on whether we consider annotation from a ‘top-down’ or ‘bottom-up’ perspective. If we believe that a particular theoretical internal language is primary, and missing elements are considered to be performance errors, we might insert ‘null elements’ (Marcus et al. : ) or ‘hallucinated’ entities ‘recovered from the context’. In the spirit of being true to the data (bottom-up), many corpus linguists have tended not to introduce implied content. Superfluous elements such as self-corrected words and ‘slips of the tongue’, common in conversation, are often marked out by annotation (cf. struck-out ‘which’ in Figure .). Not all ellipsis involves single words. Natural language, especially conversational language, includes grammatically incomplete utterances as well as complete ones (see Chapter ). Consider this dialogue sequence from ICE-GB.
B: Didn’t there used to be deer in Richmond Park?
A: There still are. [SA- #, ]
A’s ‘clause fragment’ appears perfectly intelligible to B (and readers). The conversation continues without recapitulation or repair. If we wished to insist on identifying the ‘missing’ elements of the structure, A’s fragment would need further annotation, e.g., by linking to the relevant material in the previous sentence (deer in Richmond Park), or inserting some.12
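To illustrate the kind of automatic analysis that then has to be reviewed by hand, the following sketch runs a freely available statistical parser over the exchange above. spaCy and its en_core_web_sm model are assumptions of this example (they are not tools used for ICE-GB), and whatever dependency analysis they assign to the elliptical There still are. is exactly the sort of situated parsing decision a corpus linguist would want to inspect and possibly correct.

```python
# Sketch: automatic parsing output that a corpus linguist would then review
# by hand. Uses the open-source spaCy library (an assumption of this example,
# not a tool discussed in the chapter); requires the small English model:
#   pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

for utterance in ["Didn't there used to be deer in Richmond Park?",
                  "There still are."]:
    doc = nlp(utterance)
    print(utterance)
    for token in doc:
        # word, fine-grained tag, dependency relation, and the head it attaches to
        print(f"  {token.text:<10} {token.tag_:<5} {token.dep_:<10} -> {token.head.text}")
```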
.. Grammatical exploration with ICECUP Tools for working with parsed corpora are relatively rare (see Wallis for a discussion). The International Corpus of English Corpus Utility Program (ICECUP, Nelson et al. ) is a ‘corpus workbench’ (a suite of closely connected tools on a single software platform) designed for parsed corpora. It is not the only search tool (notable search tools currently include CorpusSearch (Randall –) and TigerSearch (König et al. )), but it is arguably the most integrated platform of its type. Central to ICECUP is a query representation called a Fuzzy Tree Fragment (FTF): a diagrammatic representation of a grammatical query. It consists of information about grammatical nodes and/or words, and their possible structural relationships in the tree. A researcher constructs an FTF like the one in Figure ., and ICECUP uses it to find multiple matching examples in the corpus. Results may be concordanced (Figure .), so that each example structure is highlighted and aligned visually. If the FTF matches more than once in the same sentence, each matching case is highlighted on a separate line. The benefits of a visual representation of a ‘tree fragment’ (rather than, say, a logical expression) are threefold. First, the FTF appears as a coherent, intuitive whole. Second,
12 This issue illuminates a subtle methodological distinction between speaker parsing and hearer parsing. Conventionally, corpus researchers parse the structure as a model of the speaker’s construction process, rather than how the hearer might have reconstructed it.
. A Fuzzy Tree Fragment for an adjective phrase (AJP) containing a general adjective head (AJHD, ADJ(ge)) with at least one premodifier (AJPR) and one postmodifier (AJPO). Black lines and arrows mean that elements are immediately connected to each other; white lines mean that they are eventually connected. (For reasons of space, trees are typically drawn left-to-right. See also Figure ..)
. A simple grammatical concordance: highlighted constructions in ICE-GB, such as quite happy to meet you, match the FTF in Figure .
the tree metaphor helps us identify how it matches part of a particular tree and sentence, helping researchers refine their queries (see Figure .). Third, users can build an FTF from a tree. They select nodes they want the FTF to match, and make some simple choices in a ‘Wizard’ tool to form the FTF.
. Examining any line in the concordance displays the sentence and phrase structure tree, showing how the FTF matches a tree in the corpus. In this case the adjective phrase premodifier is realized by the adverb phrase quite, the head by the adjective happy, and the postmodifier by the subordinate clause to meet you13
This combination—a visual representation that maps from query to tree, a user interface that makes search results easy to browse, plus algorithms working in the background—produces an exploratory research platform that allows researchers to explore the rich parse annotation in the corpus and carry out experiments. What about the risk of ‘over-commitment’ noted earlier? Are ICE-GB users committed to the Quirk et al. () grammar? Must they learn every last detail? ICECUP addresses this problem by an exploration cycle (Nelson et al. : ). Users often begin with a text search or by browsing part of the corpus; find a construction of the type they are interested in; and then use the tree to create an initial FTF. The linguist does not have to ‘agree’ with the grammatical framework used: the parse analysis is simply ‘a handle on the data’ (Wallis ). Strictly, this exploration cycle combines the bottom pair of cycles in Figure . (abstraction and annotation). In practice, exploration often consists of the abstraction cycle, i.e. annotation is taken as given. Occasionally, new annotation may be required. Thus Aarts et al. () ultimately subdivided cases of modal verbs will and shall into semantic subcategories Epistemic and Root. We must be able to add distinctions if required.
13 Gloss: PU = parse unit, CL = clause, DISMK = discourse marker, REACT = reaction signal, SU = subject, NP = noun phrase, NPHD = NP head, PRON = pronoun, VB = verbal, VP = verb phrase, MVB = main verb, V = verb, OD = direct object, CS = subject complement, AJP = adjective phrase, AJPR = adjective phrase premodifier, AVP = adverb phrase, AVHD = AVP head, ADV = adverb, AJHD = AJP head, ADJ = adjective, AJPO = AJP postmodifier, TO = to-particle, PRTCL = particle. Features are not shown for reasons of space.
.. Bottom-up generalization algorithms Many linguists, particularly computational linguists, have used computation to automatically identify common (high-frequency) and ‘statistically interesting’ (closely associated) patterns in annotated text. These inductive algorithms abstract patterns in data to the point where we might consider whether they represent a new concept we might have missed. Here we briefly discuss some automatic methods relevant to grammarians. Lexicons. Compiling a lexicon from a corpus requires a computer algorithm to create an index for every word, every combination of word and word class tag, then combinations with morphological stems, stress patterns, etc. This process generates a vast amount of frequency information. Due to the computational effort involved, indexing is performed first and the results are explored with an interface. For example, ICECUP’s lexicon (Nelson et al. : ) makes use of indexes compiled when the corpus was created. The interface lets a researcher limit the lexicon to nouns, words matching a wild card, etc. The interface selects and aggregates terms where necessary. The same approach can be extended to grammatical nodes (ICECUP has a ‘grammaticon’ tool). Collocations. The idea of a collocation (see also section ..) is almost as old as corpus linguistics itself, and simply means ‘a sequence of words or terms which tend to co-occur more frequently than would be expected by chance’ (interaction evidence). Popular tools include AntConc (Anthony ) and WordSmith (Scott ). Collocations have obvious value to language learners or researchers in semantics, and the list of examples obtained from a corpus is much broader than published idiom lists. Using business or legal English corpora, collocation can generate domain-specific lists.14 Pairs of words may collocate for different reasons. They may represent a specific concept whose meaning is idiomatically given by the entire string (e.g. small businessman). Collocation did not originally depend on grammatical distinctions: rather, the method might reveal grammatical patterning in the text (corpus-driven linguists would say they ‘emerge’ from the corpus). Typical two-word collocations adhere to the pattern ‘adjective + noun’ (e.g., high street, primary school), or ‘verb + particle’ (e.g., found out, i.e. ‘phrasal verbs’), etc. This type of algorithm may be directed by a search word and a ‘slot’ to be filled (a gap before or after the search term): for example, to search for collocations of the form ‘X + school’, or ‘found + X’. A variation of this approach, sometimes termed colligation, is a collocation restricted by word class tag, such as ‘adjective + school’ or ‘found + particle’, and operates on a tagged corpus. N-grams. Stubbs and Barth () proposed a kind of cluster analysis to detect high-frequency sequences of words, termed n-grams.
14 Various statistics have been proposed for estimating the degree of interaction between words. These include mutual information, various statistical measures (z, t, χ2 and log-likelihood), and probability difference (see Gries for a review).
A variation of this approach, phrase frames, permits some of the word ‘slots’ to be filled by a particular word. The Phrases in English website (http://phrasesinenglish.org) allows researchers to retrieve n-grams from the British National Corpus (BNC). The four top trigrams found are:
I don’t
one of the
the end of
part of the
These are frequent, but not very interesting!15 Stubbs and Barth comment that these n-grams are not (grammatical) linguistic units, but may ‘provide evidence which helps the analyst to identify linguistic units’. In other words, results need human interpretation. They may be useful triggers for grammatical insight, but they must be related to a theory. Collostructions. A promising approach that attempts to introduce grammatical concepts into an abstraction process is collostructional analysis (Stefanowitsch and Gries ). Collocation does not exploit grammatical information except (in the case of colligation) at a word class level. Without structural constraints the method can generate ‘false positives’ that skew results. In multi-word sequences there is a greater chance of ‘false negatives’, i.e. cases not found due to the presence of an intermediate word or phrase in sentences. Collostructions attempt to avoid this. They extend the idea of collocations to a range of constructions,16 such as [N waiting to happen], identifying ‘result’ arguments of cause, and so on. In a corpus annotated for transitivity (usually, a parsed corpus), it is possible to configure a collostruction search which ranks verbs by the extent to which they would be most typically found in the ditransitive construction. Like collocation, collostruction uses ‘attraction’ measures (interaction evidence) rather than simple frequency. The authors’ algorithm therefore rates potential slotfilling lexemes (e.g. N = accident, disaster, earthquake, etc. for [N waiting to happen]) by the probability that they appeared in the specified slot by chance, estimated with Fisher’s exact test.17 A tool that brings many of these inductive algorithms together is SketchEngine (Kilgariff et al. ). SketchEngine is the prince of bottom-up exploratory tools18 or the ultimate dictionary creator, depending on your perspective. This tool brings many
15 This algorithm exploits frequency evidence (the string is frequent) rather than interaction evidence (the string seems to co-occur more than would be expected by chance).
16 This sense of ‘construction’ spans grammar and lexicon, i.e. it is not limited to purely grammatical constructions but may include multi-word patterns. The algorithm tests for statistical association. The linguist has to decide whether this is due to shared meaning.
17 Fisher’s test is like a more accurate χ2 test (Wallis ). It can compare p(x) × p(y) and p(x ∧ y) for significant difference. Ranking by error level tends to bias results towards high-frequency phenomena, which is not always desirable.
18 The term ‘sketch’ is a deliberate reference to the approximate results these algorithms obtain.
exploratory algorithms together in a single platform, focused around the idea of a ‘word sketch’. Word sketches are a ‘one-page summary of a word’s grammatical and collocational behaviour’. The core algorithm can be thought of as a super-lexicon entry generator, exploiting automatic lemmatization, tagging, and parsing algorithms. Since no human intervention is employed, misanalysis occurs, hence the method is exploratory. A verb sketch might include the most frequently co-occurring subjects and objects, co-ordinated terms, phrasal prepositions, etc. Kilgariff et al. () use the example of the verb lemma catch: in an illustrative corpus, the object identified as the most strongly associated collocate is glimpse; the most frequent, eye.19 Of necessity this brief selection cannot do justice to the full range of computational abstraction methods. All these methods obtain results derived from interaction and frequency evidence, and these results must be interpreted theoretically.
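The contrast between frequency evidence and interaction evidence can be made concrete in a few lines of code. The sketch below ranks adjacent word pairs in a toy corpus both by raw frequency and by pointwise mutual information, one of the association measures mentioned in the footnote above; the three sentences, and the choice of PMI rather than a more robust statistic, are illustrative assumptions only. In a corpus of any size, very frequent pairs such as of the typically attract low association scores, which is precisely why collocation and collostruction methods rank candidates by ‘attraction’ rather than by frequency alone.

```python
# Toy collocation scorer: ranks adjacent word pairs (bigrams) by pointwise
# mutual information (PMI), an 'interaction' measure, alongside raw frequency.
# The three-sentence corpus is purely illustrative.
import math
from collections import Counter

sentences = [
    "she caught a brief glimpse of the sea",
    "we caught the last train to the coast",
    "he caught a glimpse of her face",
]

tokens = [w for s in sentences for w in s.split()]
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))  # crude: ignores sentence breaks
n = len(tokens)

def pmi(pair):
    """log2( p(x,y) / (p(x) * p(y)) ) for an adjacent word pair."""
    w1, w2 = pair
    p_joint = bigrams[pair] / (n - 1)
    p_indep = (unigrams[w1] / n) * (unigrams[w2] / n)
    return math.log2(p_joint / p_indep)

for pair, freq in bigrams.most_common(10):
    print(f"{' '.join(pair):<20} freq={freq}  PMI={pmi(pair):.2f}")
```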
. Experiments
.................................................................................................................................. Experimental corpus linguistics travels in the opposite direction from the inductive methods of the previous section: testing an explicit hypothesis framed by a theory against corpus data. The process includes the formal ‘analysis’ process (primarily working top-down, but occasionally bottom-up) in Figure .. Unlike data from a lab experiment, a corpus is not constructed to specifically test particular hypotheses. We cannot manipulate experimental conditions to collect new data, but must work with existing data. This is sometimes termed a ‘natural experiment’, or a post hoc analysis.20 In the following sections we will consider briefly the methodological issues raised by two types of corpus experiment. • A frequency experiment. The first type is common in corpus linguistics. It investigates whether the frequency distribution of a lexical or grammatical variable changes over a sociolinguistic contrast. For illustration, we will borrow from Aarts et al. (), where the sociolinguistic variable is the binary choice: modal shall versus will in first person declarative contexts, as in I will/shall go to the park. In this diachronic study, the sociolinguistic contrast for comparing frequency distributions is time.21
19 It is hard to think of a more poetic illustration of the difference between interaction and frequency evidence.
20 Other sciences, such as astrophysics or evolutionary biology, work almost exclusively with observational data.
21 Diachronic studies offer the potential for exploring evolutionary change of grammar including grammaticalization (Traugott and Heine ).
• An interaction experiment. The second type concerns the interaction between two lexical or grammatical variables. Some examples are given in Nelson et al. (: ff), and we provide a walk-through below. Some issues are common to both types of experiment. In this section, .. deals with sampling and .. the selection of a meaningful baseline. Section .. considers interaction experiments.
.. Sampling Corpora are sampled as whole or part-texts, according to a sampling frame that specifies numbers of words per text, and numbers of texts in each category. Most corpora are collected according to the idea of a balanced sample, i.e. the distribution of material has similar proportions in each category, or is in rough proportion with the perceived availability of material. A true random sample, on the other hand, would populate the corpus from potential sources purely at random. The advantage of the balanced approach is that the corpus collector can aim for sufficient texts in any sub-sub-category of the corpus to permit research in this subdomain. However, it is difficult to control this sampling process for other variables. Sociolinguists have tended to follow Labov () in recommending stratified samples (sometimes called quota samples). The aim is to address two issues: including sufficient numbers of participants falling into specific sub-categories, and avoiding important variables being entangled (e.g., so that male participants do not tend to be older than female ones). A stratified sample allows us to require that we have examples of, say, women’s scientific writing in the s in our corpus. Maintaining strict independent partitions also can help differentiate variables in analysis. However, the more variables we include, the greater the number of combinations to be found and sampled, so ‘full’ stratification is rarely attainable. This raises an obvious question, namely: what if there is no data in a given intersection? Or what if it is extremely rare, and to include it would make the corpus unrepresentative? Principles of ‘representativeness’ and ‘inclusion’ pull in opposite directions (Wattam ). All corpora embody a compromise, because different research questions impose different requirements on data collection. So a corpus neatly stratified into sub-corpora of equal numbers of words may still generate a skewed sample, e.g. for expressions of obligation. We must accept that sampling is likely to be uneven, and try to address this in our analysis methods (Gries ). Finally, unless we intend to investigate phenomena over the length of a text, a corpus containing many short texts is preferable to one with a few long texts. The reason is due to a problem known as case interaction (Wallis )—the fact that datasets drawn from texts by queries are not actually a true random sample (the sample is not drawn
OUP CORRECTED PROOF – FINAL, 22/10/2019, SPi
such that each instance is from an independent random text). Multiple cases of a particular phenomenon may be found in the same text—even the same sentence (Nelson et al. : ). The more independent texts and participants, therefore, the better.22
.. Baselines and alternation A common methodological mistake in corpus linguistics arises when baselines are not considered (Wallis forthcoming). Indeed, researchers have traditionally ‘normalized’ frequencies by quoting rates per word (or per thousand or million words). There are circumstances when this may be reasonable, but it is rarely optimal. Leech () used the Brown family of corpora to investigate whether shall or will changed in frequency over time. In both US and British English, shall accounts for a smaller proportion of the number of words in the s than in the s data. In British English, modal will is almost constant. This method does not seem to support a claim that shall is being replaced by will, although intuitively this is what we would wish to determine. We must distinguish two levels of variation—variation in opportunity to use a modal auxiliary verb, and variation in the choice of modal when that opportunity arises. The optimum baseline identifies these opportunities.23 Different baselines permit different research questions: • If we are interested in the potential to employ a modal verb, we might choose tensed verb phrases. The research question becomes ‘given a tensed VP, when is a modal employed?’ • If we are concerned with the variation of one modal verb amongst others, we might choose the set of all modals: ‘given that we use a modal, which do we choose and when?’ • In the case of shall versus will, the optimum baseline is simply the set of two modals {shall, will}, optionally extended to include ’ll (= will) and semi-modal going to. Defining baselines is not simply about identifying words. Ideally, we should try to limit data to where mutual replacement is possible. In the case of shall versus will, this meant restricting data to positive, declarative first-person contexts (Aarts et al. ). To obtain this data required the use of a parsed corpus and FTFs (section ..).
22 Case interaction affects statistical methods, but it also has implications for quasi-statistical algorithms, such as collocation tools. Several researchers have commented on this problem and proposed solutions (Nelson et al. , Brezina and Meyerhoff , Gries , Wallis ). 23 See Wallis (forthcoming) for a detailed discussion of how to select baselines. See also https://corplingstats.wordpress.com////that-vexed-problem.
. Comparing frequency distributions over different time periods. Declining proportion, p, of shall out of the set {shall, will} with first person positive declarative subjects, half-decade data (‘’ = – inclusive, etc.) from the Diachronic Corpus of Present-day Spoken English (after Aarts et al. ). The crosses represent mid-points of two subcorpora. [Line chart: p (0 to 1) on the vertical axis against years 1955–1995 on the horizontal axis.]
In these contexts it was possible to argue that instances of shall were able to alternate (swap) with will (including ’ll), without changing the surface meaning of the clause.24 Figure . illustrates recent change whereby shall became less frequent than explicit will in first person declarative contexts in spoken British English at some point in the late s. Posing research questions in terms of the choices available to speakers is also a fruitful way of exploring how structural constraints in grammar interact.
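The quantity plotted in the figure is simply the proportion p = f(shall) / (f(shall) + f(will)) within each time period. The sketch below computes such proportions together with Wilson score intervals, a common choice of confidence interval for binomial proportions that is assumed here for illustration; the per-period counts are invented placeholders, not the DCPSE figures.

```python
# Proportion of 'shall' out of the alternation set {shall, will} per period,
# with Wilson score intervals. The counts below are invented placeholders.
import math

def wilson(p, n, z=1.96):
    """Wilson score interval for an observed proportion p on n cases."""
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

counts = {          # period: (shall, will) -- hypothetical values
    "1960s": (60, 90),
    "1970s": (45, 105),
    "1980s": (30, 120),
    "1990s": (15, 135),
}

for period, (shall, will) in counts.items():
    n = shall + will
    p = shall / n
    lo, hi = wilson(p, n)
    print(f"{period}: p(shall) = {p:.2f}  [{lo:.2f}, {hi:.2f}]  (n={n})")
```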
.. Interaction experiments Interaction evidence is straightforward to explore within an experimental paradigm. A simple χ2 test can be used to explore interaction evidence in a statistically sound way (Wallis ). In an interaction experiment, both variables refer to different aspects of the same set of linguistic events (or aspects of two related events); see Nelson et al. (: ff). Consider how we might evaluate the interaction of the polarity of a question tag (‘TAGQ’) with the polarity of the preceding verb phrase within a host clause using ICE-GB. Compare:
David turned up did he?
That’s enough isn’t it?
24 Interrogative modals were excluded because they may have a different dominant pragmatic reading (cf. Will we go? = prediction; Shall we go? = suggestion). A similar issue applies to negation.
. FTF for retrieving a positive (‘¬neg’ = not negative) auxiliary verb, verb or VP (‘∨’ = ‘or’), followed by a tag question with a negative auxiliary or verb. This FTF may be permuted by changing both ‘neg’ features (circled) to obtain all four patterns
In the corpus, negative tag questions are identified by the presence of the feature ‘neg’ on the auxiliary verb or main verb. Negative verb phrases inherit the feature from the auxiliary or main verb. We create FTFs using the pattern in Figure ., allowing the upper node to match auxiliary or main verbs for robustness. We then manually review the extracted cases, as the ‘neg’ feature is not always accurately recorded. Frequency data from ICE-GB is in Table .. We can test the interaction between the two locations by applying a χ2 test for independence to the table. The result, as one might predict from the table, is statistically significant. Positive declarative clauses with negative question tags are the most frequent pattern, and there is a large net negative interaction (i.e. polarity consistency is rarer than inconsistency). Interaction experiments can be extended in a number of interesting ways. Wallis () summarizes an experimental design for investigating the chances of speakers/writers making serial decisions of the same type. For example, it turns out that adding an adjective in attributive position before a noun head in a noun phrase becomes progressively less likely (‘more difficult’) the more adjectives we add. But this observation is not true for all rules. Serially adding post-modifying clauses or adding conjoined clauses after the noun phrase head first declines and then increases in probability (‘becomes easier’) with each successive addition.
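Once the contingency table has been extracted and reviewed, the test itself is a few lines of arithmetic. The sketch below computes a Pearson χ2 statistic for independence on a 2×2 table of invented counts; they echo the pattern described above (no doubly negative cases, and positive VPs with negative tags as the largest cell), but they are not the ICE-GB frequencies.

```python
# 2x2 chi-square test of independence for tag-question polarity against the
# polarity of the host VP. The cell counts are invented for illustration;
# they are not the ICE-GB frequencies referred to in the table.

table = {            # rows: TAGQ polarity, columns: VP polarity
    ("tag_neg", "vp_neg"): 0,   ("tag_neg", "vp_pos"): 300,
    ("tag_pos", "vp_neg"): 80,  ("tag_pos", "vp_pos"): 120,
}

rows = ["tag_neg", "tag_pos"]
cols = ["vp_neg", "vp_pos"]
total = sum(table.values())

chi2 = 0.0
for r in rows:
    for c in cols:
        observed = table[(r, c)]
        row_sum = sum(table[(r, k)] for k in cols)
        col_sum = sum(table[(k, c)] for k in rows)
        expected = row_sum * col_sum / total   # expected count under independence
        chi2 += (observed - expected) ** 2 / expected

print(f"chi-square = {chi2:.2f}")              # df = 1 for a 2x2 table
print("significant at p < 0.05" if chi2 > 3.841 else "not significant")
```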
. Conclusions
.................................................................................................................................. With sufficiently flexible tools, a corpus (and a richly annotated parsed corpus in particular) has much to offer the linguist. Parsed corpora are simultaneously valuable and imperfect, so we must engage with evidence critically. Corpus linguistics is commonly associated with evidence of frequency. Few linguists would dispute that if you wish to know how common a construction might be, you need a corpus. Such distributions can be useful in directing pedagogical material
Table . Contingency table of frequencies exploring the interaction between the polarity of question tags and the polarity of preceding verb phrases, extracted with FTFs and then manually reviewed. The verb phrase and tag never both have a negative polarity in all cases.

ICE-GB                      VP negative    VP positive    Total
TAGQ    negative
        positive
Total
towards the constructions learners might be most exposed to. Or they can focus grammatical research by identifying the patterns that account for the lion’s share of the data. However, there is one type of evidence that corpora can provide that is particularly relevant to the study of the structure of linguistic utterances. This is interaction evidence: probabilistic evidence that two linguistic events tend to co-occur. Such evidence may be simply associative but it is often possible to show a greater size of effect in one direction than another. If we know two events tend to co-occur, we are entitled to enquire about underlying causes. These reasons may range from the trivial fact of a shared topic in e.g. a conversation through to deep cognitive processes constraining the choices that speakers and writers make. In this perspective, grammar rules specify the choices available, whereas personal preference, language context, semantic and logical reasoning, communicative strategy, and cognitive processing combine to influence the choices made. Linguistic phenomena co-occur for a variety of reasons, not all grammatical, and so this evidence must be carefully considered and alternative explanations explored. Nonetheless, this type of evidence seems to be the most viable option for empirically evaluating grammar as expressed by human beings.
.............................................................................................................
APPROACHES TO ENGLISH GRAMMAR .............................................................................................................
......................................................................................................................
......................................................................................................................
. Introduction
.................................................................................................................................. Cognitive Linguistics is the name of an approach to language study which has its origins in the s. Important landmarks were Langacker’s Foundations of Cognitive Grammar (), Lakoff and Johnson’s Metaphors We Live By (), and Lakoff’s Women, Fire, and Dangerous Things ()—though the outlines of the last named book had already appeared in Lakoff (). Other important figures include Talmy, for his work on semantic structure and its grammatical reflexes (his most important publications are collected in Talmy ), Fauconnier, for his work on mental spaces and, subsequently, conceptual blending (Fauconnier , Fauconnier and Turner ), Bybee, for her work on usage-based approaches to morphology and phonology (e.g., Bybee ), and Tomasello, for his studies on language acquisition (e.g., Tomasello ). Several textbook introductions (e.g., Taylor , Evans and Green ) and handbook surveys (e.g., Geeraerts and Cuyckens , Littlemore and Taylor , Dąbrowska and Divjak ) are available. An English grammar based on cognitive linguistic principles is Radden and Dirven (). Cognitive Linguistics arose largely as a reaction to what were perceived to be inadequacies and even outright errors of the prevailing generative and formalist approaches to language study. A number of themes have characterized the movement:
• Language knowledge does not constitute an encapsulated module of mind, but is rather grounded in general cognitive processes, such as attention, memory, perception, categorization, abstraction, and indeed conceptualization more generally.
• While language knowledge resides in the minds of individual speakers, it is nevertheless embedded in processes of socialization and acculturation, and reflects the dynamics of interpersonal communication.
• Language acquisition proceeds as a response to input data rather than by the setting of what are presumed to be innate parameters of a mental module. The emphasis thus falls on the role of general learning mechanisms in response to features of the input.
• An important feature of input is the frequency of occurrence of linguistic elements, at all levels of description, whether sounds, words, word combinations, or syntactic patterns. In any sufficiently large and representative corpus, these frequencies tend to be remarkably stable and need to be taken into account in descriptions of language structure and its mental representation.
• Truth-conditional approaches to semantics, whereby the meaning of an expression lies in correspondences with states of affairs in the world (or possible worlds), are rejected. Rather, meaning resides in a speaker’s conceptualizations and construals, and involves such aspects as perspective, imagery, figure-ground organization, specificity, and epistemic commitment.
• There has been a general reluctance to compartmentalize language description in terms of traditional layers of phonology, morphology, and syntax; the same goes for the distinctions between lexicon and syntax and between semantics and pragmatics. These are seen as continua, with no clean lines of demarcation.
The focus in this chapter is on Langacker’s Cognitive Grammar (though references to other trends will also be made), this being the most explicit and fully worked-out theory, especially for its account of many traditional issues in syntax. First, I outline the salient aspects of the theory (section .), then discuss its approach to some basic topics of syntax (section .). Section . addresses some selected topics illustrating the role of background cognition in several areas of English syntax.
. Cognitive Grammar
.................................................................................................................................. Cognitive Grammar (CG) is the name which Ronald Langacker has given to his theory of language. Langacker began working on the theory in the s, largely, as he tells us (: ), through profound dissatisfaction with prevailing generative theories. The first full statement of the theory is the Foundations of Cognitive Grammar, followed four years later by a second volume, sub-titled Descriptive Applications (Langacker ). An updated presentation is Langacker (a). Many of Langacker’s papers and journal articles have been assembled in collected volumes (Langacker , , ). CG is based on minimal assumptions about the object of study. Invoking Saussure’s conception of the linguistic sign, Langacker postulates that there are only three kinds of entities relevant to linguistic description: (a) phonological representations (understood very generally to refer to language in its perceptible form); (b) semantic representations (again understood very broadly, to incorporate not only referential aspects, but also speaker attitudes, contextual effects, and sociolinguistic parameters); and (c) symbolic
relations between (a) and (b). To the extent that these representations have been entrenched through previous usage, they are referred to as units. A language may then be characterized as a structured inventory of linguistic units—phonological, semantic, and symbolic (Langacker : ). A major task of the cognitive grammarian is to identify and to propose substantive characterizations of these units. Learning a language is seen as a life-long process consisting in expanding one’s repertoire of units and increasing one’s mastery of them. The input to learning are usage events, that is, linguistic utterances in all their context-specific detail. Linguistic units emerge on the basis of perceived commonalities over diverse usage events. It is taken as uncontested that no two speakers of the ‘same’ language will share exactly the same inventory of linguistic units, something which is rather obvious with regard to word knowledge (and even more obvious when it comes to matters of pronunciation) but which is often discounted in studies of syntactic competence. Of primary concern in the study of syntax are symbolic units. It should be emphasized that while symbolic units unite phonological and semantic representations, the approach by no means entails that all phonological units enter into symbolic relations. Vowels, consonants, permissible consonant clusters (to the extent that these have become entrenched through frequent usage), and syllable types constitute phonological units, but it is only by happenstance that a vowel unit such as [ɑː] counts (in some of its occurrences) as the phonological pole of a symbolic unit, i.e. the word are. Likewise, as we shall see in section ., elements of semantic structure are not necessarily symbolized by phonological material. Another misunderstanding needs to be addressed. It is sometimes (erroneously) asserted that CG proposes that all aspects of syntactic structure are reflexes of, and can therefore be reduced to, matters of semantics (see e.g., Jackendoff : ). It is certainly true that syntax is taken to be inherently meaningful (Langacker : ) and decades of research by practitioners of the theory have uncovered the semantic foundations of an increasing number of seemingly arbitrary syntactic phenomena.1 However, the CG claim is actually more modest, namely, that syntax can be fully described in terms of assemblies of symbolic units. At the same time, it is acknowledged that syntactic organization is subject to a great deal of language-specific conventionalization. Even though these conventions might be motivated by semantic and pragmatic factors, and by other structural elements in the language, in the last analysis the conventions simply have to be learnt. With respect to the study of words, the relevance of Saussure’s conception of the linguistic sign is evident. To know the word tree involves knowing (a) its pronunciation, (b) its meaning, and (c) the fact that the former symbolizes the latter. (Even so, the task of stating the conceptual content of the semantic unit, and even of the phonological unit, is far from trivial; think of all the ways in which the word can be pronounced, 1
For a good overview of CG achievements in grammatical description, especially with regard to the structure of the verbal group and the fronting of non-subject elements in English, see Langacker ().
by different speakers of different accents,2 and all the kinds of things that can possibly be called trees.) More controversial is the application of the approach to syntax. Many would argue that one aspect of knowing the word tree is knowing its lexical category, i.e. the fact that it is a noun. In most linguistic theories, syntax constitutes an autonomous level of structure—autonomous, in the sense that its constituents (such as noun, noun phrase) and relations between constituents (such as subject-of) are unique to this level and cannot be reduced to, or explained by, elements from other levels. CG rejects this approach, claiming that syntax can be fully and insightfully described solely in terms of the constructs made available by the theory. This approach does not, of course, entail that there is no such thing as syntactic organization, nor does it deny the important role of lexical categories. The claim, rather, is that syntax and its categories can be fully described in terms of the admissible constructs. For this ‘minimalist’ approach to language study to work, some additional apparatus is required. I consider first the relations between units (..), followed by an account of how some of the basic notions in syntax are handled (..). To round off the section, I look at the role of constructions (..) and the CG approach to lexical categories (..).
.. A structured inventory As noted, knowledge of a language consists in knowledge of a structured inventory of units. The structure resides in various kinds of relations that exist amongst the units. Three relations are paramount: (a) schema-instance. One unit can be schematic for (i.e. is more abstract than) another unit. The schema abstracts what is common to a range of instances; the instances inherit the content of the schema and elaborate it in contrasting ways. The semantic unit [TREE] is schematic for [FRUIT TREE], which in turn is schematic for [APPLE TREE], [PEAR TREE], etc.; the latter are instances of [FRUIT TREE], which is an instance of [TREE]. (b) whole-part. A unit may be analysed into constituent parts; conversely, a complex unit may be formed by assembling its parts. Within the symbolic unit singer we can identify the unit sing and the agentive suffix -er; within The farmer kills the duckling we can identify, inter alia, the farmer, kills, and the duckling. (c) similarity. One unit may be perceived as being similar (in some respects) to another unit. The above relations can be recursive. [A] may be schematic for [B], which in turn is schematic for [C]; [X] may be a part of [Y], which in turn is a part of [Z]. This property greatly expands the repertoire of units that make up the grammar of a language. 2
Phonological variation in the pronunciation of words is probably no less extensive than variation in their meaning, a fact often overlooked by semanticists. See Johnson ().
A second point to note is that the above relations are mutually dependent, indeed, mutually defining (Taylor b). It is the similarity between apple trees and pear trees that enables the schematic unit [FRUIT TREE] to emerge. The fact that we can analyse singer as sing+er rests on its similarity with countless other examples: walker, dancer, speaker, etc., this permitting the emergence of a schema for an agentive noun [VERB+er]. It is the schema which, in turn, sanctions the assembly of agentive nouns from their parts. The network structure of linguistic knowledge (and here there are affinities with the network model proposed in e.g. Hudson’s Word Grammar) has several important consequences. First, grammaticality (or acceptability: in CG there is no reason to distinguish the concepts) of an expression is a matter of its being sanctioned by existing schemas. Second, linguistic expressions exhibit a high degree of language-internal motivation, in the sense that every expression lies at the hub, as it were, of a large number of relations to other units in the language. An expression which failed to exhibit these relations would be perceived, quite simply, as not belonging to the language at all. Even ‘idiomatic’ expressions are supported by multiple intra-language relations. See, for example, the extended discussion of the expression Bang goes my weekend! in Taylor () and of the so-called WXDY construction (What are you doing, lying on the floor?: Kay and Fillmore ) in Taylor (: –). Internal motivation contributes significantly to the learnability of a language, and facilitates the retrieval of units in acts of language usage.
.. The mechanics of syntax In CG, syntax—the means whereby smaller symbolic units (words, morphemes, etc.) are combined into larger meaningful expressions—is accounted for in terms of the three kinds of units listed in section ... Once again, some additional constructs are needed. These are best introduced by way of some examples. Take the expression the book on the table. (I ignore for the time being the role of the determiners.) Let us consider, first of all, some features of the component words: book, on, and table. Book and table designate (or profile) three-dimensional things, with characteristic shape, size, appearance, material composition, function, history, etc., these being the domains against which the concepts are understood. (Since this chapter is not primarily concerned with semantic representations, we need not go into this matter further.) On is quite different. The word profiles a relation (here: a spatial relation) between a figure and a ground (see Talmy : Chapter ), or, in Langacker’s terminology, a trajector and a landmark, trajector being defined as the cognitively more prominent entity in a relation. While trajector and landmark both feature in the profile of the preposition, they are specified only schematically, the landmark being characterized merely as a possible supporting surface, while the trajector could be practically anything that requires support. Needless to say, on has semantic values other than the one sketched above, especially in non-spatial domains. Consider such random examples as on television, on Monday,
on holiday, on a diet. (For the polysemy of on, see, e.g., Evans .) The main point to emphasize here, however, is that on, in all of its uses, is conceptually dependent, in contrast to book and table, which are (comparatively speaking) conceptually autonomous: we cannot conceive of an on-relation without invoking, however vaguely, some notion of the entities involved in the relation, while the conceptualizations of book and table do not have this property. The syntagmatic combination of on, table, and book is possible because book and table are able to elaborate the schematic trajector and landmark in the semantic structure of the preposition. How is the combination to be analysed? Consider the sub-unit on the table. This, like the preposition, profiles a relation. On lends its profile to the expression; it is therefore the expression’s head (or, in CG terms, the profile determinant). The table functions as a complement, that is, it elaborates a schematic structure in the head. The book on the table, in contrast, does not profile a relation; it profiles the book, the prepositional phrase on the table serving to give additional information to the head; on the table functions as a modifier. The notions of profiling and elaboration, along with the notions of thing and relation, make possible a succinct definition of the traditional notions of head, complement, and modifier, the head being the item which lends its profile to the composite expression, a complement being an item which elaborates a schematic entity in the semantic structure of the head, while a modifier provides additional specification to the head. The relations of complement and modifier are not always mutually exclusive. In father of twins, the prepositional phrase of twins has features of both modifier and complement (Taylor : ). An additional syntagmatic relation may be mentioned, that of apposition. Apposition involves the combination of expressions with identical profiles, the profiled entity typically being characterized from different perspectives, or with differing degrees of precision: [my neighbour] [the butcher], [the fact] [that the earth is flat]. One value of the preposition of is to profile a relation of apposition: [the state] of [California], [the fact] of [the earth being flat]. Even [that idiot] of [a man] can be analysed in this way (Taylor : ). Consider now The book is on the table. This no longer profiles the book, but a situation, the profile determinant (head) being the verb. Be merely profiles the continuation through time of a situation, whose participants are elaborated by its complements. Note, in this connection, that on the CG approach a clausal subject is analysed as a complement of the verb, since it elaborates a schematic entity in the verb’s semantic structure; subject and direct object are distinguished by their status as trajector and landmark, respectively.
.. Constructions The account of the book on the table could be replicated on countless other examples. The similarity between these prompts the emergence of a schematic nominal construction
[NP PP], which in turn is able to sanction new instances. At the same time, some instantiations of the construction may well have the status, for some speakers, of entrenched units. CG accepts that the mental grammar may contain a good deal of redundancy, with schematic constructions and their (regular) instantiations both being stored in the mind. Indeed, it is only through acquaintance with a range of instances that the schema is able to emerge at all, and there is no reason to suppose that once the generalization has emerged the instances upon which it is based are then expunged from memory. It should also be noted that instances of a construction may sometimes acquire an idiomatic flavour. Consider examples like the man in the moon, food on the table, a place in the sun, reds under the bed. Although these expressions conform to the more abstract schema, they have special semantic nuances and conditions of use. The construction just discussed is one of countless constructions which make up the grammar of a language, an expression being judged grammatical to the extent that it is sanctioned by already entrenched units.3 Collectively, these perform the function of rules in many other theories of grammar. The constructions which make up the grammar of a language vary along such parameters as their internal complexity, their lexical fixedness, and their productivity. Some, such as the prepositional construction [P NP], or the transitive verb construction [NP V NP], are highly schematic, both in terms of their semantics and their phonology (the phonological pole simply specifying ‘some’ phonological material). Others have slots which can be filled only with a small number of items, and some slots may be lexically specified. For days on end instantiates a schematic construction [for X on end], where X represents a noun (typically plural) designating a period of time (Taylor a). The limiting case is fixed idioms, where no variation is possible. Examples include of course, by and large, and the introductory formula How do you do?, which cannot even be embedded in reported speech.4 Favouring the productivity of a construction is not so much the raw frequency of its occurrence, but the number of different types which instantiate it (Bybee ). The matter is familiar to morphologists. The high (token) frequency of did (as the past form of do) does not sanction a general process of verbs in [uː] forming their past tense in [d]. On the other hand, the cluster of nouns with ‘Greek’ plurals (thesis-theses, crisiscrises), none of which is particularly frequent, is nevertheless able to provide a model for innovative forms such as process-processes [siːz], bias-biases [siːz] (Taylor : ). In terms of its raw frequency, the way-construction (make one’s way to the exit, elbow one’s way through the crowd, spend one’s way out of recession: schematically: [V one’s way PP]) may not rank particularly highly. However, the fact that a large
3 The sanctioning units may not always pull in the same direction. See the discussion of That is fair to say in Taylor (: ).
4 This comment applies to present-day English. In Jane Austen’s Northanger Abbey (Chapter ), we find ‘He asked each of them how they did’. The use of How do you do? in Chapters and of the novel, however, suggests that at that time the formula had the function of a generalized greeting, rather like present-day English How are you?, and was not restricted to a first introduction.
number of different verbs and prepositions are attested in the construction favours its productive use with new verbs and new prepositional phrases (Goldberg ).
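The observation that constructions vary in their lexical fixedness can also be made concrete with a small sketch. What follows is an illustration only, not a CG representation: the [for X on end] construction is modelled as a string of fixed words with a single open slot, and the short list of admissible slot fillers is my own stand-in for the semantic condition described above (a noun, typically plural, designating a period of time).

```python
# Purely illustrative: the partially lexically specified construction
# [for X on end], modelled as a sequence of fixed words and one open slot.
# The set of admissible slot fillers below is a stand-in for the real
# condition (a noun, typically plural, designating a period of time).

FOR_X_ON_END = ("for", None, "on", "end")          # None marks the open slot
TIME_NOUNS = {"days", "weeks", "months", "years", "hours"}

def sanctioned(phrase):
    """Return True if the phrase instantiates the [for X on end] schema."""
    words = phrase.lower().split()
    if len(words) != len(FOR_X_ON_END):
        return False
    for word, fixed in zip(words, FOR_X_ON_END):
        if fixed is None:                           # the schematic slot
            if word not in TIME_NOUNS:
                return False
        elif word != fixed:                         # lexically fixed material
            return False
    return True

print(sanctioned("for days on end"))     # True
print(sanctioned("for weeks on end"))    # True
print(sanctioned("for cats on end"))     # False: slot condition not met
```

On this way of putting things, a fixed idiom such as by and large has no open slot at all, while a highly schematic construction such as [P NP] consists of little else.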
.. Lexical categories The above-discussed example is also instructive in that it gives an insight into the CG approach to parts of speech (or lexical categories). See also Hollmann (this volume). For many linguistic theories, parts of speech are both identified and defined by their distribution. Langacker (), however, takes the position that (at least) the major parts of speech have semantic content, this residing in the nature of the profile. Nouns profile things, defined as regions in a domain. While prototypical examples can be cited (people, animals, and smallish, three-dimensional time-stable objects that a person can handle), the noun category extends far beyond these to incorporate abstracts, collectives, and nominalizations of various kinds. The distinction between count and mass nouns is crucial to the grammar of the English noun phrase. As a matter of fact, practically any noun in English can be used as count or mass, though preferences and biases obviously exist. The distinction rests on whether the instantiated instances are bounded (count) or unbounded (mass), that is, whether the boundary of the instance is contained within the profile. ‘Lawn’ may be construed as a kind of covering (We have a lot of lawn to mow) or as an instance whose boundaries are in profile (We have a large lawn to mow). Verbs designate temporal relations, i.e. relations construed as obtaining over time. The temporal component is included in the profile and it is this aspect that mainly distinguishes verbs from prepositions and adverbs. The temporal boundedness of an instantiated instance gives rise to the distinction between states and processes, the former being unbounded, the latter bounded, a matter of importance in the use of simple or progressive aspect. Langacker is by no means the first to draw a parallel with the mass–count distinction in nouns. Again, however, it would be an error to classify verbs as inherently stative or non-stative; while biases exist, practically any verb can be used in either function. Prepositions and conjunctions designate atemporal relations; while the relations may well be instantiated in the temporal domain, time does not feature in the profile. Prepositions and (subordinating) conjunctions are distinguished by the nature of their complement, a noun in the case of prepositions (before lunch), a clause in the case of conjunctions (before we had lunch). Adjectives and perhaps even more so adverbs constitute rather heterogeneous classes, with fuzzy boundaries and few unifying or defining features. Prototypically, however, an adjective (such as tall in a tall man) profiles a relation between its trajector (in this case, man) and a region in vertical space in excess of some reference norm. Similarly, a prototypical adverb (fast, as in run fast) profiles a relation between the process designated by the verb and a region in velocity space, in excess of some norm. The accounts would obviously need to be modified for other kinds of adjectives and
adverbs, e.g., for exclusively attributive adjectives (such as mere, in a mere child, the very idea), or for so-called sentence adverbs (obviously, presumably, etc.). Cognitive linguists have long been interested in prepositions. Here, I would draw attention to a crucial distinction, having to do with the ‘plexity’ (Talmy ) of the trajector. One set of spatial prepositions—this includes in, on, at, above—construe the place of the trajector as simplex. Another set, including across, (a)round, through, along, up, require a trajector which occupies a multiplicity of places. This property gives rise to some interesting effects. Thus, with multiplex prepositions, the trajector may be an extended entity, which simultaneously occupies a series of places (in the appropriate configuration): a road along the river, a tunnel through the mountain, a path up the hill. Alternatively, the trajector may consist of a multiplicity of entities (typically designated by a plural noun), arranged in the appropriate configuration: trees (planted) along the river, footprints through the snow. Another possibility is that the trajector is a moving entity which traces a path of the appropriate shape (I walked round the lake). Yet another possibility is that the preposition is used to designate a (simplex) place; the place, however, is construed as the endpoint of a path (He lives over the hill, The pantry is through the kitchen). Ambiguities can arise; consider houses up the hill (houses located at a place ‘up the hill’, or houses in a rising configuration). Note that if the trajector is simplex—as in the house across the field, the church up the hill—the endpoint reading is the only possible one. The complexities of the preposition over—a particular favourite with cognitive linguists!—may be due in part to the fact that the preposition has the special property of being able to function as both simplex and multiplex (Taylor : –). Consider the ambiguity of We flew over the ocean (‘above’ versus ‘above and across’).
. B
.................................................................................................................................. A major issue in linguistic theory is the problem of compositionality. It goes without saying that component units make a contribution to the properties of the complex expressions of which they are a part. The question is, whether the properties of the parts fully determine the properties of the whole. The CG position is that they do not. This is evident even with respect to phonology. The [r] which features in the (British English) pronunciation of ma [r] and pa is not contributed by the properties of any of the constituent words; it is due, rather, to a phonological requirement that syllables have a consonantal onset (Taylor : ).5 It is therefore not too surprising that compositionality does not obtain with respect to the semantic value of an expression. In the first place, the semantic representations of component items (words, in the first 5 An alternative possibility is that the syllable onset requirement is satisfied by the insertion of a glottal stop: pa [ʔ] and ma. Note that CG does not have the option of proposing that there might be a ‘latent’, ‘underlying’ [r] in the phonological representation of ma.
instance) should not be thought of as constituting fixed packages of information. Rather, the items provide paths of access (Langacker a: ) to a potentially open-ended network of conceptual knowledge, only some facets of which are activated in a given usage event. To be sure, some paths may be very well trodden and liable to be invoked as defaults or in the absence of contrary indications, thus encouraging the (erroneous) impression that words do have fixed meanings (Taylor ). A related issue is that the interpretation of complex expressions very often makes reference to conceptual structures, relations, and processes which are not intrinsically associated with any of the component items, or even with the sanctioning schemas. Consider some possible readings of (): ()
Twenty years ago, the mayor of this city was a teenager.
Did the city have a teenager as its mayor, or is the present mayor in his/her thirties? The two readings are not due to the ambiguity of any of the component units, but rather result from alternative blendings of two temporally distinct situations, the latter interpretation emerging if the mayor of twenty years ago is taken to be the same person as the current one.6 Blending theory (Fauconnier and Turner ) addresses the role of just this kind of ‘background cognition’ (Coulson and Oakley : ), while Langacker (a: ) in this connection speaks of a largely implicit ‘conceptual substrate’, which underlies the understanding of all linguistic expressions. In fact, much of the ‘work’ in CG takes place at the semantic/conceptual level, and involves concepts which are not overtly symbolized in the language. Since there is no direct (linguistic, observational) evidence for these concepts, their discussion tends to be speculative. They can be justified by their explanatory power, and for the fact that they offer unified accounts of seemingly disparate phenomena. In the remaining part of this section, I select three topics which have been important in the development of CG: grounding, reference points, and subjectification.
.. Grounding The term ‘ground’ refers to the ‘current discourse space’ (Langacker a: ), relevant aspects being the circumstances of the speech event, primarily the speaker and hearer(s), but also the time and place, as well as preceding discourse. Grounding is the process whereby a linguistic expression is anchored to ground. The above example of The book is on the table exhibits two kinds of grounding. Firstly, the nouns are grounded by means of the definite determiner the, while the clause as a whole is grounded by the present tense of the verb. 6 There is another interpretation of (), according to which there used to be a regulation that anyone who was mayor had to be a teenager. Here, mayor designates a role, not an individual. See Taylor (: ) for discussion.
Definite determiners (primarily the, that, these, etc.) indicate that the speaker has singled out, or established mental contact with, a specific, identifiable exemplar, or exemplars, of the kind of thing designated by the noun, and presumes that the hearer can do so too. Indefinites (such as a(n), unstressed some, numerals, etc.) have two values. They may convey that the speaker has a specific exemplar (or exemplars) in mind; the speaker, however, presumes that the hearer is not able to identify it. Alternatively, on a non-specific reading, the speaker merely conjures up an arbitrary instance, or set of instances, whose identity is unknown or irrelevant; in the case of generic statements (of the kind A cat chases mice), the instance serves as representative of the class as a whole. As an ungrounded noun, coffee profiles a kind of substance; likewise strong coffee (modification of a noun, while narrowing down its range of potential referents, does not in itself constitute grounding). This coffee refers to an identifiable instantiation of the substance. Several interpretations are available. The expression could refer to a portion (‘this cup of coffee’) of the substance or to its instantiation in type space (‘this kind of coffee’). Used without a determiner, the word takes on properties of a generic: Coffee keeps me awake. Clausal grounding is achieved through tense and modality. In English, these two systems are largely in complementary distribution. Tense grounds a clause in present or past reality, whereas modals typically ground a clause in terms of its perceived possibility, likelihood, inevitability, and so on. Significantly, the modals do not bear the present tense -s inflection with third-person singular subjects. (I address below the question of whether modals inflect for past tense.) First, some remarks on the English tenses. Just as indefinite determiners can have two values (specific and non-specific), so too can present tense. It may ground the process in present reality (that is, the process is presented as obtaining at the moment of speaking) or it may invoke a ‘virtual’ reality, an aspect of ‘how the world works’. You press this button to start the machine refers to a structural property of the machine; the statement may be true even if the machine has never been operated. Similarly, This knife cuts well may be taken as true, even though the knife has never been used for cutting, its cutting ability being an aspect of its constitution. Past tense prototypically situates a process in a time prior to the moment of speaking. Past tense has other values, having to do, not with temporal, but with epistemic or pragmatic distance. If only I knew situates the predicate ‘I know’ in a counterfactual space, contrary to present reality; although morphologically past, the expression indicates a present state of not-knowing. I wanted to ask you something illustrates a pragmatic distancing of the speaker from the speech event; it is as if the speaker wishes to soften the impact of the request. Modals ground a situation in virtual reality—the world as it could be, might have been, may be, should be, will be, etc. Language is as much (perhaps even more) about imagined realities as about ‘real’ reality. These ‘realities’ are assessed from the perspective of the ground. Although forms such as could, might, should, and would may be considered to be morphologically past (at least, from a diachronic point of view), they
assess a situation from the perspective of the ground (the past morphology illustrating, perhaps, a kind of epistemic distancing or pragmatic weakening). You could have been killed constitutes a present assessment of a past eventuality; You shouldn’t have done that is a present-time evaluation of a past action. We noted above that the modals do not bear present-tense inflections; neither do they in general have ‘true’ past tenses,7 confirming again the complementary distribution in English of tense inflections and the modals.
.. Cognitive reference points This construct is familiar in everyday life. You state the location of one entity (the target) by reference to its proximity to, or accessibility with respect to, a more readily available entity, the reference point. The latter provides access to a range of entities in its proximity (or ‘dominion’). Natural reference points are the speaker and/or hearer, wholes with respect to their parts, prominent features of the natural environment, or entities that have been rendered prominent in previous discourse. There are many linguistic manifestations of the reference point phenomenon. One is the prenominal possessive construction. John’s car refers to a car, one which is identified from the perspective of a person identified as ‘John’, and, by default, the expression is taken as uniquely referring. Prenominal possessives generally have definite reference, the possessor phrase thus serving as a grounding element. Possessor entities are predominantly human or animate, and usually definite; typically, they are entities that have been introduced (and rendered topical) by previous discourse. Possessees, in contrast, are typically new to the discourse (Taylor ). Significantly, the reference point is stated first, word order iconically representing the path of access to the target. The nature of the relation is not specified, though possession (in the legalistic sense), kinship, and whole–part are especially common. Often, the semantics of the possessee noun strongly favours a particular interpretation. My aunt favours the kinship relation, my hand the part–whole relation, my thesis the creator–product relation (a thesis has to have been written by someone),8 and so on. The reference point phenomenon can also be observed in nominal compounds. Fingernail designates a nail conceptualized against the notion of finger; likewise toenail. Compare also car door and house door. Both designate a door (compounds are typically headed by their final element), but they characterize the door with respect to the previously mentioned entity. The examples underscore my earlier remark on the 7
7 There are exceptions to this generalization. In reported speech, the modality may be shifted to an earlier point in time (She said that she might come to dinner). Another exception is provided by could; in When I lived in France, I could speak French fluently (= I was able to speak French fluently) the modal refers to a past ability. See also Ziegeler, this volume.
8 To be sure, other interpretations of my thesis are possible; it could be, for example, the thesis that has been assigned to me for assessment. Still, the general point holds. A thesis, by its very nature, is a kind of thing that has to be subjected to assessment by somebody.
flexibility of word meanings. The semantic pole of door does not constitute a fixed package of information; the context provides access to a network of conceptual knowledge, activating those facets which are relevant. The affinity between possessives and compounds is shown by the fact that the distinction is sometimes unstable, especially where the ‘possessor’ is understood generically. There is some variation in the use of user manual and user’s manual. A further manifestation of the reference point phenomenon concerns topic constructions. A topic is an initial element which provides the framework for the interpretation of the following material (see Kaltenböck, this volume). Consider the following: ()
a. Your research proposal—you need to do a lot more work on it.
b. Your research proposal—a lot more work is needed.
The initial noun phrase provides the context for how a lot more work is to be understood. It is obviously not a question of more manual labour, but intellectual work, of the kind needed in the writing of a research proposal. The role of the topic is apparent, even if there is no syntactic relation between the topic and the following clause, as in (b). A second point to note is that in (a) the topic serves as the antecedent for the interpretation of the pronoun it. Normally, a pronoun follows its antecedent (as in this example). Sometimes, however, the interpretation is dependent on a following item, as in the following: ()
In his office, the vice-chancellor was reading the press reports.
His would normally be taken to be co-referential with the vice-chancellor, consistent with the prominence associated with the subject of a clause. However, if the preceding context were to attribute topical salience to another individual, one could imagine that his office could refer to the office of this other individual. A kind of topic that is deeply embedded in the grammar of English is the subject of a clause. Langacker () defines subject as the more prominent element in a temporal relation; it is typically a nominal which serves as the starting point for the interpretation of the clause. A remarkable feature of present-day English (in comparison to earlier stages of the language and to related Germanic languages) is its flexibility in the choice of subject. The major semantic roles of the subject (in a transitive clause) are to indicate the Initiator of a process (Bill kicked the ball) or the Experiencer of a Stimulus (I heard the music). Another possibility is for the subject to designate the ‘setting’ for an event or situation. In (), sentences (a) and (b) invoke a whole–part relation (analogous to that in possessives: the guitar’s string, the car’s tyre), (c) and (d) present a temporal setting, while in (e) the setting is a location. ()
a. My guitar broke a string.
b. My car burst a tyre.
c. The next day saw our departure.
d. The period had witnessed many important changes.
e. The tent sleeps six.
These usages are not very productive, being subject to many lexical restrictions (especially concerning the choice of verb);9 their ‘idiomatic’ nature becomes apparent once one attempts to translate them into other languages, even closely related European languages (Hawkins ). Halliday (: Chapter ) sees in sentences like these examples of ‘grammatical metaphor’, whereby a grammatical resource—here the transitive NP VP clause—is transferred from its basic function to encode a different set of relations amongst the participants. What motivates the new structure is the desire to get the reference point entity into initial position; there is an extra bonus if the initial position functions as the (inherently topical) subject of the clause. As a final example of the role of reference points, I would like briefly to mention the case of subjectless (and ungrounded) clauses. A mainstay of prescriptive advice on ‘good writing’ concerns the avoidance of dangling participles. Students are advised to avoid examples like the following: a. Soaring high above the fields, we could see the eagle clearly. b. Driven to drink by her problems, we see how Janet will come to a sticky end.10 c. Walking down the street, the street lamps came on.11
According to the prescriptive rule, the understood subject of the participles (note that both present and past participles are involved) has to be identifiable as the subject of the main clause. From a CG perspective the rule makes good sense; the main clause subject is a cognitively prominent entity which naturally serves as a reference point for the conceptual elaboration of the subjectless clause. Often, however, the prescriptive rule can be broken, with little or no interference with easy comprehension (nor resulting in unfortunate ambiguities). Even the most pedantic of prescriptivists would probably not want to outlaw examples like the following: ()
a. Considering her dislike of Martin, it was surprising that she invited him. (Cobuild)
b. the police must be many times more vulnerable, seeing that police investigations are far less structured and not at all open to outside scrutiny. (BNC)
Here, considering and seeing (that) might be analysed as kinds of subordinating conjunctions (or even, in the case of considering, as a kind of preposition), rather
9 There are also syntactic restrictions. For example, although the sentences have the structure of transitive clauses, they do not have passive counterparts: *A string was broken by my guitar.
10 http://www.bristol.ac.uk/arts/exercises/grammar/grammar_tutorial/page_.htm.
11 https://english.ucalgary.ca/grammar/course/sentence/_c.htm.
than as present participles, thus rendering the prescriptive rule otiose. Nevertheless, the expressions do retain a whiff of their participial origin. They prompt the question of who it is who does the considering and the seeing. The answer, of course, is the speaker/ writer, who at the same time also draws in other participants in the discourse situation, namely the hearer(s) and the intended reader(s). These ever-present features of the ground are available to serve as reference points for the interpretation of the elliptical structures. To take another proscribed example, this time involving a dangling subjectless infinitive:12 To wash the car, soap and water are needed. Who is to wash the car? Obviously, the addressee (who is presumably in need of the advice). To this extent, the proposed ‘correction’—To wash the car, you will need soap and water13—seems needlessly explicit.
.. Viewing arrangements: objectification and subjectification Langacker (e.g., a: ) refers to a default viewing arrangement, in which the ‘viewer’ (conceptualizer) is maximally distinct from the scene that is viewed (conceptualized). (Think of a theatre patron, viewing events on the stage.) The profiled ‘objective’ situation is ‘on-stage’, the conceptualizing subject (speaker/hearer) is ‘offstage’. While the observer is able to direct attention at different facets of the on-stage region, thus zooming in on particular features of it, there is no interaction between viewer and viewed. Of course, no utterance can be per cent objective; nominal and clausal grounding involve some implicit reference to the ground. Perhaps the closest we can get to a purely objective statement would be assertions of mathematical truths, of the kind 2 = . Objectification is the process whereby some aspect of the ground is put on-stage as it were, and spoken about ‘objectively’. Consider the following: () (Mother to child) a. Don’t talk to me like that! b. Don’t you talk to me like that! c. Don’t you talk to your mother like that! In (a), the speaker refers to herself using a self-referential personal pronoun. In (b), the addressee, implied in (a), is explicitly put on-stage, while in (c) the speaker refers to herself as a third-party might do. The examples illustrate the progressive objectification of features of the ground. More diverse in its effects is the process of subjectification, whereby some facet of an objectively construed situation comes to be incorporated into the conceptualization 12 Here in the text is another dangling infinitive! Who is supposed to ‘take’ another example? Well, you, the reader, of course, in partnership with me, the writer. 13 http://www.ucalgary.ca/uofc/eduweb/grammar/course/sentence/_c.htm.
process itself.14 Take, as an example, uses of the verb go. In its ‘basic’ sense, go profiles spatial motion, whereby a trajector entity, expressed by the verb’s subject, occupies a sequence of places distributed over time (as in She went to the store). Both change of place and progress through time are involved. Go, however, has many additional uses which do not involve motion. Some of these can be understood in terms of metaphor, whereby change of state is construed metaphorically as change of place, as in The milk went sour, Prices went up, The situation went from bad to worse.15 (Note that these examples still involve progress through time.) The temporal aspect is prominent in other uses of the verb, as in I went through the alphabet (i.e. recited it, from beginning to end), The tune goes like this (followed up by the speaker humming the tune). A further use is with respect to the prospective future: ()
a. We’re going to have to repaint the kitchen.
b. The kitchen is going to have to be repainted.
c. Be careful. You’re going to fall!
d. It’s going to rain.
e. There’s going to be trouble.
Could this use also be seen as an instance of conceptual metaphor, whereby a basically spatial term is used of the temporal domain, as suggested e.g. by Lakoff ()? A problem with this approach is the question of what, exactly, it is that ‘goes’ (metaphorically speaking). Note, in this connection, the loosening of restrictions on the properties of the subject nominal. Even in its metaphorical uses, the subject of go is the entity that changes state (The milk went sour) or that proceeds along a path (The tune goes like this). There can be no question, however, of ‘the kitchen’ in (b) changing state. Rather, it is the conceptualizer who mentally projects the evolution of the present situation into the immediate future (Langacker : ): given the present dilapidated state of the kitchen, the speaker projects a future situation in which it will be necessary to repaint it. The notion of motion, implied in objective uses of go, has here been subjectified, i.e. become a component of the conceptualization itself. Subjective motion—in Langacker’s sense—is also at issue in well-known examples like the following: ()
a. The road goes through the mountains.
b. The mountain range goes from Canada to Mexico.
c. There’s a scar going from his chin to his left ear.
14 ‘Subjectification’ is often used to refer simply to an increased involvement by the speaker, as, for example, when really is used not simply to refer to ‘reality’ (Unicorns really exist) but to emphasize the speaker’s commitment to a proposition (I really believe that unicorns exist). While not inconsistent with this usage, Langacker’s understanding of subjectification is more precise, in the way described in the text.
15 It is of interest to note that ‘change of state’ go generally involves change for the worse. A person can ‘go mad’, but not ‘go sane’. Things can ‘go wrong’, after which they may ‘come right’, but hardly ‘go right’.
These sentences describe the configuration of a static entity. It makes no sense to claim that the road, scar, etc. ‘goes’, whether fictively (Talmy ) or metaphorically, from one place to another. What ‘goes’ is the conceptualizer’s attention as s/he mentally traces the elongated object from one end to the other. Significantly, the phenomenon is usually attested with respect to elongated entities, which cannot be taken in at a glance, as it were, resulting in a conceptualization which is inherently dynamic. A similar kind of subjective motion is involved in examples like the following, which exhibit the end-point focus of multiplex prepositions: () a. My wife was sitting {opposite/across the table}. b. They live over the hill. c. The pantry is through the kitchen. As already noted (section ..), across the table, over the hill, and through the kitchen designate a place construed as the end-point of a path. Again, it is the conceptualizer who mentally traces the path. The source of the virtual path may be specified: She was sitting across the table from Harry. Even here, though, it is the conceptualizer who traces the path from Harry to the designated individual. Subjective motion is inherently directional. Compare the road into the forest and the road out of the forest. These expressions could refer to exactly the same road, but differ with regard to the direction of the conceptualizer’s attention. The same goes for the contrast between the gate into the garden and the gate out of the garden. The gate is not an elongated entity which prompts a dynamic conceptualization. Nevertheless, a gate, by its very nature, is likely to feature as a location on a (virtual) path. The two expressions exemplify alternative ways in which the path can be mentally scanned. Subjectification is ubiquitous in the grammar and has been a major source of semantic change and meaning extension. I briefly consider two examples, from the domains of modality and causation. In their so-called deontic senses, the modals refer to some force acting on the realization of a situation. Take the case of must. This modal denotes the presence of a force, usually of a social, legal, moral, or bureaucratic nature, which requires people to act in a certain way: You must apply within six months. Sometimes, it is the speaker who attempts to influence another’s behaviour: You really must read this book; it’s fascinating. The speaker may even impose the obligation on herself: I really must read this book. Even in these cases, the force is construed objectively: it is a component of the situation being described. Alongside their deontic senses, modals can also have an epistemic sense, where the ‘force’ imposes itself on the speaker’s conceptualization rather than on her or another person’s actions. In You must be mistaken the facts of the matter (as I perceive them) compel me to believe that you are mistaken. The force dynamics inherent to must are incorporated into the very process of conceptualization itself. Comparable accounts are available for other models. Should implies a somewhat weaker force than must: You should do as I say. But there are also epistemic uses: They should be home by now (I conjecture, given the evidence available, that they are now
home). Similarly with can’t: You can’t smoke here (prohibition) versus You can’t be right (the facts as I perceive them prevent me from believing that you are right). Interestingly, some periphrastic modals, such as have (got) to (= must) exhibit the same development: You have (got) to leave (deontic) versus You have (got) to be joking (epistemic). On the other hand, not be able to (= can’t) lacks the epistemic sense: *You are not able to be serious (compare: You can’t be serious). Finally, we may briefly cite the example of causal relations. Cause is essentially a matter of construal; one observes that event A (the temperature falls below zero) is followed by event B (the rain turns to snow), and surmises that A is the cause of B (or that B is the result of A). The causal relation—for all its problematic aspects—may still be seen as a facet of the objective world and how it works. However, alongside (a), we can also have (b), where the causal relation appears to be reversed. () a. The rain is turning to snow, because the temperature is falling below zero. b. The temperature is falling below zero, because the rain is turning to snow.16 How is this possible? The answer, of course, is that in (b) it is the conceptualizer who invokes the causal relation, deducing that the temperature must be falling (epistemic use of ‘must’!), in view of the rain turning to snow. The causal relation no longer inheres in the external world, but is a facet of the speaker’s interpretation of the world. Even a seemingly innocuous sentence like the following is open to a number of interpretations: () She left the party early, because she was feeling ill. At issue is the locus of the causal relation. Is it the woman herself, who decided to leave for the stated reason? If this is the case, what is the speaker’s authority for reporting the causal relation? Did the woman confide in the speaker, stating her reason for leaving? Or is it the speaker who, on observing the course of events, offers an interpretation of the woman’s behaviour? The complexities arising from the inherently subjective nature of causation is no doubt one reason for the proliferation of linguistic resources that are available for its expression. Some aspects of these are discussed in Taylor and Pang ().
. C
.................................................................................................................................. Cognitive Grammar offers a distinctive perspective on many traditional topics in the description of English, at the same time opening up new avenues of research. In this 16
Note that in (b), the comma (indicating an intonation break) is essential. Such is not the case with (a).
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
chapter I have briefly looked at the CG account of word classes and syntactic relations, and pointed to the crucial role of constructions as an alternative to rule-oriented accounts of linguistic knowledge. A major contribution of the approach has been to elucidate aspects of ‘background cognition’ which not only inform the interpretation of linguistic expressions, but also have far-reaching effects on the grammar. The matter was illustrated using the examples of grounding, reference points, and processes of subjectification.
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
......................................................................................................................
......................................................................................................................
. I
.................................................................................................................................. T chapter discusses what Construction Grammar can offer to a readership with a primary interest in English grammar, including not only linguists of different theoretical persuasions, but also language teachers and L learners of English. I will argue that Construction Grammar presents a useful perspective on English grammar that differs from most other theoretical frameworks and that sheds light on phenomena that are relevant beyond the confines of a single linguistic theory. The chapter has two principal aims. First, it will discuss how the grammar of English can be approached from the theoretical perspective of Construction Grammar. In order to prepare the ground for that, section . offers an introduction to the framework and its central assumptions and section . defines the term ‘construction’. Section . then applies these ideas in a survey of different construction types from the grammar of English. The second aim of this chapter is to situate Construction Grammar within a wider research context. Section . relates Construction Grammar to other approaches to grammar, and section . outlines some of its implications for language learning and teaching. Section . offers a few concluding remarks.
. C G:
.................................................................................................................................. As is the case for other theoretical frameworks, the term ‘Construction Grammar’ is used for a range of similar approaches. Several of these are presented in detail in Hoffmann and Trousdale (). General introductions to the framework can be found
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
in Goldberg (), Fischer and Stefanowitsch (), Croft (a), Ziem and Lasch (), Hilpert (), or Fried (). In order to offer the reader a first orientation in the research landscape, this section will focus on ideas that are widely shared amongst different constructional approaches. Construction Grammar is a theory of what speakers know when they know a language. Its main goal is to describe the implicit, intuitive knowledge of language that is represented in speakers’ minds: What regularities and patterns do speakers have to know in order to produce and process ordinary language? With such an outlook, Construction Grammar has a lot in common with other mentalist approaches, notably Cognitive Grammar (see Chapter ) and Generative Grammar (Chapter ). What distinguishes Construction Grammar from many other mentalist approaches is one rather radical idea, namely the assumption that knowledge of language can be modelled exhaustively and exclusively as a network of symbolic form-meaning pairings (Goldberg a: ). These form-meaning pairings are referred to as constructions. Once the consequences of this idea are spelled out, it becomes apparent how different Construction Grammar really is from most other current theories of grammar. Three consequences are particularly worth considering. First, in Construction Grammar, all formal patterns in language, including words, idioms, morphological word formation processes, and abstract syntactic phrase structures, are seen as meaningful constructions. As I will explain more thoroughly in the next section, all constructions are defined as symbolic units, that is, pairings of form and meaning. This idea goes against the notion that only lexical elements have meaning, while the grammatical patterns that can be used to combine them would be exclusively formal, that is, not meaningful in themselves. Second, the idea that knowledge of language is organized as a conceptual network is directly opposed to modular theories of grammar (Sadock : ), which assume separate, encapsulated components of linguistic knowledge that handle semantic, syntactic, or phonological aspects of language on their own. If it is assumed that language processing involves constructions, which by definition combine several formal and functional characteristics, processing of meaning and form happens in an integrated fashion. A third consequence of the constructional view of language is that there is no principled distinction between lexis and grammar. Knowledge of language is seen as consisting of form-meaning pairings, some of which have fixed, short forms and specific meanings (apple, slide, fast), whereas others are highly schematic in both form and meaning (wh-clefts, subject-to-object raising, left dislocation). The elements in the first category are commonly associated with the term ‘lexis’, those in the second category with the term ‘grammar’. On the constructional view, the two are seen as end points on a continuum. What motivates this view is that between monomorphemic lexical elements and complex clausal constructions, there is a vast range of patterns with varying degrees of schematicity and complexity, including idioms, word formation processes, and semi-fixed phrasal constructions. I will discuss all of these in more detail in this chapter.
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
As was suggested above, lexical elements are situated at the more concrete end of the lexis-grammar continuum, since their meanings tend to be highly specific and their phonological forms fully specified. Speakers’ knowledge of language of course includes a large number of individual lexical elements, but beyond that, it also comprises a sizable number of collocations and multi-word strings, such as thank you, happy new year, or Have I got news for you. These multi-word patterns are actually not entirely fixed, but can be varied in flexible ways. For instance, the expression happy new year can easily be extended into a happy, healthy, and successful new year. We thus have knowledge of schemas, rather than absolutely fixed strings. To flesh out this idea, an expression such as He went nuts instantiates a schematic pattern in which the verb go is paired with a subject and a predicate such as nuts to indicate that someone became very angry or displayed some other irrational behaviour. The meaning of go in this context is different from its usual, motion-related meaning (cf. He went home). Rather than encoding motion, the verb here encodes a change of state. This meaning, however, is not solely observable in He went nuts, but also in similar expressions, such as She’s gone insane or I’ll go crazy. Crucially, not all predicates can be used in this way: ?She’s gone impolite, ?I’ll go happy, or ?He went dirty may perhaps be interpreted as meaningful sentences of English, but they are not conventional ways of saying that someone turned impolite, happy, or dirty. What this means is that He went nuts is not just a fixed, idiomatic expression. Instead, the expression represents a construction that is schematic in three ways: it may accommodate different subjects, different forms of the verb go, and a range of different predicates that are related to the notions of confusion and agitation (nuts, mad, crazy, bonkers, wild, etc., cf. Bybee for an analysis of a similar construction). Speakers of English know this, and so a representation of what might be called the go crazy construction should be part of the constructional network that is meant to model the grammar of English. At the same time, it is clear that the go crazy construction is neither purely a matter of lexis, nor just a matter of syntactic rules. From the perspective of traditional grammar writing, the construction would be viewed as belonging to the periphery of grammar, rather than its core. From the view of Construction Grammar, the go crazy construction is situated closer towards the lexical end of the lexis-grammar continuum. What further motivates the continuum view is that also towards the more schematic end of the continuum, constructions can exhibit lexical preferences and idiosyncratic restrictions. An example for this would be English wh-questions with long-distance dependencies. Speakers of English know how to form wh-questions, and they are able to distinguish well-formed questions from ill-formed ones. To give an example, the question What book did you say we needed from the library? is grammatically well-formed, while the question *What book did you make the claim that we needed from the library? is not. The difference in grammaticality has been explained with regard to a formal syntactic criterion (Ross ). In the second question, the questioned constituent (the book) is part of a complex noun phrase (the claim that we needed the book from the library). 
The ungrammaticality of the question is related to that syntactic structure. Questioning an element from within a complex noun phrase
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
by means of putting it to the front of the sentence is barred by a syntactic constraint that forms part of the grammar of English. While this elegantly explains the difference in grammaticality between the examples above, formal syntactic factors such as the Complex Noun Phrase Constraint do not account for grammaticality judgement ratings that show differences between examples that are structurally identical. Dąbrowska (: ) observes that speakers consistently give worse ratings to Where did you swear the young man went after they found her? than to Where did you say they hid the treasure when they found out? These questions do not differ structurally, but they do differ with regard to the complement-taking verb that governs the questioned constituent. Under experimental conditions, questions with say or think receive better ratings than questions with swear or suspect, also when other variables are controlled for, as for example the pronominality of the subject. This suggests that even highly abstract syntactic patterns are connected to specific lexical elements. In a modular theory of grammar such connections are difficult to explain, whereas the network model of Construction Grammar offers a natural account. Even abstract constructions are meaningful, so that they will be associated with those lexical elements that are best compatible with those meanings. Again, the more general conclusion is that a strict separation of lexis and grammar is not possible, and a continuum view suggests itself as a viable alternative. Starting with early foundational studies such as Fillmore et al. (), Construction Grammarians have paid particular attention to structures that do not belong to the established ‘core’ of English grammar, but that rather fall somewhere in the middle of the lexis-grammar continuum. Examples include the let alone construction (I can’t imagine how to become a vegetarian, let alone a vegan), the comparative correlative construction (The more you read about it, the less you understand), the way-construction (John elbowed his way through the hallway), or the many a NOUN construction (I’ve waited many a day for this to happen). This perspective has been characterized as ‘the view from the periphery’ (Culicover and Jackendoff ). A recurring focus has been on the observation that constructions of this kind display formal and functional peculiarities that cannot be accounted for in terms of more general patterns. By now, a substantial body of work on such ‘unruly’ structures has given rise to the view that idiosyncrasies in grammatical constructions are in fact ubiquitous (cf. Hilpert and references therein). Speakers’ knowledge of grammar must therefore contain a large number of small-scale generalizations that capture the idiosyncrasies that are associated with grammatical patterns along the constructional continuum, from relatively specific ones such as the go crazy construction to highly abstract ones such as wh-questions.
. W ?
.................................................................................................................................. The last section outlined a fundamental assumption of Construction Grammar, namely that linguistic knowledge exclusively consists of a large network of constructions. A common reaction to this claim is the question of what a construction actually is.
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
Is everything a construction? If everything in language is a construction, is the term not entirely vacuous? Contrary to first impressions, the term ‘construction’ is defined very precisely (cf. Kay ), and the statement that ‘everything in language is a construction’ is in need of an important qualification. Starting with the latter, it is crucial to distinguish between the words and phrases that speakers utter, which are called constructs, and the more abstract constructions that underlie these constructs. A string of words such as our little farm on the prairie is not, in itself, a construction. Rather, the string and its parts instantiate a number of constructions, such as the noun phrase construction, the attributive adjective construction, and the prepositional phrase construction. As a rule, if a pattern of language use can be fully analysed in terms of more general patterns, that particular pattern of language use is not a construction. With that in mind, the notion of constructions can be approached in more positive terms. A very influential definition of the term ‘construction’ is the following (Goldberg : ): C is a CONSTRUCTION iffdef C is a form-meaning pair such that some aspect of Fi or some aspect of Si is not strictly predictable from C’s component parts or from other previously established constructions.1
This definition echoes several ideas that were touched on earlier in this chapter. First, it characterizes constructions as pairings of form (F) and meaning, here represented by an (S) for semantics. This is perhaps not so very different from definitions of constructions that are used in pedagogical settings, where students of English are introduced to the ‘passive construction’ or the ‘present progressive construction’. An idea that is rather implicit in the definition above is that constructions represent linguistic knowledge: a form-meaning pair is a connection in a speaker’s mind. To know a construction is to know that a certain form maps onto a certain meaning. Here, the definition departs from non-technical definitions, which are usually just concerned with linguistic structures, not necessarily with speakers’ knowledge of such structures. The most prominent idea in the definition, however, is that a construction is only to be seen as such if some aspect of its form or some aspect of its meaning is not strictly predictable. In plain words, something is a construction only if speakers have to memorize it as a linguistic unit in its own right. To make this more specific, an idiom such as to get carried away is a construction because its overall meaning ‘to be overcome by enthusiasm’ is not predictable from the meaning of its component words. A set phrase such as by and large likewise displays non-compositional meaning, but in addition, it is also formally unpredictable, since there is no more general pattern in the grammar of English that would license constituents that consist of a preposition, a conjunction, and an adjective. If a form-meaning pair displays non-compositional meanings and unpredictable formal characteristics, this means that it must be represented as such in speakers’ knowledge of language.
1
iffdef C = if and only if the definition of C.
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
While non-predictability is evidently a sufficient criterion for identifying a construction, many researchers in Construction Grammar no longer consider it to be a necessary one. The reason for this is that even fully transparent forms may be redundantly represented as nodes in the constructional network if they are used often enough so that speakers remember them as such. Phrases such as come under scrutiny or What took you so long? are fully interpretable, even when they are heard for the first time, so that they would not be seen as constructions under the above definition. Yet, experienced speakers of English have heard these phrases often enough to form distinct memories of them and to associate them with the contextual settings in which they are typically used. There is no convincing reason to exclude this kind of knowledge from the rest of speakers’ linguistic knowledge. In a later definition of constructions, Goldberg (a: ) therefore casts the net a little wider to include also fully transparent form-meaning pairs, as long as these occur frequently enough: Any linguistic pattern is recognized as a construction as long as some aspect of its form or function is not strictly predictable from its component parts or from other constructions recognized to exist. In addition, patterns are stored as constructions even if they are fully predictable as long as they occur with sufficient frequency.
At the time of writing this chapter, this second definition represents a relatively broad consensus in the Construction Grammar community. Constructions are those form-meaning mappings that speakers have memorized in their network of linguistic knowledge, be it for reasons of formal or functional non-predictability or because of frequent exposure. It goes without saying that this is by no means the only view that is held (cf. Langacker , Sag , or Kay , amongst others). Even with a working definition in place, it is, however, not always easy to identify constructions on the basis of natural linguistic data. It will therefore be useful to round out this section by going over a few strategies that can be used for this purpose. Hilpert (: ) suggests four heuristics that tie in with several ideas that have already been mentioned. The first, and perhaps easiest, strategy is to check whether an expression has formal characteristics that cannot be explained in terms of more general grammatical patterns. Whenever an expression can be said to be formally unusual or idiosyncratic, this is the case. An example would be the so-called big mess construction (Van Eynde ). In examples such as How big a mess is it? or If it’s that good a deal, you cannot refuse, an adjective precedes the determiner of an indefinite noun phrase. This particular order is not found elsewhere in the grammar of English and therefore motivates the recognition of a separate construction. The second heuristic is to look for non-compositional meanings. Idiomatic expressions such as get the ball rolling ‘initiate a process’, a smoking gun ‘incriminating evidence’, or weather the storm ‘master a difficult situation’ illustrate the phenomenon: the meanings of the parts do not correspond to the meaning of the whole expression. Yet, idioms are not the only constructions in which this can be observed. For example, the meaning of the English subject-to-object raising construction, which is exemplified
by sentences such as I need you to calm down, is non-compositional. At first glance, this may not be entirely obvious. For an experienced speaker of English, the immediate interpretation of this sentence is that the speaker asks the addressee to calm down (‘I need something, namely that you calm down’). However, the sentence is structurally ambiguous. A beginning learner of English might thus interpret the sentence differently, thinking that the speaker would like to calm down, which, for some reason, she or he can only do when the addressee is present (‘I need you here so that I can calm down’). How the respective roles of the participants in a raising construction are interpreted is thus a matter of convention that does not automatically follow from the meaning of the lexical words that occur in such a construction. Constructions can even be identified as such when they do not exhibit overt formal oddities or non-compositional meanings. A third strategy that can be used for this purpose is to check whether a given expression is subject to formal restrictions. Are there constraints on the use of an expression? Are there certain conditions under which a construction ‘does not work’? To give an example, the sentence Mary is a smarter lawyer than John seems to be fully compositional semantically and fully in line with canonical syntactic patterns of English. Yet, there is an unpredictable constraint on the construction that licenses the sentence. The predicate noun phrase of the sentence (a smarter lawyer than John) must be indefinite. With a definite determiner, the result *Mary is the smarter lawyer than John becomes ungrammatical. This suggests that the first, grammatical sentence instantiates a construction that is subject to an indefiniteness constraint. Even highly general and productive constructions can be shown to have constraints. For instance, Huddleston and Pullum (: ) note that English prepositional passives (This bed has been slept in, Her book was referred to) cannot be formed with certain prepositional verbs (*Boston was flown to next, *Some old letters were come across). Establishing that a construction is subject to constraints usually requires the analyst to use introspection and to construct minimal pairs of examples, which may be difficult to do for second language learners. Even for native speakers, this procedure is not error-proof, but it serves to generate hypotheses that can then be checked against naturally occurring data. The three strategies that have been discussed up to now are all concerned with discrete qualitative characteristics of constructions. Whether or not an expression is formally or semantically unpredictable, or subject to a constraint, is a question that can be answered categorically, with a ‘yes’ or a ‘no’. The fourth and last heuristic to be mentioned here is more gradual and probabilistic in nature. Importantly, speakers’ knowledge of language does not only allow them to discriminate between grammatical and ungrammatical instances of language use. A hallmark of linguistic competence that is just as significant is speakers’ ability to distinguish conventional, idiomatic language use from unusual, non-idiomatic expressions (Taylor ). More specifically, proficient speakers know what words and constructions typically occur together. For example, the sentences The results came under scrutiny and The results went under scrutiny are both grammatically well-formed and convey approximately the same
meaning, but when forced to choose, speakers of English will express a clear preference for the first one, as come under scrutiny is much more idiomatic than go under scrutiny. To illustrate the phenomenon in a different way, the English adjective grave, ‘serious, important’, is strongly associated with a range of nouns that convey an inherently negative meaning, such as danger, doubts, concern, mistake, or error. Yet, grave does not combine equally well with all nouns that have negative meanings. Non-idiomatic combinations such as grave shame or grave infection may be used under certain conditions, but statistically speaking, they are strongly dispreferred. Lexical collocations such as come under scrutiny are widely studied, not only within Construction Grammar. What a constructional perspective can add to the study of collocations is the observation that grammatical constructions also have associative ties to lexical elements. For example, the English modal auxiliary will and the English be going to construction exhibit collocational preferences for different sets of lexical verbs that occur with these constructions (Hilpert : ). Compared to will, be going to has a greater tendency to select for agentive, intentional, and telic verbs. Collocational preferences of this kind have to be learned, and as such, they motivate the recognition of English will and be going to, in combination with an open slot for a lexical verb in the infinitive, as constructions in the technical sense. To sum up this section, constructions can be defined as form-meaning mappings that are organized in a network of linguistic knowledge. Constructions can be identified as such on the basis of non-predictable formal or functional criteria, idiosyncratic constraints on their usage, or associative ties to other constructions. With all of these theoretical preliminaries in place, the next section will proceed to offer a brief survey of constructions from different domains of the grammar of English, with a commentary on how they can be analysed from the perspective of Construction Grammar.
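The working definition just summarized can be made concrete with a small illustration. The following sketch is not part of the chapter's argument; the entries and the frequency threshold are invented, and it simply shows, in Python, what it means to treat constructions as form-meaning pairs that are stored either because they are unpredictable or because they are frequent.

```python
# Illustrative toy only: constructions as form-meaning pairs in a network.
# All entries and the frequency threshold are invented for demonstration.
from dataclasses import dataclass

@dataclass
class Construction:
    name: str
    form: str             # fixed or partly schematic form
    meaning: str           # conventionalized meaning or function
    predictable: bool      # fully derivable from other constructions?
    frequency: int         # token frequency in some reference corpus

def is_stored(cx: Construction, threshold: int = 1000) -> bool:
    """Goldberg's broader definition: stored if unpredictable OR frequent enough."""
    return (not cx.predictable) or cx.frequency >= threshold

network = [
    Construction("by and large", "by and large", "'in general'", False, 350),
    Construction("come under scrutiny", "come under scrutiny",
                 "'be examined critically'", True, 4200),
    Construction("our little farm", "our little farm on the prairie",
                 "compositional NP meaning", True, 2),
]

for cx in network:
    print(f"{cx.name!r}: stored as a construction? {is_stored(cx)}")
```

On these invented figures, the fully transparent but frequent phrase is stored alongside the idiom, while the one-off compositional string is not, which is exactly the cut the two definitions draw.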
. Constructions in English grammar
..................................................................................................................................
.. Argument structure: the ditransitive construction One of the foundational texts of Construction Grammar is Adele Goldberg’s study of argument structure constructions (Goldberg ), which presents analyses of the caused motion construction (John kicked the ball over the fence), the resultative construction (Mary worried herself sick), the way-construction (Bob cheated his way into law school) and, as I will discuss in more detail in this section, the ditransitive construction (Frank poured me a drink). The term ‘argument structure’ (cf. Chapter ) typically refers to a relationship that holds between a verb and its arguments, which are the participants that are projected by a given verb. In a sentence such as John ate a muffin, the verb form ate is the predicate, and the remaining parts of the sentence
(John, a muffin) are its arguments. Traditionally, argument structure is understood as a lexical property of verbs, that is, a given verb is associated with a specific set of arguments. A verb such as eat is thus considered to be a transitive verb, since it regularly takes a direct object, as in the sentence above. Typically however, verbs are associated with several argument structure patterns. The verb eat, for example, is often also used without an object (Let’s eat!), in which case it would be used intransitively. What motivates a constructional account of argument structure is that speakers combine verbs and arguments in ways that cannot be just a matter of established convention. Sometimes verbs are used in new and creative ways. A newspaper might report on a snake that ate itself to death. What this example conveys is that a snake died, presumably as a result of choosing an unsuitably large prey. This interpretation does not follow from the verb’s semantics, and it is in direct conflict with the conventionalized transitive argument structure of eat, which would lead us to expect that the pronoun itself denotes the participant that was eaten. What Goldberg () proposes in order to explain the facility with which readers understand such creative instances of language use is that the overall meaning of the sentence is directly associated with its syntactic structure. Combining a subject with a verb, a reflexive pronoun, and a phrase such as to death yields a formal pattern that Goldberg calls the English resultative construction. This construction can convey the idea that some action brought about a result, regardless of the specific verb that denotes that action. This analysis entails an important shift in perspective, away from the idea that only verbs can select their arguments, and towards the idea that constructions also carry meaning, select arguments, and thereby alter established argument structure patterns. The following paragraphs expand on this idea by presenting Goldberg’s analysis of the ditransitive construction in more detail. The English ditransitive construction combines a verb with a subject and two objects. For this reason, the construction is often called the double object construction. It is exemplified by sentences such as My brother sent me a letter or John played me one of his new songs. Goldberg characterizes the prototypical meaning of the ditransitive construction as a ‘transfer between a volitional agent and a willing recipient’ (: ). It is not hard to find apparent counterexamples to this prototypical meaning, as for example He gave me the flu (which has a recipient that is definitely not willing) or The smoke gives the meat an intense flavour (which does not have a volitional agent). Goldberg explains examples of this kind with reference to a conceptual metaphor that presents causal chains as agentive transfers (: ). Ditransitives that verbalize such metaphorical transfers can thus deviate from the prototype. The notions of agent and recipient map onto the different arguments of the construction. The subject argument is understood to be the volitional agent that initiates the transfer. The first object is the willing recipient, who accepts the object of transfer that is expressed in the second object. For a verb such as send, which lexically encodes the idea of a transfer, the ditransitive construction is a highly conventionalized argument structure pattern with high relative frequency. 
When speakers use the verb send, it is quite likely that they use it in the context of a ditransitive sentence. The verb play, by
contrast, does not inherently denote transfers and is not as strongly associated with ditransitive argument structure. Its occurrence in the ditransitive construction is perfectly acceptable, but already shows that the construction can make a significant semantic contribution to the overall meaning of the sentence. As a rule, that contribution becomes more substantial the less a verb is associated with the meaning of a transfer. The verb sketch, for instance, does not, by itself, evoke the idea of a transfer, and it is rarely used in the ditransitive construction. Yet, in a sentence such as I will sketch you a view of the river while you read Wordsworth to me (which is an example of authentic language use taken from the COHA corpus), it does occur in the construction, and the overall interpretation is that the speaker aims to transfer a visual impression to the hearer. Unless we assume an ad-hoc sense of sketch along the lines of ‘draw something in order to show it to someone else’, the overall meaning of the sentence does not follow from the meanings of the individual words. In other words, unless we posit a meaningful ditransitive construction, we cannot explain how a hearer might spontaneously arrive at the correct interpretation. The previous section mentioned several heuristics that can be used for the detection of constructions. One strategy was that of looking for idiosyncratic constraints. The ditransitive construction is subject to a number of such constraints. For example, the unacceptability of a sentence such as *I brought the table a glass of water shows that if a recipient fails the criterion of being ‘willing to accept’, the ditransitive cannot be used. Crucially, if the sentence is changed in such a way that the table can metonymically refer to a group of willing recipients, as in I brought the table another round of frozen margaritas, the unacceptability disappears. Other constraints on the ditransitive concern specific groups of verbs. Several verbs that describe manners of speech, such as scream, murmur, whisper, or yodel (Goldberg : ), are statistically dispreferred, although examples such as I whispered her a secret are attested. More categorically, sentences such as *John explained me the theory or *Mary donated the Red Cross £ are ruled out, which has been linked to the Latinate morphology and prosody of these verbs (Gropen et al. ). Verbs such as obtain, procure, provide, or postulate have meanings that could be thought to fit rather well into the set of conventionalized ditransitive verbs. However, they are conspicuously absent from the construction. Herbst () thus rightly points out that ‘there is no guarantee that a particular lexical item with certain semantic characteristics will be able to occur in a particular valency pattern simply because other lexical items with the same characteristics do’. While the relative absence of Latinate verbs in the ditransitive construction cannot be motivated semantically, speakers might still notice it in synchronic usage and draw the conclusion that certain verbs do not work as well as others. Family resemblances in the sound and morphological structure of Latinate verbs might deter speakers from experimenting more freely with them in the ditransitive construction. In summary, the fact that speakers of English use and understand creative formations such as My doctor scribbled me a prescription suggests that English ditransitive argument structure is a syntactic pattern that is endowed with meaning.
The construction can thus be used to convey the idea of a transfer even when it is used with verbs
that are not themselves associated with that notion. A persistent problem in the study of argument structure constructions has been the issue of overgeneralization, that is, the problem that an analysis may predict the acceptability of examples that competent speakers would judge as deviant or even ungrammatical. Analyses in the tradition of Goldberg () usually posit a number of constraints that restrict how the construction can be used. Other attempts to come to terms with overgeneralization focus on constructions that are posited at lower levels of generalization (Boas ) or on valency constructions for individual verbs (Herbst ).
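To make the division of labour between verb and construction more tangible, here is a deliberately simplified sketch. It is not Goldberg's formalism; the verb classification and the phrasing of the constructional meaning are assumptions made purely for illustration of how the ditransitive pairing of form and meaning can supply a transfer reading that a verb like sketch does not contribute on its own.

```python
# Toy representation of the ditransitive construction (illustrative only).
ditransitive = {
    "form": ["Subj", "V", "Obj1", "Obj2"],
    "meaning": "X causes Y to receive Z",
    "roles": {"Subj": "agent", "Obj1": "recipient", "Obj2": "theme"},
}

# Assumed classification: does the verb lexically evoke a transfer?
verb_evokes_transfer = {"send": True, "give": True, "play": False, "sketch": False}

def interpret(verb: str, subj: str, obj1: str, obj2: str) -> str:
    """The construction supplies the transfer reading even for verbs
    that do not lexically encode one."""
    source = "verb" if verb_evokes_transfer.get(verb, False) else "construction"
    return (f"'{subj} {verb} {obj1} {obj2}': {subj} causes {obj1} to receive "
            f"{obj2} (transfer contributed by the {source})")

print(interpret("send", "my brother", "me", "a letter"))
print(interpret("sketch", "I", "you", "a view of the river"))
```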
.. Modality: modal auxiliary constructions There is a rich scholarly tradition that addresses modality and the English modal auxiliaries (cf. Chapter ). In comparison, modal auxiliaries have attracted relatively little attention in the more recent field of Construction Grammar. This is to be seen as a consequence of ‘the view from the periphery’ that much constructional work has adopted. The idiosyncrasies and non-predictable meanings that are found in patterns like the comparative correlative construction (The more you read about it, the less you understand) do not seem to characterize constructions such as the sequence of the modal auxiliary would and a lexical verb in the infinitive. If anything, modal auxiliaries as a group seem to be highly predictable in terms of their semantic and morphosyntactic behaviour: they differ from lexical verbs in terms of their morphosyntactic behaviour, but they share these differences. Hence, a sentence such as John will wash the dishes is both structurally predictable and semantically transparent. Modal auxiliaries always take non-finite verb phrases as complements, and the interpretation of the whole sentence can often be fully derived from the meanings of the individual words. Moreover, the behaviour of will is highly similar to the behaviour of would, must, shall, and the remaining core modal auxiliaries. As is well known, they exhibit a number of shared features that set them apart from lexical verbs (Huddleston and Pullum : ). These features include the ability to undergo inversion with the subject in questions (Can / Will / Would you do it?), the ability to occur with negative clitics (I can’t / won’t / wouldn’t do it), and the lack of person agreement, amongst other things. In short, while the manifest differences between modal auxiliaries and lexical verbs would, in the light of what has been argued above, be strong evidence for the assumption of a general modal auxiliary construction, it is less clear whether any one specific modal auxiliary would have to be represented in speakers’ minds as a construction in its own right, and if so, how that construction should be defined. An argument for positing individual modal auxiliary constructions follows from the empirical observation that each of the core modals is statistically associated with a distinct set of lexical collocates (Hilpert , , ). While the general morphosyntactic behaviour of two modal auxiliaries such as would and will is largely identical, so that speakers would not have to memorize it separately for would and will, the respective associative ties to particular lexical contexts are manifestly different.
A phrase such as I would like to thank you is highly idiomatic while I will like to thank you sounds decidedly odd. The shift in perspective that distinguishes a constructional account of modal auxiliaries from most other approaches is thus that modal auxiliary constructions not only include the auxiliary as such, but also a structural open slot for its infinitival complement and a large set of associative links to lexical verbs that typically occupy that slot. In other words, the focus is less on auxiliaries as items that display paradigmatic semantic contrasts, but rather on the syntagmatic relations that these items entertain. Cappelle and Depraetere () argue for an even more extended view that includes associations not only with the infinitival complements, but also with other collocating elements. For instance, the modal auxiliary can is associated with the idiomatic expression Not if I can help it, which conveys the meaning ‘I will try to prevent it’. Another piece of evidence that motivates such a network-based view is that over time, individual auxiliaries may change in their collocational preferences, so that some collocations fade and others become more entrenched. Hilpert () demonstrates how the recent collocational shifts of the modal auxiliary may can be linked to its gradual semantic change from a marker of deontic modality towards a marker of epistemic modality (cf. also Millar ). What this means is that a speaker of Present-Day English has internalized an associative collocational network that differs from the networks that speakers of earlier generations would have had. To conclude this section, the English modal auxiliaries are profitably viewed as constructions, even if they inherit most of their properties from broader generalizations and are thus much less idiosyncratic than the constructions that are discussed in the foundational studies of Construction Grammar. The main surplus value that Construction Grammar brings to the analysis of modal auxiliaries is the focus on the associative ties between modal auxiliaries and their lexical collocates.
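The collocational asymmetry between would like and will like can be expressed as a simple preference measure. The counts below are invented placeholders rather than corpus figures, and the log-ratio used here is only one of several possible association measures; the sketch merely illustrates the kind of quantitative evidence that such claims about modal constructions rest on.

```python
import math

# Hypothetical counts of verb lemmas in the slot after 'would' and 'will'.
# These numbers are invented for illustration, not taken from any corpus.
counts = {
    "would": {"like": 5200, "be": 9100, "wash": 40},
    "will":  {"like": 60,   "be": 8800, "wash": 150},
}

def preference(verb: str) -> float:
    """Log2 ratio of the verb's relative frequency after 'would' vs after 'will'.
    Positive values indicate a preference for the 'would' slot."""
    total_would = sum(counts["would"].values())
    total_will = sum(counts["will"].values())
    p_would = counts["would"][verb] / total_would
    p_will = counts["will"][verb] / total_will
    return math.log2(p_would / p_will)

for verb in ["like", "be", "wash"]:
    print(f"{verb:5s} preference for 'would ___': {preference(verb):+.2f}")
```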
.. Information packaging: it-clefts and wh-clefts An important function of syntax is known as information structuring (see Chapter ), or, as it is called in this chapter, information packaging. When speakers engage in a conversation, they not only strive to communicate certain pieces of information, they also try to present this information in a way that facilitates the hearer’s job of processing that information. In other words, speakers package the information of their utterance in such a way that it can easily be unpacked by the hearer. In order to meet the needs of the hearer, speakers design their utterances in accordance with a number of general processing-related principles. One of these is the Principle of End Focus (Quirk et al. : ). If an utterance involves both information that is old, that is, known to both interlocutors, and information that is new to the hearer, speakers will tend to present old information before new information, since the hearer can process information more effectively if an utterance contains a known starting point to which new information can then be connected. A closely related principle, the Principle of End Weight (Quirk et al. : ), biases speakers towards presenting short
constituents first, and heavier, more complex constituents only towards the end of the sentence. This principle, too, serves to minimize the hearer’s processing effort. Processing a sentence that begins with a long, complex constituent, without knowing the grammatical function that this constituent will eventually take, puts a significant strain on the hearer’s working memory (as this very sentence may have illustrated). These principles of information packaging inform discussions of linguistic structure across different theoretical frameworks, so it is a valid question to ask how a constructional view of information packaging (Leino ) might be different from other approaches. The short answer to that question is that in a constructional approach, researchers will analyse the syntactic patterns that speakers use for the purpose of information packaging as meaningful symbolic units, that is, as information packaging constructions. While other theoretical frameworks also investigate constructions such as wh-clefts (What I need is the key to the back door), left dislocation (My dad, he’s a little crazy), or topicalization (This shirt I actually bought second hand), constructional accounts try to analyse them as form-meaning pairings that are subject to particular pragmatic constraints. On the basis of ideas discussed by Lambrecht (: ), Hilpert (: ) defines information packaging constructions in terms of the following three characteristics: Information packaging constructions are sentence-level constructions [1] that speakers use to express complex meanings [2] in a way that shows awareness of the current knowledge of the hearer [3]. In contrast to argument structure constructions like the ditransitive construction, information packaging constructions are not inherently tied to a meaning like ‘transfer’. Rather, they allow speakers to signal to the hearer what information in their utterance is new, unexpected, or particularly relevant, given a prior context. Lambrecht () offers a comprehensive constructional account of cleft sentences that informs the following paragraphs. Cleft sentences such as What I need is the key to the back door (wh-cleft) or It is the key to the back door that I need (it-cleft) convey a proposition that can be expressed with a simple transitive construction, as in I need the key to the back door. The fact that speakers choose a more complex form suggests that when cleft sentences are used, they in fact communicate something more than just their propositional information. The added value of wh-clefts and it-clefts includes the following points. First, both types of cleft flag some information as new, and some as old. By uttering them, the speaker treats ‘the key to the back door’ as newsworthy, while presenting the fact that ‘something is needed’ as something that the hearer already knows, or can at least infer from the context. Which type of cleft a speaker will choose in a given situation depends on several factors (Lambrecht : ). One aspect is the relative length of the constituent that is presented as new, the so-called focus phrase. In accordance with the Principle of End Weight, a long focus phrase will bias speakers towards the wh-cleft, so that the focus phrase is positioned at the end. A second aspect concerns the
information that is presented in the relative clause constituent of the cleft sentences, i.e. ‘something is needed’. If this information is highly discourse-prominent to both interlocutors, for example because the speaker was just asked What do you need?, a wh-cleft is pragmatically the much more appropriate choice, since it presents highly prominent, topical information first, in line with the Principle of End Focus. By contrast, an it-cleft is preferred when the information in the relative clause can only be understood through the non-linguistic context. To illustrate, if the speaker, while searching through a large collection of keys, is being asked What are you doing with those keys?, an it-cleft (It is the key to the back door that I need) would be a much more fitting answer than a wh-cleft. A third feature that distinguishes wh-clefts and it-clefts is that some examples of the latter simply do not have wh-cleft counterparts. Sentences such as It was with relief that she heard the door close or It was in no time that they had finished their ice creams cannot be paraphrased in such a way that the result is a well-formed wh-cleft (Hilpert : ). Speakers of English have internalized the structural and pragmatic constraints on the use of cleft sentences and choose between them according to the demands of the communicative situation. This motivates the recognition of clefts, and other information packaging constructions, as conventionalized form-meaning pairings.
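The factors reviewed above can be caricatured as a rough decision procedure. The sketch below is an illustrative simplification of Lambrecht's account, with an arbitrary length threshold; it is meant only to make the interplay of End Weight and discourse prominence explicit, not to model speakers' actual choices.

```python
def choose_cleft(focus_phrase: str, presupposition_is_topical: bool) -> str:
    """Toy heuristic for choosing between a wh-cleft and an it-cleft.
    The word-count threshold (4) is arbitrary and purely illustrative."""
    long_focus = len(focus_phrase.split()) > 4
    if long_focus or presupposition_is_topical:
        # End Weight / End Focus favour placing the focus phrase last.
        return "wh-cleft"
    return "it-cleft"

# 'What do you need?' makes the presupposition topical, so a wh-cleft is expected.
print(choose_cleft("the key to the back door", presupposition_is_topical=True))
# Presupposition only inferable from the non-linguistic context: it-cleft expected.
print(choose_cleft("the key", presupposition_is_topical=False))
```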
.. Morphological constructions The introductory sections of this chapter presented Construction Grammar as a model of linguistic knowledge that takes the shape of a network in which symbolic units of varying schematicity and complexity are connected. It was discussed that the network contains lexical elements, idiomatic expressions, and more abstract syntactic patterns. Beyond that, the network also contains morphological constructions, that is, symbolic units that allow speakers to produce and understand complex words, such as cats, challenging, or printer. In constructional approaches to morphology (Booij , ), inflectional morphological rules (such as plural formation with -s) and derivational word formation processes (such as nominalization with -er) are re-thought as partly schematic form-meaning pairings. To give an example, a noun such as printer is seen as an instantiation of a construction that has both a form and a meaning. Formally, the construction consists of an open slot for a verb that combines with a fixed element, namely the nominalizing suffix -er. The meaning of the construction conveys the idea of someone or something actively involved in the process that is described by the verb (cf. Booij : ), so that a printer is understood as ‘something that prints’. What motivates a constructional approach to morphology? Several characteristics of constructions that were discussed in section . can be seen as well in word formation processes. Most importantly, morphological constructions are subject to formal and functional constraints, just like syntactic constructions. The fact that -er is known as a highly productive nominalization suffix of English should not be taken to mean that the verb-er construction is unconstrained. To describe someone who falls as a *faller is just as unconventional as calling something that exists an *exister. Typical uses of the
construction, such as leader, hunter, or speaker, involve verbs that are agentive. Verbs that are semantically too far removed from this prototype are not usable in the construction, which hints at the presence of a semantic constraint. Many morphological constructions also display formal constraints. In fact, the observation that the open slot in the verb-er construction does not readily accommodate nouns (*keyer) or adjectives (*expensiver) already constitutes a formal constraint. Constraints on form may also concern phonological aspects of the construction. For instance, verbs such as blacken, tighten, or broaden instantiate a construction that licenses de-adjectival causative verbs. The construction is subject to strict phonological constraints: only monosyllabic adjectives that end in an obstruent may be verbalized with the -en suffix (cf. Plag : ). Another shared characteristic of word formation processes and constructions in the sense of Construction Grammar is non-compositional meaning. This can be illustrated with the verb-er construction that was mentioned above. When the construction occurs with verbs such as burn, which describe spontaneous processes, the resultant form does not simply denote ‘something that burns’. The constructional meaning imposes an agentive or causative interpretation on the complex word, so that a burner is conventionally understood as ‘something that burns something else (usually fuel)’, not ‘something that burns (itself)’. This meaning cannot be attributed to either the verb or the suffix, but it emerges from the combination of the two. A third consideration relates back to the discussion of modal auxiliary constructions earlier in this section. There, it was argued that auxiliaries such as English would, followed by a verbal complement, can be seen as constructions because each individual modal auxiliary has a distinct profile of collocational preferences. The same argument can be applied to constructions that involve morphological marking, such as the English present progressive, which is a periphrastic construction that includes a form of the verb be and a lexical verb that takes the inflectional -ing suffix. Stefanowitsch and Gries () use the method of collexeme analysis to investigate the associative ties between the present progressive and the lexical verbs that occur in the construction, finding that verbs such as talk, work, sit, and wait are highly overrepresented. Constructions such as the progressive thus constitute complex networks with numerous associations between a schematic pattern and lexical elements. Taken together, the presence of unpredictable constraints, non-compositional meanings, and idiosyncratic collocational preferences motivates a view of inflectional and derivational morphology as cut from the same cloth as other constructions in the grammar of English.
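Collexeme analysis, mentioned above in connection with the progressive, rests on contingency tables of the following kind. The counts are invented for illustration; only the general shape of the computation, a one-tailed Fisher exact test on a 2-by-2 table (here via SciPy), corresponds to the method of Stefanowitsch and Gries.

```python
from scipy.stats import fisher_exact

# Invented illustration: how often does 'wait' occur in the progressive,
# compared with all other verbs, against their occurrences elsewhere?
in_progressive_wait = 800          # 'wait' tokens in the progressive
in_progressive_other = 49200       # all other verbs in the progressive
elsewhere_wait = 9200              # 'wait' tokens outside the progressive
elsewhere_other = 2940800          # all other verbs outside the progressive

table = [[in_progressive_wait, in_progressive_other],
         [elsewhere_wait, elsewhere_other]]

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
# An odds ratio well above 1 with a small p-value would mark 'wait'
# as a significantly attracted collexeme of the progressive construction.
```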
. Construction Grammar and other approaches
.................................................................................................................................. Based on the analyses that were presented in the last section, it is now time to consider the question of how Construction Grammar relates to other approaches to grammar. Two hallmark characteristics of Construction Grammar are particularly important
in this regard, namely its mentalist stance on the one hand, and its assumption that linguistic knowledge reduces to knowledge of constructions on the other. Given these assumptions, how does Construction Grammar fit into the wider research landscape? In comparisons of linguistic theories, a frequently made distinction is the contrast between so-called formalist and functionalist approaches (Newmeyer ). Assigning Construction Grammar to one of these categories is less than straightforward. Certain varieties, such as Sign-Based Construction Grammar (Sag ) and Fluid Construction Grammar (Van Trijp ), have developed sophisticated formalisms and computationally implemented models of speakers’ knowledge, which makes them, to all intents and purposes, ‘formalist’ theories. The work of Goldberg (, a), on the other hand, places great emphasis on functional and cognitive considerations and is consequently known as Cognitive Construction Grammar. What this suggests is that placing Construction Grammar in a wider context requires more complex criteria than just the formalist-functionalist divide. Working towards such a goal, Butler and Gonzálvez-García () present an overview of different functional and cognitive frameworks (cf. Chapter ) that includes several varieties of Construction Grammar next to theories such as Role and Reference Grammar (Van Valin ) and Functional Discourse Grammar (Hengeveld and Mackenzie ). Their overview is based on a fifty-eight-item questionnaire that was filled out by researchers practising in the respective frameworks. The questionnaire items concern the scope and coverage of the approaches, methodological practices, aspects of theory construction, as well as pedagogical and computational applications. Several results that emerge from the data reveal interesting differences between Construction Grammar and other theoretical frameworks. Among the issues that receive comparatively little attention in Construction Grammar, Butler and Gonzálvez-García (: ) list discourse and the properties of whole texts, the sociocultural context of language, paradigmatic relations, and the pursuit of theoretical parsimony (Sign-Based Construction Grammar is an exception with regard to the last item). Issues that are relatively prominent include the assumption that the language system is non-autonomous and non-modular, the network metaphor of linguistic knowledge, and the important role of cognitive factors (again, Sign-Based Construction Grammar does not align with regard to the latter). These observations, short as they are, offer a first indication of how Construction Grammar relates to its closest theoretical neighbours.
. Construction Grammar and language acquisition
.................................................................................................................................. The areas of first and second language acquisition are important testing grounds for linguistic theories. How well do the assumptions of a given theory align with empirical observations of learner data? The constructional view of linguistic knowledge goes along with the usage-based approach to language acquisition (Tomasello , Diessel ), which holds that language learning proceeds in a gradual, piecemeal fashion,
starting with the memorization of fixed chunks of language that gradually expand into more abstract generalizations. Research of this kind has inspired investigations of second language learners and their use of constructions (Ellis ). A general conclusion emerging from this work is that learners gradually build up constructional generalizations in their L2 by associating structural patterns with meanings, figuring out constraints, and internalizing collocational preferences. Importantly, this process is influenced by the learners’ L1, which guides their attention and induces transfer, i.e. the analysis of a phenomenon in the L2 in terms of structures that are already known from the L1. From the perspective of Construction Grammar, becoming a proficient L2 speaker is equivalent to learning to approximate native speakers’ usage of constructions, so that learners use the L2 not only correctly, but also idiomatically. Experimental results by Gries and Wulff () illustrate how this process works. In a sentence completion task, German learners of English were primed with sentence fragments such as The racing driver showed the helpful mechanic . . . (which strongly evokes the ditransitive construction) or fragments such as The racing driver showed the torn overall . . . (which is most aptly completed as a prepositional dative construction, i.e. The racing driver showed the torn overall to the helpful mechanic). After these primes, participants saw another sentence fragment containing just a subject and a verb, and they had to construct a completion of that fragment. The results show a significant priming effect. When learners have been recently exposed to a syntactic construction of their L2, they are likely to re-use that structure. The data further reveal that this priming effect is not the same for different verbs that were used as stimuli. Both the ditransitive construction and the prepositional dative construction are strongly associated with certain sets of verbs: show, give, and send are typical ditransitive verbs, whereas hand, sell, and post typically occur with the prepositional dative (Gries and Wulff : ). Crucially, L2 learners appear to know this. When primed with a ditransitive, learners are very likely to complete a sentence with send as a ditransitive. The same prime, followed by a fragment with sell, is less likely to elicit a ditransitive construction. This suggests that learners have been fine-tuning their knowledge of English argument structure constructions in such a way that they have learned a network of lexico-grammatical associations. Results like these suggest that Construction Grammar has implications even for the second language classroom. If a big part of L2 learning is the formation of associations between constructions and lexical items, then it becomes very relevant to ask how such associations are taught and learned most effectively (Madlener ).
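The interaction of prime type and verb that Gries and Wulff report can be illustrated with a toy tabulation. The response records below are fabricated and far smaller than any real data set; the sketch only shows how verb-specific completion rates would be computed.

```python
from collections import defaultdict

# Each fabricated record: (prime construction, verb in the completion fragment,
#                          construction the learner actually produced)
responses = [
    ("ditransitive", "send", "ditransitive"), ("ditransitive", "send", "ditransitive"),
    ("ditransitive", "send", "prep-dative"),  ("ditransitive", "sell", "prep-dative"),
    ("ditransitive", "sell", "ditransitive"), ("ditransitive", "sell", "prep-dative"),
    ("prep-dative", "send", "ditransitive"),  ("prep-dative", "sell", "prep-dative"),
]

counts = defaultdict(lambda: [0, 0])  # (prime, verb) -> [ditransitive, total]
for prime, verb, produced in responses:
    cell = counts[(prime, verb)]
    cell[1] += 1
    if produced == "ditransitive":
        cell[0] += 1

for (prime, verb), (ditr, total) in sorted(counts.items()):
    print(f"prime={prime:12s} verb={verb:4s} ditransitive rate = {ditr}/{total}")
```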
. Conclusion
.................................................................................................................................. It was the aim of this chapter to offer readers a first point of entry into the field of Construction Grammar in order to facilitate comparisons with other approaches and to provide an overview of the main ideas that characterize the constructional view of
language. This ‘view from the periphery’ strongly influences the choice of phenomena that are deemed worthy of study in Construction Grammar. Some of the constructions discussed in this chapter will undoubtedly fall outside the purview of grammar in other approaches that are discussed in this book. Likewise, some of the characteristics of constructions that were highlighted in this chapter, including matters such as collocational preferences or idiosyncratic constraints, will only be of limited interest in other theories. Yet, it would be short-sighted to characterize Construction Grammar merely in opposition to the remaining research landscape in linguistics. As the discussions of argument structure, modality, information packaging, inflection and derivation, as well as second language acquisition have shown, constructional analyses tend to produce results that are relevant even beyond the confines of a single theoretical framework.
......................................................................................................................
......................................................................................................................
. Introduction
.................................................................................................................................. Dependency theory is a structuralist model of syntax that differs from other structuralist models in that it does not establish hierarchies between the constituents of a sentence in terms of a part-whole relationship. The driving force behind dependency hierarchies comes from the insight that certain elements in a sentence can only occur if certain other elements are present. Thus, a definite article can only occur if there is also a noun, and an accusative noun can only occur if the sentence also contains a verb that requires or permits an accusative. In this sense, a verb governs subject and object in a sentence, which can be seen as dependents of the governor. The property of verbs to determine number and type of other elements occurring with them has become known as valency. The model of dependency grammar is principally associated with the name of Lucien Tesnière, whose Éléments de syntaxe structurale, published posthumously in , outlines the fundamental ideas of this approach. The fact that this book was published in an English translation as recently as can be taken as an indication of the importance of Tesnière’s ideas for present-day linguistics. And the fact that the first English translation of Tesnière’s work appeared more than half a century after the original and long after translations into German, Spanish, Italian, and Russian can perhaps also be taken as an indication of a somewhat delayed interest of English linguistics in the principles of a dependency-based approach to grammar.1
1 See also Mel’čuk (: ) for reasons for this relatively late interest in Tesnière’s work. For detailed outlines of specific aspects of dependency and valency theory and different approaches within these frameworks, I would like to refer readers to the volumes edited by Ágel et al. (, ) and by Gerdes, Hajičová, and Wanner (a). See also Klotz () for an extensive survey.
. Dependency representation of in the kitchen (Matthews : )
. Dependency tree diagram of in the kitchen (Matthews : )
What exactly is dependency? Matthews (: ) points out that ‘in the traditional language of grammarians, many constructions are described in terms of a subordination of one element to another’. So the term ‘govern’ has traditionally been used for verbs and prepositions with respect to their objects, for example, whereas the ‘term “modifies” implies the reverse’. These kinds of relations are dependency relations, which Matthews (: ) represents in two different formats (Figures .A and .B). Dependency is thus a co-occurrence relation in the sense that the occurrence of the dependent item presupposes the occurrence of the governor. Hudson (: ), referring to Haas (: ), defines it as follows:2 A dependency relation is a relation between two features where one of the features is present only when the other is present – in other words, when one depends for its presence in a structure on the presence of the other ( . . . )
Although, as pointed out before, the concept of dependency has not nearly been as popular in contemporary linguistics as that of phrase structure analysis, it has formed the basis for a number of different models over the years. Like many other frameworks, however, dependency approaches do not present one unified theory but differ in a number of important points (such as types and directions of the dependencies established). At the same time, they are united in making dependency a central element of the respective theories. Some of these differences will be outlined in section ..
2 Mel’čuk (: ) speaks of dependency as ‘a non-symmetrical relation, of the same type as implication: one element ‘presupposes’ in some sense the other, but—generally speaking—not vice versa.’ See also Hudson (b: ; Hudson a: – and –). For the arbitrary character of dependency relations see Engel (: ). Compare in particular different views on the dependency relations in the noun phrase (e.g. Hudson : – and b: and Mel’čuk : ). For different types of dependency see Engel ().
It is the aim of this chapter to present an outline of the basic principles of dependency models, in particular those that have been applied to English, but also to demonstrate how some of the central concepts of Tesnière’s thinking have been taken up or developed in a very similar manner in other approaches.3 The basic characteristics of Tesnière’s model will be outlined in section . and some more recent applications and modifications of the principles of dependency in section .. Section . will focus on one particular aspect of dependency grammar, valency theory, which has received considerable attention both in linguistic theory and its application to foreign language teaching and lexicography. Section . discusses dependency and valency in constructionist frameworks, and section . contains concluding remarks.
. Foundations: Tesnière’s model
..................................................................................................................................
.. The basics of the model Tesnière’s dependency grammar can be seen as one of four main schools of post-de Saussurean structuralism—alongside the Prague School, Hjelmslev’s glossematics, and American structuralism. Compared with the latter, Tesnière’s approach differs fundamentally in two respects:
• the role of semantics;
• the representation of hierarchical relations within the clause.
What makes Tesnière’s approach rather modern is that he explicitly sees a sentence such as ()
Wycliffe brooded.
as consisting of three elements—Wycliffe, brooded, and the ‘connection that unites them—without which there would be no sentence’ (Chapter §),4 which is similar to the role of constructions in Construction Grammar (see Hilpert, this volume). A further parallel to modern linguistics is provided by the fact that Tesnière explicitly addresses cognitive aspects (albeit without any experimental evidence) when he says that ‘[t]he mind perceives connections between a word and its neighbors’ (Chapter §).
3 Gerdes, Hajičová, and Wanner (b: x) state a new interest in dependency grammar in natural language processing.
4 Note that in references to Éléments indications of year and page refer to the English translation by Osborne & Kahane (Tesnière /), but chapter and paragraph numbers are also provided for the benefit of readers working with the French original or a translation into another language.
.. Sentence hierarchy Tesnière makes a distinction between the linear order and the structural order of a sentence and states ‘that speaking a language involves transforming structural order to linear order, and conversely, understanding a language involves transforming linear order to structural order’ (Tesnière : , Chapter §).5 Thus the linear sequence of Wycliffe brooded corresponds to a structural order that can be represented in the form of the stemma in Figure .:
. Stemma representing the structural order of ()
The verb form brooded is the governor on which the subordinate Wycliffe depends.6 The hierarchy established here is quite different from the one represented by phrase structure (or constituency) trees typical of American structuralism and generative transformational grammar (see Lohndal and Haegeman, this volume).7 This becomes immediately obvious when one compares the analyses provided by both models for a sentence such as (): () Cosmic rays include heavy particles. A phrase structure tree such as Figure .A shows a part-whole relationship, with the whole sentence (S) being the highest element in the hierarchy, which is divided into smaller constituents following the principle of binary divisions down to the so-called ultimate constituents (i.e. the constituents that cannot be split up into any further constituents). A dependency stemma, on the other hand, differs from this in a number of ways. Most importantly, it establishes a hierarchy of elements which, in a constituency analysis, would be at the same level of constituency. A stemma such as the one shown in (Figure .B) represents relations of dependency, in which every governor forms a node, which Tesnière (: , Chapter §) defines as ‘a set consisting of a governor and all of the subordinates that are directly or indirectly dependent on the governor’. Thus, for example, rays is the governor of cosmic whereas include is the
5 For the significance of this see Kahane and Osborne (: xii–xiii).
6 Terminologies vary; Matthews (: ) uses the terms ‘controller’ and ‘dependent’, for instance. When referring to Tesnière, I make use of the terminology employed in the English translation (Tesnière /).
7 For a comparison of the constituency and dependency approach see e.g. Matthews (: –).
. Phrase structure (constituent structure) of ()
. Dependency stemma of ()
governor of rays (and cosmic) as well as of particles (and heavy). Include represents the central node since it governs all other nodes in this sentence.8 Other accounts of dependency use arrows to represent these relations (Matthews : ), as in Figure .c:
. Dependency representation of ()
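The dependency analysis of example () can also be stated as a plain mapping from each word to its governor. The sketch below is an illustrative encoding, not part of Tesnière's or Matthews' notation; it recovers include as the central node and lists the dependents of each governor.

```python
from collections import defaultdict

# Illustrative encoding of 'Cosmic rays include heavy particles':
# each word is mapped to its governor; None marks the central node.
governor = {
    "cosmic": "rays",
    "rays": "include",
    "heavy": "particles",
    "particles": "include",
    "include": None,
}

# The central node is the one word that depends on nothing.
central = [word for word, head in governor.items() if head is None]
print("central node:", central[0])

# Dependents of each governor, i.e. the nodes of the stemma.
dependents = defaultdict(list)
for word, head in governor.items():
    if head is not None:
        dependents[head].append(word)
print(dict(dependents))
```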
.. Junction and transfer Apart from connection, two further concepts are central to Tesnière’s account of structural syntax—junction (jonction) and transfer (translation). Junction (i.e. coordination) is represented by horizontal lines. A sentence such as () can then be represented as in Figure . (Tesnière : , Chapter §):
8 For the distinction between node and nucleus see Tesnière (: Chapter ).
()
Alfred and Bernard work and Charles sings and laughs.
. Dependency stemma of () (Tesnière : , Chapter )
Junction is described by Tesnière (: , Chapter §–) as a quantitative phenomenon, whereas transfer is seen as a qualitative phenomenon in that it refers to a change in grammatical category (Tesnière : , Chapter ).9 Thus Pierre in phrases such as Pierre’s book or le livre de Pierre is envisaged as undergoing a change from a noun to an adjective. The reason Tesnière (: , Chapter §) gives for this analysis is that the noun Pierre, ‘though it is not morphologically an adjective, it acquires the syntactic characteristics of an adjective, that is, it acquires adjectival value’, drawing a parallel between le livre de Pierre and le livre rouge.
. Representation of transfer in the case of de Pierre and Pierre’s (see Tesnière : , Chapter )
This change is indicated in the stemma in Figure . by 's and de respectively: the category symbol above the line indicates the target category (A for adjective), whereas below the horizontal line the preposed (de) or postposed ('s) translative and the source (Pierre) are given, with the lower hook of the vertical line indicating the difference between preposed and postposed translatives.
.. Valency One of the most central and probably the most influential notions of Tesnière’s Éléments is that of valency—a concept that has since been taken over by many linguists, including many outside the dependency grammar framework.10 In his famous comparison of the verbal node to a theatrical performance, Tesnière (: , Chapter §, §, §) introduces the distinction between ‘process’, ‘actants’, and ‘circumstants’ which was to stimulate a great deal of research:
9 For a discussion of the word classes involved, junctives and translatives, see Tesnière (: , Chapter ).
10 See also Hudson (: ).
§ The verb expresses the process. . . . § The actants are the beings or things, of whatever sort these might be, that participate in the process, even as simple extras or in the most passive way. § Actants are always nouns or the equivalents of nouns. In return, nouns in principle always assume the function of actants in the sentence. § Circumstants express the circumstances of time, place, manner, etc. . . . § Circumstants are always adverbs (of time, of place, of manner, etc.) . . . or the equivalents of adverbs. In return, adverbs in principle always assume the function of circumstants.
The fact that verbs can occur with different numbers of actants (ranging from none to three) is described by Tesnière (: , Chapter §) in terms of another metaphor in his definition of valency. The verb may therefore be compared to a sort of atom, susceptible to attracting a greater or lesser number of actants, according to the number of bonds the verb has available to keep them as dependents. The number of bonds a verb has constitutes what we call the verb’s valency.
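Tesnière's atom metaphor translates naturally into a small valency lexicon that records how many actants a verb can bind. The entries and example checks below are illustrative assumptions, not drawn from any published valency dictionary.

```python
# Illustrative valency lexicon: verb -> number of actants (bonds) it can bind.
valency = {
    "rain": 0,     # avalent: It rains.
    "brood": 1,    # monovalent: Wycliffe brooded.
    "change": 2,   # divalent: She changed her mind.
    "give": 3,     # trivalent: She gave him the keys.
}

def within_valency(verb: str, actants: list) -> bool:
    """Check whether the number of actants stays within the verb's valency."""
    return len(actants) <= valency.get(verb, 0)

print(within_valency("change", ["she", "her mind"]))          # True
print(within_valency("brood", ["Wycliffe", "the problem"]))   # False (valency 1)
```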
The central role of verb valency for the structure of the sentence is acknowledged by the fact that the verb—provided there is one11—forms the central node of the sentence, which governs the actants and circumstants (shown in a Tesnièrian stemma from left to right but without marking the distinction in any way, see Figure .).
. Stemma for Then she changed her mind.
> If p then q)
b. Be nice to your neighbours or they will hate you. (p! or q >> If not-p then q)
The exclamation mark is meant to indicate that the relevant clause is an imperative.
The interpretations added in brackets to these constructional schemas show that the choice of coordinating conjunction gives rise to different interpretations (van der Auwera ). In contrast to conjunctive coordination, disjunctive coordination generally expresses undesirable consequences in the second coordinate. A further interesting problem related to this contrast is the question of whether or not imperatives keep their meaning and directive force in such complex constructions. Examples like () suggest that they do not. That example can hardly be understood as an invitation, request, or command to hit the speaker. The answer given to our question with regard to conjunctive coordination in Huddleston and Pullum (: –) is therefore a negative one and the relevant constructions are analysed as indirect speech acts. Given that illocutions are generally associated with main clauses rather than subordinate clauses, this analysis is prima facie the most adequate one. Such an analysis does not seem to be plausible for cases like (), however. In such cases, the directive force is obviously not backgrounded or invisible. The examples in () can be used to perform a directive speech act, albeit not necessarily one intended for immediate execution, such as a piece of general advice (cf. Huddleston and Pullum : –). The puzzle presented by the contradictory impressions of the two types of examples disappears once we realize that the first coordinates of sentences like () and () frequently allow an analysis both as reduced declaratives and as imperatives (cf. Russell ). In those cases where only an analysis as an imperative is possible, i.e. with overt subjects or with initial negation, the directive force generally associated with imperatives is maintained in the conditional coordination construction. Note that such sentences are felicitous only if the consequent denotes a desirable situation: ()
a. Don’t do the crime and you won’t do the time. b. Somebody help me carry this suitcase and I will buy you a drink later.
Analogously, whenever a bare (subjectless) VP-construction is a reduced finite clause (declarative or other), there is no directive force in strikingly similar constructions. In the following case, the use of negative polarity any more excludes an imperative interpretation and characterizes the relevant VP as a reduction of a finite clause:
() (You) Drink any more beer and you’ll be sick.
The descriptive generalization then is the following: imperatives in conditional coordination constructions always keep their directive force in combination with disjunctive coordination (or). In cases of coordination with and, the directive force is kept for imperatives with or without subjects, but it is not found in combination with bare VP antecedents, i.e. reduced versions of finite clauses that may have the same shape as imperatives.
. T
.................................................................................................................................. The view of two recent comprehensive grammars of English (Quirk et al. ; Huddleston and Pullum ) that only clauses that are introduced by the interrogative words what and how and do not manifest subject–auxiliary inversion should be analysed as instantiating the exclamative clause type in English is strongly supported by data from actual usage in corpora. On the basis of a thorough assessment of the data found in the British National Corpus (BNC) and the International Corpus of English, British Component (ICE-GB), Siemund () has shown that these two construction types are by far the most frequently used constructions for the expression of exclamations. Moreover, what is typically found in actual use are reduced clauses without verbs (How awful! What a good idea!). There are of course other constructions for expressing exclamations or emotional reactions more generally, but at least some of those can be analysed as indirect speech acts: ()
a. Isn’t this wonderful! b. That I should live to see this!/That my own son should be capable of doing such a thing! c. That I should finally be able to live a life in freedom! d. Mary is such a wonderful person! She is so helpful!
Our first example is a negated interrogative and thus a biased question, which expects its positive declarative counterpart as an answer. Whether it is used as an exclamation depends on the property under discussion. The choice of rather mediocre in (a) would not permit such an interpretation. Insubordination constructions like (b–c), i.e. the use of subordinate clauses as main clauses (cf. Evans ), express positive or negative emotional reactions, but do not have the degree component found to be characteristic of exclamatives above. Finally, both so and such are used as intensifiers, but assert a quality, rather than expressing surprise at an unexpected degree of a property. Examples like () and (), however, cannot be analysed as indirect speech acts. They do have the essential degree component commented on in an exclamation and are typically used as exclamations, but lack the formal properties regarded as characteristic of exclamatives above. To do justice to their specific formal properties (NPs with relative clauses) and their typical use as exclamations, it makes most sense to analyse them as a ‘minor clause type’ or a non-sentential construction in the sense of Culicover and Jackendoff (: ff.), rather than subsume them under the category of exclamatives (cf. Rett ). ()
a. The speed they drive on the freeway! b. The places John has visited! c. The people he knows!
()
(Boy), does John bake delicious desserts!
Constructions like (), with a focal stress on either boy or the auxiliary verb (verum focus), cannot be used as questions, unless they repeat a previous unanswered question, and so an analysis as an indirect speech act, parallel to that of their negated counterparts, is excluded. They presuppose the content of their declarative counterpart and express surprise at an unexpected degree. Again, they are therefore best analysed as instances of a minor clause type, associated with use as exclamations.
. A
.................................................................................................................................. During the last thirty years or so, contrastive and typological studies (Sadock and Zwicky ; König and Siemund , ) have identified many parameters of cross-linguistic variation, and locating English on these parameters has revealed insights into the structure of this language not available to traditional grammar writing. In the domain under discussion the following features can be described as characteristic properties of English. Like all Germanic languages, English differentiates declaratives from interrogatives through constituent order. In contrast to other Germanic languages, it is, however, always a finite auxiliary verb that precedes the subject of the clause. The ‘dummy’ verb do is introduced in those cases where the corresponding declarative does not contain another auxiliary. Interrogative tags are special in (Standard) English, since they are not fixed phrases or words, like n’est-ce pas or pas vrai in French or right in standard varieties and innit in non-standard varieties of English, but differ in their form depending on the form of the preceding declarative or imperative clause. Open interrogatives are characterized by an interrogative pro-form or phrase in initial position (wh-movement). Multiple fronting of such pro-forms or phrases, as in Slavic languages, is not admissible. In sentences with several interrogative pro-forms it is the subject that has to be moved to the front (‘superiority effect’: Who said what to whom?). These properties mean that interrogatives in English are recognized early as questions, in contrast to languages with final interrogative particles and interrogative pro-forms in situ (e.g., Mandarin Chinese). As far as the inventory of interrogative words is concerned, a remarkable fact is that English lost the distinction between interrogative pro-forms for location (where) and direction (Early Modern English whither, whence). Moreover, English has no interrogative pro-form for ordinal numbers (*the how-manieth) and, like all other European languages, lacks an interrogative pro-form for verbs (cf. Hagège ). In the domain of extraction, English manifests looser restrictions than many other Germanic and European languages, since an initial interrogative pro-form, or relative pronoun for that matter, may relate to a gap more deeply embedded than the next following (non-finite
or finite) clause (cf. Hawkins ; Borsley, this volume), which is generally excluded in languages like German: ()
What_i did you ask me [to try [to repair __i]]?
The exclusive use of the interrogative pro-forms how and what in exclamatives is semantically motivated, since they relate to manner, degree, and quality, just like the analogous former demonstratives so (< swa) and such (< swilc), which have largely lost their gestural use and developed a use as specific type intensifiers or boosters denoting a high degree of a property (John is such a charming person! He is so considerate!). The inventory of imperative constructions in English is very small, restricted as it is to the second person, and only with the grammaticalization of additional lexical expressions (let) has English extended the relevant paradigm to first-person plural imperatives (‘hortatives’). In contrast to other Germanic languages, such as German, English allows the introduction of imperatives by adverbials, so that the verb can occur in second position (Slowly shift into lower gear).
Acknowledgements
I am much indebted to Rodney Huddleston, Gunther Kaltenböck, an anonymous reviewer, and the editors of this volume for critical comments and helpful suggestions on an earlier draft of this chapter. I am fully responsible for all errors that remain.
. Introduction: Aktionsart, aspect, and tense
.................................................................................................................................. Consider the following paragraph: ()
Here’s what happened: I left home around 1 pm yesterday, after I had had lunch. As I was walking down the street I ran into an old friend, whom I have always remembered very fondly. He went to live in Australia after college so I was eager to catch up. He told me he was working for an insurance company, which was located just round the corner from where I live. He was also writing a book about his Australian experience. We had coffee and then parted. We promised we would keep in touch. I will definitely make sure I give him a call by next week. In fact, I’m dialling his number right now.
A first observation is that the situations as they are referred to by the speaker are different in nature: some of them hardly take up time (leave, run into someone); others are durative and have a clear temporal contour (have lunch). But leave and have lunch also share a feature, that is, both of them have intrinsic endpoints (they are said to be ‘telic’), unlike, for instance, work for an insurance company and be located round the corner. Even though we know that people can change jobs and that buildings can be removed, the way in which these two situations are referred to (lexicalized) in the sentence does not reveal anything about potential endpoints: in other words, there are no special markers indicating the presence of endpoints. At the same time, work for an insurance company and be located round the corner also differ in a crucial way: while working requires input of energy, being located somewhere does not. More technically, the former is a dynamic situation and the latter is a state.
These distinctions are just some of the ones discussed under the heading of ‘Aktionsart’, ‘actionality’ or ‘situation types’ in the literature. The standard classification, which has inspired many variants and finer-grained taxonomies, is based on work by Vendler (), who distinguishes Achievements (run into someone), Accomplishments (have lunch), Activities (work), and States (be located somewhere). As will be discussed in subsequent sections, States are different from all other situation types in that they involve no transmission of energy (and thus, no agency) and no internal development (and thus, no endpoints); all other situation types are dynamic, in that there is transmission of energy and possibly change, but differ along the parameters of duration and the availability of endpoints. As will be clear from the examples, situation types do not merely relate to the verbs that are used, but crucially involve other parts of the verb phrase as well. Thus, although it is tempting to describe such distinctions as ‘lexical aspect’ (and contrast them with aspectual distinctions that are grammatically encoded, such as the progressive construction), it should be clear that situation type is not simply a lexical property of the verb: for example, the verb have may seem to be lexically specified as a State verb, but clearly it expresses a dynamic situation in a case like have lunch.
The speaker does not just choose to capture what happened in terms of specific situation types; she also communicates information concerning the perspective she takes on what happened: she may take a bird’s eye view and represent a situation in its entirety (I left home) or she may choose to zoom in on the middle of the situation (I’m dialling his number right now), leaving the beginning and the end of the situation out of focus. This distinction is said to be aspectual and it is grammaticalized in English through the use of a form of the auxiliary be combined with a present participle (-ing), which is called the progressive construction. The passage in () illustrates the effect of the progressive form, which is the only uncontroversial aspectual marker in English: the Activity work for an insurance company and the Accomplishment write a book are represented as ongoing (at some interval of time in the past) and so is dialling his number at the moment of speech (MoS). In the latter two cases, an additional effect of the progressive is that the beginning and the endpoint inherent in the Accomplishment are out of focus. In other words, the effect of the progressive aspectual marker will be (slightly) different depending on the Aktionsart type it interacts with. When combined with a punctual situation (Achievement), for instance, the effect of the progressive will be to give a temporal contour to that situation, and in many cases this will mean that a repetitive interpretation is brought about (I’m leaving home at pm nowadays). The semantics of the progressive also constrains the situation type it can combine with. For instance, the progressive marker is normally not compatible with a State verb: one cannot say that a building ‘is being in the main street’. Ongoingness seems to require input of energy and internal development and hence reference to a dynamic situation, as will be further discussed in subsequent sections.
Apart from aspectual distinctions like ‘progressiveness’, verbs also convey temporal information about the time of the events as well as about the chronological order in which they occur. This kind of information is grammaticalized as
tense.1 Tense is most often used to locate a situation relative to the MoS and is thus considered a ‘deictic’ system: situations are generally described as prior to, coincident with, or following the MoS; this basic understanding of past, present, and future as points related deictically to the MoS will be obvious at various stages in the discussion below. Moreover, deixis has been used as a defining characteristic of tense, as opposed to aspect: although both categories are grammatical means for presenting situations, it is only tense that is a deictic category.2 The events in () are mainly located in the past and there are several forms that establish past time reference: there is the morphologically past inflection of the verb (as in happened or promised), as well as an adverb (for instance, yesterday). However, this is not the only way a situation is understood to have happened before now: the combination of have with a past participle in have remembered, i.e. a verbal periphrastic construction, likewise refers to a situation that happened before the time of speaking (even though it may continue up to the point of speaking—and beyond). Furthermore, the events do not happen in the linear order in which they are mentioned, but we know that lunch took place before leaving home thanks to the use of a slightly different verbal periphrastic construction, that is, the past perfect (had had lunch). In this case, the conjunction after also helps to establish the relationship of anteriority between the two situations. The forms commented upon in the previous paragraph are only some of the markers that impact on how time is formally represented in discourse. As briefly mentioned above, tense may be defined as the way that a grammar encodes the notion of time, but there are various complications. For example, although tenses are usually realized as bound inflectional morphemes added to the verb,3 as we just saw, very often periphrastic expressions may seem to have similar effects (see the relevant description in Huddleston and Pullum (: Chapters and ) and Spencer (this volume)). For similar reasons, it is rather a matter of debate whether free morphemes, such as will and shall in English, can be regarded as markers of tense, that is, markers of future time. Formulating an answer to this question involves issues other than morphological considerations (see section .. below and Ziegeler’s chapter in this volume), but morphology has an important role to play (see Spencer (this volume)). Likewise, the perfect periphrasis (a form of have and a past participle) has been considered as a tense by some (e.g. Huddleston and Pullum : describe it as a ‘secondary tense’
1 It is not the case that all languages grammaticalize this area in exactly the same fashion; for example, different languages may or may not have a distinct future marker; or they may have more than one systematic means for referring to the future; and so on and so forth. Moreover, there is a more technical sense in which grammaticalization research has been relevant to the analysis of the diachronic development of tense and aspect systems; the relevant literature spans across distinct theoretical frameworks and can be found in works such as Bybee et al. ; Hengeveld ; Roberts and Roussou : Chapter . 2 See, amongst others, Dahl (: ), who makes it clear that deixis can only be a definitional property of absolute tenses. The distinction between absolute (or deictic) and relative (or anaphoric) tenses will be relevant to the discussion below. 3 A statement like this might appear to ignore various complications (discussed in section ..), such as formations like ran, left, or went, where inflectional elements may be hard to identify.
distinction), but is analysed in terms of aspect by most scholars (see section ..). It follows from these observations that there is no unanimous view on the number of tenses—even if we restrict attention to the formal distinctions. The situation is far more complicated if semantic nuances are taken into account, as shown in the relevant debates (for example, contrast the discussion of future time reference in Comrie , Huddleston and Pullum : –, Salkie , and Ziegeler (this volume)). A standard view is that there are two tenses in English (the past tense and the present (or non-past) tense—e.g., Quirk et al. : –, Biber et al. : –, and Aarts : Chapter ).4 In what follows, we will address the three basic concepts introduced so far: those of Aktionsart, aspect, and tense. It should be obvious that the three categories have to be expressed in the same sentence and are thus expected to interact in systematic ways. It is also obvious that they can be approached from different perspectives, focusing either on their formal expression (e.g., morphological versus syntactic or inflectional versus periphrastic), or on their interpretation (in semantic and pragmatic terms). Moreover, a specific form is not uniquely associated with a specific function (for instance, the past does not necessarily express past time, as in I was wondering if you could give a hand) and one particular function may be realized by different forms (for instance, adverbs and conjunctions can also provide temporal information and thus affect the interpretation of a sentence). Besides, tense and aspect/Aktionsart have also been known to interact with neighbouring categories, such as voice, mood and modality, evidentiality etc. In what follows a host of relevant issues will be addressed in a rather selective manner, providing a general sketch of the variety of questions involved and referring the interested reader to other chapters in this volume and more detailed accounts such as Bybee et al. , Dahl , Declerck , Binnick . More specifically, the chapter is organized as follows: in the following section we will discuss the forms that are usually classified as markers of aspect and tense in English; in section . the discussion will focus on the interaction of the categories involved and the factors affecting their interpretation and finally section . will summarize the issues addressed.
. Aspect and tense in English
.................................................................................................................................. In this section we turn to the description of particular formal facts relating to the expression of aspect and tense in English. We will address some properties of progressive marking (..), past tense (..), the perfect (..), the conditional (..), the future (..), and a number of issues relating to the syntactic effects of tense and aspect (..). The discussion will identify a number of problems relating to the description and analysis of the relevant data.
4 But see Declerck et al. (), for example, who argue for eight different tenses.
.. Progressive aspect
Aspect is concerned with ‘the internal temporal constituency’ (Comrie : ) of a situation, in other words, with linguistic markers that focus on facets of the situation (such as the beginning, the middle, the end of a situation or the entire situation). The marking of progressive aspect is the one formal distinction which is unanimously regarded as an aspectual contrast in English (Comrie , Quirk et al. , Huddleston and Pullum , Aarts ). Moreover, the marking of the progressive is surprisingly regular, yielding a fully symmetrical paradigm of verbal forms which express tense and aspect distinctions (see Table .)—at least as far as the marking of the [+/−past] is regarded as a tense distinction and the marking of [+/−progressive] is regarded as an aspectual distinction.5 In effect, each one of the ‘simple’ combinations in Table . can be matched by a progressive counterpart; each pair is usually transparently named as ‘simple present’ and ‘present progressive’, ‘simple past’ and ‘past progressive’, and so on. In this sense it is always easy to distinguish between simple and progressive forms on the basis of presence or absence of the be+-ing marking. What is not easy in this case is to make a general statement that can capture the semantic content of the opposition: ‘The choice of viewpoint is syntactically obligatory for all clauses, so that the system is symmetrical from the syntactic point of view. However the symmetry is not complete at the semantic level’ (Smith : ). Crucially, as discussed in more detail in section . below, the interpretation of progressive marking interacts in significant ways both with tense and Aktionsart features in a sentence (see Binnick : –).
Table . Simple and progressive forms of the ‘tenses’
Tense                    Aspect: Simple             Aspect: Progressive
Present                  He writes                  He is writing
Past                     He wrote                   He was writing
Future                   He will write              He will be writing
Conditional              He would write             He would be writing
Present perfect          He has written             He has been writing
Past perfect             He had written             He had been writing
Future perfect           He will have written       He will have been writing
Conditional perfect      He would have written      He would have been writing
5 It should be noted that the forms listed are not unanimously regarded as tenses. For some the perfect combinations should be excluded; similarly, all modal combinations (involving either will or would) are often excluded; or, alternatively, the combinations of will are considered future tenses—but the combinations with would are not. Cf. the alternative analyses summarized in Salkie : –. Table . can be seen as an application of Declerck’s () Eight Tense analysis.
In many cases, i.e. in the clearest examples which also support the name ‘progressive’, the effect is that of presenting a situation as, quite literally, in progress and thus as the background for another event: ()
a. John was writing a letter when the telephone rang. b. The telephone was ringing when John entered the room. c. John was entering the room when the telephone rang.
Clearly, however, this backgrounding effect is not always available. For one thing, it is not always the case that a progressive situation is presented in relation to a non-progressive one, as in He was also writing a book about his Australian experience in () as well as the examples in () and (). In these examples, there is no clause in the sentence or immediate discourse in which a non-progressive tense is used with respect to which the progressive form is interpreted: ()
a. John was hoping to see you. b. Mary is visiting her grandparents at the moment. c. I have been waiting for some time.
While the progressive marking is best described as an aspectual category, one of its most obvious functions is its obligatory marking of non-stative verbs to effect a ‘true present’ interpretation: the situations in () below can only be interpreted as simultaneous with the MoS if marked for the progressive. ()
a. I am reading a book right now. (versus I read a book in the evenings.) b. She is walking to work. (versus She walks to work regularly.)
Thus, despite the expectations arising from its usual name, the simple present is not the best way to refer to a situation that is taking place at the time of speaking. Indeed, it is only stative verbs that can ordinarily appear in the simple form and still receive a truly present (i.e. coincident with the MoS) interpretation:6 ()
6
a. She is very old. (and not *She is being very old) b. He has three children. (and not *He is having three children7) c. She resembles her mother. (and not *She is resembling her mother)
There are arguably some exceptions to this generalization, which involve commentaries, etc. on which see the discussion below, regarding our examples in (9). Moreover, it is precisely in this sense that ordinarily needs to be understood: the present progressive of dynamic verbs is generally interpreted as true present, without any special genre requirements. 7 This might be acceptable with some different interpretation, despite the masculine gender; thus, the ungrammaticality concerns only the intended interpretation as a stative predicate, as also discussed in relation to the examples that follow.
OUP CORRECTED PROOF – FINAL, 22/10/2019, SPi
In other words, stative verbs are normally not marked for the progressive.8 If progressive marking is indeed possible, they often receive a marked, non-stative, interpretation: ()
a. She is having a baby. b. He is being stubborn.
These usually describe dynamic situations presented as currently in progress (or future): in (a) she is currently giving birth (or expecting to give birth) to a child and in (b) he is acting stubbornly. This pair contrasts with truly stative interpretations of the same sentences which are only available in the non-progressive versions in (): ()
a. She has a baby. b. He is stubborn.
A similar point is often made in relation to modal combinations: normally a modal + simple form of the infinitive cannot refer to the MoS (unless stative) and the effect of progressive marking of a dynamic verb in a modal combination can be comparable to that in a non-modal form. In Langacker’s (: –) words, modals are ‘aspectually transparent’: ()
a. She may jump. (simple dynamic: impossible to locate the jump at the MoS) b. She may be jumping. (progressive dynamic: MoS possible) c. She may be angry. (simple stative: MoS possible)
In other words, the examples in () behave in exactly the same manner that would be expected if the modals were absent: as in the examples in (), the non-progressive, simple form of dynamic verbs in the present cannot receive a MoS interpretation.9 By contrast, there is a well-defined set of special constructions and contexts that allow the use of the simple present of a dynamic verb to refer to the MoS. These include performatives (as in a), live reports (as in b), and demonstrations (as in c): ()
a. I promise I’ll do it. b. He shoots, he scores. c. I now add the eggs and give the mixture a soft beating.
8 Despite the validity of this statement in the general case, one must not ignore the fairly universal appearance of the progressive with stative verbs in the English of Southeast Asia, East Africa, and India, to name a few, as seen in the ICE-corpora and studies based on them; see, for example, Schilk and Hammel (). Siemund (this volume) also mentions extended use of the progressive with statives in non-standard regional varieties. 9 This restriction may be regarded as an instance of the cross-linguistically attested ‘present perfective paradox’, i.e. the impossibility of MoS interpretation of perfective presents—as discussed in Comrie , Dahl , Malchukov . As Langacker : puts it: ‘Location in the present (coincident with the time of speaking) is precluded for the same reason that true present-tense perfectives are normally infelicitous: it cannot be presumed that a perfective event has exactly the same duration as a speech event describing it.’
Conversely, there are instances of the progressive which seem to contradict its usual effect (of making reference to the MoS possible), as in the following examples, which need not necessarily describe a situation that is actually ongoing at the MoS: ()
I’m reading her latest novel. (= not necessarily right now)
It should be noted that the examples given in the initial passage in () are similar, i.e. ‘he was writing a book about his Australian experience’ doesn’t necessarily refer to the particular time of the speaker’s meeting in the past—but, instead, to an extended period around that point. Examples like () have been discussed in terms of temporal reference to an ‘extended now’, and obviously they could also involve an extended period in the past (as in ()).
She is flying to London tomorrow.
A corresponding effect would involve future-in-the-past as in (): ()
She said she was flying to London on the following day.
Overall, it should be obvious that a unified account of all the uses of the progressive clearly requires some abstraction, for instance by making reference to the absence of boundaries of the situations described by the progressive. The exact interpretation will then depend on the Aktionsart properties of the verb phrase which is marked for progressive, as discussed in section . below.10
10 A number of other uses of the progressive have also been noted in the literature, such as the Resultative Imperfective (Smith : ) in cases like Your drink is sitting on the table and The picture was hanging on the wall; the Interpretive Progressive (Aarts : ), in cases like Oh, you’re kidding and If Nick says he’ll repair the roof, he’s deceiving you; and the Progressive of Affect (Depraetere and Langford : ) as in My sister is constantly embarrassing me in front of my friends and also Sarah’s quite a mom. She’s always putting her kids first.
.. Past versus non-past
The ending -ed used for most regular verbs in English is often considered the inflectional marking of past tense. There are two problems with a definition of this type: first, the same ending is very often used to form the ‘past participle’ (i.e. not only in I talked but also in I have talked); secondly, past tense marking is not as regular as progressive marking, in that it is not always marked through the affixation of -ed, but may involve a number of alternative means (including sound changes as in took, ran, made, sent, or even zero-marking as in cut, or quite unpredictable (‘suppletive’) forms as in went, was/were, etc.). However, the distinction marked by the presence of a past marker (be it -ed or any corresponding change) is both obligatory and systematic; in this sense it is clearly a grammatical distinction which is probably best described as past-tense marking, on the basis of its prototypical effect. As argued in Spencer (Chapters and , this volume), -ed is a highly polyfunctional marker, and this calls into question the usual assumptions concerning inflectional categories. In the sentence: ()
We had coffee and then parted
there is no doubt that both events precede the MoS. In other words, what is effected by the presence of past-tense marking is the location of a situation at a time earlier than the MoS—a clearly deictic notion. However, there are also well known cases in which the effect of past marking is clearly not temporal—but instead may be described as ‘modal’: in these sentences the past tense forms do not mark past time. Instead, their use may be regarded as a means of marking ‘counterfactuality’: ()
a. I wish they were here now. b. If I were you, I wouldn’t do that.11
In (a) they are not here now (so were does not relate to a past situation, but only contrasts with a sentence that would assert that they are here now); and in (b) I am not you (so again there is no reference to the past—only a contrast between the nonfactual situation entertained, i.e. ‘I am you’, and the fact that ‘I am not you’). However, it has also been suggested (e.g. in Langacker : –) that the effect of -ed marking may be described as encoding ‘distance’: neutrally, this distance may be applied to a temporal past, when a past form is used to refer to a past event, but it may also be applied to a modal, unreal, potential universe (which could be present or future), whenever it is used modally. In any case, these examples constitute a problem for the general description, as the past tense forms do not always mark past time.12 The use of the past morphology in situations described as ‘Sequence of Tense’ (see also the morphological issues raised in Spencer’s chapter on inflection and derivation
11 It is not surprising, in view of similar examples, that the form were has been described as a ‘past subjunctive’ (e.g. in Quirk et al. : –), a term intended to capture both its past morphology and its modal meaning. In the case of the verb be only, one may argue for the assumption of a special subjunctive form (were for all persons, including first and third singular); all other verbs appear in the usual past form in this type of conditional construction. Huddleston and Pullum (: –) argue that this form is best termed irrealis. However, it is clear that the same form that is used in examples of this type is also regularly used to refer to past situations. See also Ziegeler (this volume) on Mood and modality. 12 We will not consider the problems arising from the homophonous ‘perfect participle’ and ‘passive participle’ forms—as discussed in Spencer (this volume); these pose interesting questions for the morphological analysis of a form like walked, caught, or fed considered in isolation; yet, in any sentential environment, it will always be clear whether any such form is meant to function as a finite (tensed) or non-finite (participial) form.
(this volume)) has also been used as an argument against the assumption of a truly temporal semantics of the past / non-past opposition; for example ()
We said we would be in touch.
In a case like this, would is the past form of will, which need not be interpreted as past in any deictic sense: it does not necessarily refer to past time. Its presence is only motivated by grammatical considerations, namely the fact that the ‘reporting’ verb is marked for past, and this triggers the shift of all reported verbs into a past form. Declerck () explains this observation as follows: would + V is a relative or anaphoric tense that expresses a relationship of posteriority between a past moment in time and another situation. The defining feature of anaphoric tenses is that they express a temporal relation between two moments in time, none of which is the MoS. In this sense, the ‘deictic centre’ (which may be described as ‘reference time’ or ‘time of orientation’) in this case is not the MoS, but the time specified by the main, embedding verb, i.e. the time at which ‘we said . . . ’. Thus, it is not surprising that these cases have been regarded as the past counterparts of pure futures: just as a future locates an event at a time later than the moment of speech, the effect of these forms is to locate an event at a time later than a past moment which serves as the base time (see e.g., Davidsen-Nielsen ). It is only the temporal relation of posteriority that is grammaticalized and, as a result, the temporal location of the situation of ‘being in touch’ is left vague: it could lie in the past (e.g., later that week), in the present (e.g., today), or in the future (e.g., next week) (see section ..).
.. The perfect
As observed in the introduction, views differ on whether the perfect is an aspect or a tense. The characterizing formal feature of the perfect in English is that it involves a (tensed or untensed) form of have combined with a past participle: has been, had been, will have been, would have been, and to have been / having been. It will be recalled that the very fact that there is no bound morpheme that expresses perfect meaning may be a first reason not to consider the perfect as a tense, at least in those approaches that insist on inflectional expression as a definitional property of tenses. In the temporal analysis of this periphrasis, the relationship of anteriority expressed by have is in the foreground. Indeed, it was the analysis of perfect constructions that led Hans Reichenbach () to the assumption of the Reference point, which needs to be present in their semantics: there is always a relationship of anteriority between some reference time (R) (which may be located in the past (had been), the present (have been) or the future (will have been)) and the time of the event (E). On the aspectual approach, the perfect is seen as representing a situation as completed and as a stativizing operator, that is, a form that implies reference to a (resulting) state in some way or another; consider, for example, textbook instances of ‘experiential perfects’, such as I have never been to America
(i.e. at the MoS I am still inexperienced) or the ‘perfect of result’, as in I haven’t had breakfast this morning (implying that, at the MoS, I am still hungry). In other words, it should be obvious that even on the aspectual approach, considerations of temporal structure are not completely out of the picture (see Ritz : –; for further discussion of the status of the perfect, see, e.g., McCoard , Declerck , Alexiadou et al. ). What is the difference between the past tense and the present perfect when referring to past situations? For example, the sentences below could both be true at the same time and can, arguably, describe exactly the same situation: ()
a. I had breakfast this morning. b. I have had breakfast this morning.
This could mean (a) that the present perfect is an alternative means of referring to (at least some) past situations and (b) that the distinction between present perfect and simple past is not one of tense, as they can both make reference to exactly the same temporal location. However, it has been argued that they are semantically different in that the present perfect, unlike the past tense, communicates current relevance (see e.g., Elsness : –). This semantic feature explains why the perfect is not compatible with adverbials that refer to a period of time that is perceived as lying completely before/being disconnected from the MoS:13
() *I have had breakfast at . a.m. / yesterday / last week / ten days ago / in .
By contrast, in a number of non-finite contexts (as briefly discussed in section . below), the perfect is the only means available to effect any kind of past time reference, as in the following cases—which can in fact be modified by deictic past adverbials:14
a. They must have seen her ten days ago. b. Having seen her only last week, I can say that she hasn’t changed in appearance. c. He seems to have seen her yesterday.
Any other kind of marking of tense on verbs is completely unavailable in all these contexts (e.g., *They must saw her then; *He seems to saw her yesterday). In this sense, the anteriority signalled by the perfect is not to be mistaken for deictic past time reference signalled by the simple past, as the apparent coincidence of the situations
13 In the system developed by Hans Reichenbach, the difference is captured through the representation of the perfect as E_R,S as opposed to the past (E,R_S). This can then be further developed to account for their syntactic and semantic differences, as in Comrie , Hornstein , Giorgi and Pianesi , and Borik .
14 Interestingly, Kearns (: –) regards examples like (a) as instances of the tense perfect while examples like (b) would be classified as instances of the aspectual perfect (Kearns : –).
described by the breakfast examples is precisely only apparent: the past tense version asserts the location of the event at a time earlier than the MoS, while the perfect asserts that a situation has taken place at a time earlier than the Reference time, which generally coincides with the MoS.
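Purely as a rough gloss, the contrast just described can be displayed in the Reichenbach-style notation of footnote 13, with E = event time, R = reference time, S = moment of speech, an underscore marking precedence and a comma simultaneity; the past-perfect line is the standard Reichenbachian configuration, added here for comparison rather than taken from the footnote:

\[
\begin{array}{lll}
\textit{I had breakfast} & \text{(simple past):} & E,R\_S \\
\textit{I have had breakfast} & \text{(present perfect):} & E\_R,S \\
\textit{I had had breakfast} & \text{(past perfect):} & E\_R\_S
\end{array}
\]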
.. The conditional: tense, modality, or mood?
The status of forms consisting of would + infinitive (simple, progressive, or perfect) has not always received due attention in discussions of tense and aspect, owing to the complicated semantics of modality and aspect that seem to be involved. Moreover, given the primarily deictic function of tense, relating situations to the MoS, these forms cannot always be seen to perform a temporal function, as they tend to relate situations to times other than the MoS and even to some hypothetical universe. Thus, a contrast may be made between a temporal (‘future-in-the-past’) and a non-temporal (‘modal’ or ‘counterfactual’) use of the same form, as in the following pair of examples: ()
a. We promised we would be in touch. b. We would be in touch if we had the time.
In (a) the being-in-touch situation is not located in time in relation to the MoS: temporal location is understood to be future in relation to the moment in time at which the promise was made: that moment is clearly (= deictically) marked as earlier than the MoS (through the past marking on promised) and thus the effect of ‘would+V’ is quite literally ‘future-in-the-past’. The being-in-touch is only located in the future of that past promise—but there is no indication as to whether the contact has already been effected (earlier than the MoS), or is currently under way (at the MoS) or is still to take place (later than the MoS). In other words, the ‘would+V’ construction does not locate the being-in-touch at any moment in relation to the MoS: it could precede, coincide with, or follow the MoS. In this sense it is ‘underspecified’ in terms of deixis—and this may be regarded as a good reason for not regarding it as a tense. However, this is not necessarily the conclusion to draw. As we saw in section .. above, there are many frameworks, most often in English Language Teaching, but also in a number of more theoretical approaches, that acknowledge the inclusion of would combinations in tense inventories (see Declerck et al. and Depraetere and Langford ).15
15 As also mentioned above (see section ..), Declerck and Declerck et al. distinguish between deictic (absolute) tenses, such as the past tense, and anaphoric (relative) tenses, such as the conditional. Whereas deictic tenses express a relation between two moments in time one of which is the MoS, anaphoric tenses express a relation between two moments in time none of which is the MoS. Declerck and Reed (: especially Chapters and ) discuss the use of various tenses in the case of conditionals in English.
It is perhaps even more common for this construction to appear in examples such as (b) above, where, again, the situation of being in touch is not related to the MoS directly; the most natural interpretation locates it as a potential outcome of the realization of the situation described in the conditional clause—in the present or future. The sentence may be a modalized prediction about a future occurrence of being in touch, on condition of a quite improbable present or future situation: if we had the time (right now or in the future), we would be in touch—but the speaker is not as certain (or as optimistic) as if she had uttered instead We will be in touch if we have the time. Compare also I would tell you if I found out the answer to I will tell you if I find out the answer, with the former suggesting I think it less likely that I will find out the answer. On the basis of examples such as (b), this form is often described as ‘conditional’— and this seems to suggest that it is not to be regarded as a tense but rather as a modal element (corresponding to forms like the conditional mood in French; moods are generally defined as verbal forms that realize modal meanings, such as counterfactuality, non-factuality, or hypotheticality—see Ziegeler, this volume). However, it has also been suggested on the basis of cross-linguistic evidence that conditional forms that seem to involve a combination of future and past markers are best treated as tenses, rather than moods (Thieroff ).
.. Does English have a future tense?
The status of the future tense in English and in general has been questioned, as there is a clear sense in which the future cannot be known in the same way as the present or the past: strictly speaking, one can only really know things that have happened or are happening—but not things that may be predicted to happen (see Giannakidou and references therein). However, speakers seem to be willing to treat the future as if it were part of known reality. Thus, there are ‘future indicative’ forms in the grammar of many languages—and speakers, although not epistemologically justified, are often seen to consider the future on a par with the present and past. In Comrie’s (: –) much quoted example, speakers of English will tend to consider a prediction like It will rain tomorrow as false (if it does not in fact rain), though not so in the case of a purely modal statement like It may rain tomorrow. In the case of English, there are obvious formal facts that set will apart and call into question its status as a marker of future tense. Firstly, will is not an inflection, unlike the past tense marker -ed (in the case of regular verbs like she talked, he walked, it worked, etc., as discussed previously in section ..) or the present inflection -s manifested in the third person singular present tense, as in she talks, he walks, it works, etc. Secondly, as shown in section .., it seems to have a past tense form would. If so, this means that will is a present tense form. In various frameworks, from structuralist considerations to systems of formal logic, it is difficult to accept future as a member of the same taxonomy as present and past, given the possibility of future+present (in the
case of will) and future+past (in the case of would).16 Obviously, these two objections are similar to those against treating the perfect as a tense in English—as discussed in section .. above. However, in the case of will, there are further objections to treating it as a future tense marker in English, namely that will is clearly a member of the (formally defined) set of modal verbs in English, and thus matches the expectations that modals should anyway relate to futurity as much as to irreality, as discussed in Ziegeler (this volume) among others. A further common objection to the recognition of will as a future tense marker relates to the various modal nuances that may be expressed by it, more so in cases where futurity seems to be absent (as in That’ll be the postman—see Ziegeler, this volume), but also in various cases in which futurity cannot be separated from other modal nuances signalled at the same time (as in volitional examples like We can’t find a publisher who will publish it or ‘characteristic behaviour’ examples as in Oil will float on water—see Ziegeler ). In other words, the objection has been two-fold: formally, will is not a (tense) inflection but a member of the set of modals; semantically, its contribution is not always temporal—it may express some modal notion (instead of or in addition to future time reference). Finally, it is often noted that will is only one of many means available for the expression of future time reference and it need not enjoy any more special treatment than all the other modals, or future periphrases like be going to and be to, or futurate uses of the simple and the progressive present,17 or even the imperative and the prohibitive, since they can all refer to the future in some way or other. Given that futures have been observed to have special status cross-linguistically,18 it is not surprising that it has been argued (most recently in Salkie ) that will is best treated as a future tense marker. Clearly, will is a member of the class of modal verbs on formal grounds. However, this does not necessarily mean that it cannot be treated as a future tense marker. Moreover, on semantic grounds, as noted in McCawley (: ), it can be contrasted with the rest of the modals in that it can have non-modal, pure future time reference as its sole meaning. Interestingly, Salkie (: ) reports that ‘Most corpus studies agree that the proportion of pure future time uses is over per cent’.
16 The same idea is captured by the assumption in various formal semantics approaches of an abstract modal woll which surfaces as will when combined with present and as would when combined with past (see, for example, Wurmbrand : and references therein). It should be clear, however, that this particular problem disappears if will is not regarded as a future tense marker: if will is not a member of a three-way tense distinction, then it can freely combine with both members of the binary opposition between [+/-past].
17 Relevant examples include We paint/are painting the fence tomorrow and When do lectures end this year? (discussed in Smith : – and Huddleston and Pullum : –, respectively); our example in () would also be considered an instance of the futurate; moreover, examples like our () are described as preterite futurate in Huddleston and Pullum (: ).
18 Futures have been reported to constitute the most marked member of many tense systems, especially in terms of their formal expression: they tend to favour periphrastic expression much more than past tenses (as noted in Dahl , Bybee et al. , Velupillai : ).
.. Syntactic tense and aspect
Moving to levels beyond the word and verbal morphology, questions that need to be addressed relate to the ways tense and aspect seem to affect the properties of sentences and larger texts. For example, it is not a trivial observation that sentence meaning can be affected in varying ways by tense and aspect marking. In some instances, there is a clear semantic contrast. Thus, for example, the following sentences would often not all be true at the same time: ()
a. My youngest brother is twenty-one (e.g., today) b. My youngest brother was twenty-one (e.g., last year) c. My youngest brother will be twenty-one (e.g., next year)
Moreover, the choice of tense will also make (im)possible the appearance of particular adverbials in a sentence: ()
a. *My youngest brother was twenty-one next year. b. *My youngest brother will be twenty-one last year.
In this sense, tense is an important part of sentential meaning with important consequences on syntactic restrictions. This is mostly, but not necessarily the case: the tense alternations in the following pairs do not seem to have significant effects, assuming the sentences are uttered today (and that, therefore, (a) is interpreted as a historical present): ()
a. She will speak at the Party convention tomorrow. b. She speaks at the Party convention tomorrow.
()
a. In January 2008 Cyprus joins the Eurozone. b. In January 2008 Cyprus joined the Eurozone.
Tense choice seems to have important consequences as illustrated in () and (); pairs like () and () are harder to find (and mostly require some discourse specification). By contrast, it is quite easy to find examples which seem to have the same interpretation, and which are true at the same time, although they are marked for opposing aspectual values: ()
a. He told me he was working for an insurance company. b. He told me he worked for an insurance company.
()
a. They lived in Athens at the time. b. They were living in Athens at the time.
()
a. I’ll send the message tomorrow. b. I’ll be sending the message tomorrow.
The existence of such pairs constitutes evidence for treating aspect as a more ‘subjective’ category: rather than affecting truth values, its contribution may even be at some expressive level, as when it suggests annoyance on the part of the speaker (see footnote on the Progressive of Affect): ()
He is always making his stupid jokes (versus He always makes his stupid jokes).
However, the most obvious effect of aspectual choices concerns what Smith calls the visibility of endpoints. She argues: ‘Aspectual viewpoints focus all or part of a situation; what is in focus has a special status, which I will call “visibility”. Only what is visible is asserted’ (Smith : ), which accounts for the following contrast (Smith : ): ()
a. Mary was walking to school but she didn’t actually get there. b. *Mary walked to school but she didn’t actually get there.
The importance of tense marking has been acknowledged in syntactic frameworks that have taken Tense to be a central part of sentence meaning, in fact even suggesting that Tense can be regarded as ‘the head of the sentence’ (an idea central in formal syntax; see Haegeman : Chapter ). The motivation is arguably rooted in a quite obvious semantic consideration: clearly, tense choice may affect truth values, as was seen above. Moreover, it is quite impossible to evaluate the truth of a sentence if it is not tensed: ‘Semantic assertability, or the property of being a potential truth-value bearer, is coded syntactically as what we call finiteness, which is realized as tense in English’ (Kearns : ). It is impossible to evaluate the truth or falsity of non-finite / untensed clauses: ()
a. To leave home around 1 p.m. . . . b. Leaving home around 1 p.m. . . .
Semantically, these can only be interpreted in relation to some tensed clause, which will provide the necessary connection with the MoS and thus the basis for evaluation. In effect, such non-finite clauses will normally be ungrammatical, unless they are part of a larger construction, which crucially would need to involve a finite verb form.19
19 An exception to this generalization would be those apparently main clause uses of infinitives, as in To think that he was once the most powerful man in the land! (Huddleston and Pullum : –); these are described as ‘clauses with the subordinate form’ and classified as one of their ‘minor clause types’. Their interpretation is described as ‘close to that of exclamatives’. Yet, as Huddleston and Pullum show, this structure is not available to infinitives only (but may also involve finite subordinate clauses, such as That it should have come to this!). Indeed, they suggest that all these would be best described as cases of ‘a subordinate clause but with the matrix frame omitted.’
Thus, syntactically, being non-finite entails that these clauses cannot stand on their own as main clauses. Moreover, they cannot license a nominative subject: ()
a. *We to leave home around 1 p.m. b. *They leaving home around 1 p.m.
The exact relationship of tense and finiteness cannot be discussed in great detail here— as there are various points of interest that would be relevant; see Nikolaeva . Moreover, as Huddleston and Pullum (: –) argue, in the case of English, ‘finite’ cannot be equated with ‘tensed’ in morphological terms, given the very small number of inflectionally distinct verb forms.20
. Tense, aspect, and Aktionsart
.................................................................................................................................. In this section, we will address situation types or Aktionsart, a type of aspectual information that we introduced in section ., and show how this feature impacts on the temporal interpretation of a clause (..). Further interaction between temporal and aspectual features will be addressed in section .. under the heading of coercion.
.. Situation types
This chapter started from the observation that the choice of specific linguistic resources to refer to a situation results in Aktionsart distinctions, the standard Vendlerian typology involving four types: Achievements, Accomplishments, Activities, and States. Table . sums up their defining features.
Table . Vendler’s Aktionsart classes and their defining features
                  Duration    Dynamicity    Inherent endpoint (telicity)
State                +            −                      −
Activity             +            +                      −
Accomplishment       +            +                      +
Achievement          −            +                      +
20 Even if one ignores morphology, the connection between tense and finiteness is also relevant to the temporal interpretation of infinitives—as in Stowell and Wurmbrand .
One of the problems related to the discussion of Aktionsart concerns the unavailability of markers dedicated to the expression of particular types; although the four traditional types can be precisely defined in terms of the interaction of semantic features such as [+/− durative], [+/− dynamic], and [+/− telic] (see Table .), it is not possible to identify any formal properties that will unambiguously diagnose a verb or verb phrase as expressing, say, an Accomplishment, an Achievement, a State, or an Activity. At best, some syntactic properties of particular types may be noted—but these will more often reflect tendencies rather than rules of absolute validity. For example, one may often come across the distinction between an Activity like walk in the park as opposed to an Accomplishment like walk to the park—or, equivalently, an Activity like sing as opposed to an Accomplishment like sing a song. In the case of Accomplishments the function of the PP to the park and of the NP a song is to specify the endpoints that may be reached to complete the situation (the notion of telicity, that is, reference to an inherent endpoint to the situation, being what distinguishes Activities from Accomplishments). Similarly, the kind of modification provided by adverbials like for an hour or in an hour will vary depending on the Aktionsart properties of situations (see Vendler ). Modification by for adverbials is only felicitous with situations that are considered [+durative], as shown in the difference between the following examples: ()
a. She looked for her keys for two hours/?? in two hours. b. She found her keys ??for two hours/in two hours.
Moreover, the interpretation of in adverbials is different depending on whether it combines with an Accomplishment or with an Achievement. The following examples are taken from Kearns (: ) who considers them ‘linguistic signs of aspectual class’, in that ‘with an accomplishment sentence the in adverbial modifies the duration of the event’ (as in a), while ‘with an achievement sentence an in adverbial is interpreted as stating the time which elapses before the event, which occurs at the end of the stated interval’: ()
a. He could eat a meat pie in seconds. b. He recognized her in a minute or so.
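The feature combinations in Table . and the adverbial diagnostics just illustrated can be summed up as a small decision procedure. The following sketch is purely illustrative: the feature-to-class mapping restates the Vendler typology as presented in this section, while the function names, the example glosses in the comments, and the simplified adverbial rule are our own assumptions rather than anything proposed in the works cited.

# Illustrative sketch (Python): Vendler's four situation types as
# combinations of the three binary features from the table above.
# Names and examples are hypothetical additions, for illustration only.

VENDLER_CLASSES = {
    # (durative, dynamic, telic) -> situation type
    (True,  False, False): "State",           # e.g. know the answer
    (True,  True,  False): "Activity",        # e.g. walk in the park
    (True,  True,  True):  "Accomplishment",  # e.g. walk to the park
    (False, True,  True):  "Achievement",     # e.g. find the keys
}

def situation_type(durative: bool, dynamic: bool, telic: bool) -> str:
    """Return the Vendler class for a given feature combination."""
    return VENDLER_CLASSES.get((durative, dynamic, telic), "unclassified")

def preferred_adverbial(durative: bool, telic: bool) -> str:
    """Simplified diagnostic: 'for an hour' favours durative atelic
    situations; 'in an hour' favours telic ones (event duration with
    Accomplishments, run-up time with Achievements)."""
    if telic:
        return "in an hour"
    if durative:
        return "for an hour"
    return "neither"

print(situation_type(True, True, True))    # Accomplishment ('walk to the park')
print(preferred_adverbial(True, False))    # for an hour ('looked for her keys')
print(preferred_adverbial(False, True))    # in an hour ('found her keys')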
Aktionsart distinctions have a crucial role to play in relation to Tense and Aspect marking, first of all, because they determine whether the progressive marker can be added. As we observed in the introduction and in section .., States cannot usually be used in the progressive. Moreover, the effect of the progressive is slightly different depending on the situation type that it combines with: the progressive with an Accomplishment leaves the beginning and the end of the situation out of focus (Jim was preparing dinner when I called), and the progressive combined with an Achievement results in a repetitive or habitual reading (Jim was kicking the ball against the wall). When combined with an Activity, the situation is represented as ongoing
(I’m enjoying every minute of this); the effects of similar interactions will be discussed further in section ..—but first we also need to note another interesting property of Aktionsart distinctions.
.. Situation types and movement of time The basic function of tenses is to locate situations in time but they also serve to give information about the temporal relation between situations: for instance, in We promised we would keep in touch, would makes it clear that the situation of keeping in touch (which may be located in the past, the present, or the future) is posterior to the situation of promising. In a similar way, the past perfect signals anteriority of a situation in relation to a reference point in the past (e.g., We said that it had been a pleasure meeting them). However, when the tenses do not grammaticalize a temporal relation of anteriority, simultaneity, or posteriority, it is the combined effect of Aktionsart and the progressive that determines whether there is temporal progression or not. Accomplishments and Achievements that are not in the progressive are understood as referring to situations that follow each other, whereas Activities and States are understood as being simultaneous with other situations. Consider again example (), repeated below for convenience as (): ()
Here’s what happened: I left home around pm yesterday, after I had had lunch. As I was walking down the street I ran into an old friend, whom I have always remembered very fondly. He went to live in Australia after college so I was eager to catch up. He told me he was working for an insurance company, which was located just round the corner from where I live. He was also writing a book about his Australian experience. We had coffee and then parted. We promised we would keep in touch. I will definitely make sure I give him a call next week. In fact, I’m dialling his number right now.
The situations of leaving home (Achievement) and running into someone (Achievement) will be understood as a sequence; so will those of having coffee (Accomplishment) and parting (Achievement). The situation of working for an insurance company (Activity) and telling (Accomplishment) seem to overlap in time, and there is likewise a relationship of simultaneity between was working for an insurance company (Activity) and was located round the corner (State). This principle of temporal progression has been addressed under different headings, for example the Temporal Discourse Interpretation Principle (Dowty ), the Principle of Unmarked Temporal Interpretation (Declerck ), or Narrative Mode (Smith ). It is important to bear in mind that temporal order is exclusively determined by Aktionsart categories only in case the tenses or other markers do not make the temporal relations explicit: it will be clear that temporal adverbs and conjunctions also give information about the temporal relation between situations. For instance, in () after tells us that having lunch happened before
leaving home. In this case the past perfect just confirms the relationship of anteriority. A final observation is that our world knowledge occasionally overrules the default interpretation that the Aktionsart types suggest in the absence of specific temporal relations being expressed by specific tenses. For instance, the Achievement ran into a friend is followed by the Accomplishment went to live in Australia. Here the unmarked temporal order is clearly overruled by our knowledge of the world: it is clear from the context that the Accomplishment precedes the Achievement.
.. Coercion We observed that it is only when Accomplishments and Achievements are in the non-progressive form that they bring about a sequential reading. The following examples illustrate the effect of the progressive marker: ()
I called Henry. He opened the door.
()
I called Henry. He was opening the door.
Scholars agree that the progressive has an impact on the temporal interpretation: in () there is a sequence ‘call Henry’—‘open the door’; in () calling Henry happens while he is opening the door. However, views differ on how this effect should be explained. One view is to say that the progressive ‘coerces’, i.e. transforms or changes the Accomplishment (or Achievement for that matter) into an Activity (Dowty ; Moens and Steedman ). An approach in terms of coercion will argue that in (), the Accomplishment has been coerced into an Activity and that therefore, there is a relationship of simultaneity between the situations as they are referred to in the sentence. In other words, on this approach, the progressive impacts on the situation type. Examples of this type have been used to support what Smith calls her ‘two-component theory of aspect’, which aims at accounting for the interaction between viewpoint aspect (as in English simple versus progressive) and Aktionsart, effecting what Smith calls ‘situation-type shift.’ Depraetere () and Declerck et al. () have argued that a distinction should be made between Aktionsart types, the definition of which crucially depends on inherent endpoints, and the concept of (un)boundedness, which is defined in terms of temporal boundaries. While there is an inherent endpoint to ‘open the door’, this inherent endpoint may be represented as having been reached (as in ()), through the use of the non-progressive, or not (as in ()), through the use of the progressive. In the former case, the situation is said to be bounded; in the latter case, it is not. But in both () and () there is an inherent endpoint to the situation of opening a door, and it is therefore an Accomplishment, irrespective of whether the progressive is used or not. On this approach, it is boundedness that crucially determines the unmarked temporal interpretation (bounded situations imply a sequence; unbounded situations communicate temporal overlap), rather than Aktionsart types.
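The difference between the coercion account and the (un)boundedness account can be restated as a small rule: on the latter view the telicity of ‘open the door’ stays constant, the progressive merely leaves the endpoint unreached, and it is boundedness that drives the default ordering of situations in discourse. The sketch below is a minimal illustration under these assumptions; the class and function names are hypothetical, not part of either proposal.

# Minimal sketch (Python) of the (un)boundedness view discussed above:
# telicity is an inherent property, the progressive leaves the endpoint
# unreached, and boundedness then fixes the unmarked temporal reading.

from dataclasses import dataclass

@dataclass
class Clause:
    text: str
    telic: bool          # inherent endpoint (Accomplishment/Achievement)
    progressive: bool    # progressive vs non-progressive viewpoint

    @property
    def bounded(self) -> bool:
        # A telic situation in the non-progressive is represented as
        # having reached its endpoint, hence bounded.
        return self.telic and not self.progressive

def default_relation(first: Clause, second: Clause) -> str:
    """Unmarked temporal interpretation when no tense or adverbial makes
    the relation explicit: bounded situations advance narrative time,
    unbounded ones overlap with the surrounding situations."""
    return "sequence" if first.bounded and second.bounded else "overlap"

called  = Clause("I called Henry", telic=True, progressive=False)
opened  = Clause("He opened the door", telic=True, progressive=False)
opening = Clause("He was opening the door", telic=True, progressive=True)

print(default_relation(called, opened))    # sequence (the simple-past example)
print(default_relation(called, opening))   # overlap  (the progressive example)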
In section .. we argued that it is only State verbs in the present tense that can get a genuine MoS interpretation (see examples in () and ()). In order to establish reference to the MoS in sentences with an Accomplishment or an Activity a progressive construction has to be used.21 ()
a. I am watching an episode of Friends. (versus I watch an episode of Friends— habitually, e.g., before I go to sleep.) b. My son is parking my car in the garage. (versus I park my car in the garage— habitually, e.g., when I get home)
Seen from a different perspective, sentences like () can be said to illustrate another type of coercion, triggered, this time, by (the present) tense: while ‘watch an episode of Friends’ is an Accomplishment, when combined with a present tense marker, it gets a habitual stative reading. While the sentence does not contain a State verb, it shares with a State the feature of not having an inherent endpoint; the habitual stative reading may be the result of modification of any verb type by a frequency adverbial (as in Smith’s examples: Tom is often in love and Dad plays bridge on Sundays), but, further, as Smith notes (: –), it also arises ‘in present tense event sentences’, when not marked for the progressive, e.g., in the case of Accomplishments and Achievements such as Mary feeds the cat and John opens the mail—or our examples in (). Note that the use of the past tense, while not excluding a habitual reading, does not necessarily coerce a stative reading. ()
a. I was watching an episode of Friends. (versus I watched an episode of Friends: habitual (stative) reading or non-habitual (Accomplishment) reading.) b. I was parking my car in the garage. (versus I parked my car in the garage: habitual (stative) reading or non-habitual (Accomplishment) reading.)
In other words, in view of the various interpretations of examples like () and (), it may be argued that tense can also be seen to affect Aktionsart types.
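The effect of tense described in this subsection can likewise be phrased as a small coercion rule: a non-progressive present tense applied to an event predicate pushes the clause towards a habitual, stative-like reading, whereas the past tense leaves both readings available. The sketch below is our own simplification for illustration; the labels are hypothetical.

# Illustrative coercion rule (Python) for the interaction of tense,
# progressive marking, and event predicates discussed above.
# Labels are hypothetical and deliberately coarse-grained.

def readings(eventive: bool, tense: str, progressive: bool) -> list:
    """Possible interpretations of a clause, in simplified form."""
    if not eventive:
        return ["stative (genuine MoS reference possible in the present)"]
    if progressive:
        return ["event in progress (MoS reference possible in the present)"]
    if tense == "present":
        # 'Mary feeds the cat', 'I watch an episode of Friends':
        # non-progressive present + event predicate -> habitual reading
        return ["habitual (stative-like)"]
    # 'I watched an episode of Friends': the past leaves both readings open
    return ["habitual (stative-like)", "single completed event"]

print(readings(eventive=True, tense="present", progressive=False))
print(readings(eventive=True, tense="past", progressive=False))
print(readings(eventive=True, tense="present", progressive=True))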
. Conclusion
.................................................................................................................................. Aspect, tense, and Aktionsart are grammaticalized to different degrees in English: the former two are linguistically expressed through inflectional markers that are added to the verb stem and periphrastic constructions which are also systematically available to all verbs. These are generally easy to identify (with the possible exception of irregular/suppletive forms) and our section . above has presented some of the debates concerning the exact status of a number of features: it may be easy to identify endings such as -s and -ed and constructions involving various auxiliaries, such as be, have, will, and would, but determining the exact contribution of each marker may be more difficult, making their classification a rather complex question. By contrast, Aktionsart is never marked overtly and depends not just on the verb, but also on a number of features available in the context in which the verb appears—most importantly within the verb phrase; for example, the type of NP complements (mass or count: He drank beer / He drank a beer) or the specification of (telic) endpoints through particular modifiers (as in We walked to the park / We walked in the park). This observation implies that both the morphological and the syntactic level need to be addressed—and we have seen a number of ways in which both levels are relevant to our discussion. Moving on to the semantics of tense, it may appear that this can be captured in a relatively straightforward way, as in a basic definition that refers to the grammaticalization of location in time and of temporal relations. Yet, even a less-than-detailed description of many tense markers and distinctions such as the one we attempted in section . can prove the inadequacy of any such definition: there is extensive polyfunctionality and numerous exceptions to any generalization. Likewise, the semantic characterization of aspectual constructions such as the progressive and, arguably, the perfect, needs to be highly schematic to aspire to any generality. Even then, as was seen in section ., Aktionsart, which was defined in relation to distinctions in terms of event type, interacts in important ways with all tense and aspect features—making certain combinations impossible and determining the special interpretation of others.
21 The same applies to Achievements generally, but often with two possible effects of getting extra temporal contour: ‘slow-motion’ (as in The bus is stopping) and repetition (as in He is kicking a ball.); see Depraetere and Langford : .
......................................................................................................................
......................................................................................................................
. Introduction
.................................................................................................................................. One of the most frequently disputed areas of research in the grammar of English is to be found in the study of English modality. Modality, according to Palmer (: ) is the semantics of mood, that is, the description of non-actualized/non-factual events and states in a language. We may characterize mood as the grammatical expression of modality, often presented as a contrast between inflected indicative versus inflected subjunctive verbal forms. Although much of the research that has covered the domain of modality in the past thirty or forty years has stemmed from research into the semantics of the English modal verbs, there is also considerable coverage of the grammatical expression of modality. Important studies may be traced back to Lyons () and a little later, Palmer (, ). Palmer published the first Cambridge standard textbook study on the semantics of modality, later republished in . Modality has continued to be studied from a semantic perspective for more than thirty years, for example Coates (), Horn (), Traugott (), Sweetser (), Warner (), Bybee et al. (), and Bybee and Fleischman () provide important accounts. Others, such as Nordlinger and Traugott (), Abraham (), van der Auwera and Plungian (), Goossens (), Traugott and Dasher (), Nuyts (, ), Frawley (), Fischer (), Ziegeler (, , , , a, ), Abraham and Leiss (), and Bowie et al. () illustrate more innovative perspectives on the topic. Novel perspectives are given also in Condoravdi and Kaufmann (), Gueron and Lecarme (), and Narrog (a), though some of these authors do not focus exclusively on the modality of English. Another significant contribution comes from the corpus study by Collins (). Although the semantic inclination of most of these studies allows little scope for considerations which can be said to be driven by a particular grammatical theory (apart from, in some cases, the diverging explanations of the diachronic development of the modals in English, see also Hundt, this volume), there is still scope for discussion of the grammatical aspects of modality.
The grammarian’s approach to modality often creates a dichotomy between grammar and semantics, something which is impossible to maintain in an accurate review of the current state-of-the-art of a topic in which the grammar and the semantics of modality are unavoidably integrated into a unified domain of research. It goes without saying that one of the functions of language is to describe what is known and actualized, and the mood used to describe factual modality (the indicative) is less often the subject of intensive research than that used to describe non-factual modality (the subjunctive). However, the description of non-actualized/non-factual events and states entails that we may be faced with the task of describing what is not there. Many languages employ different devices to express (non-factual) modality, and there is no universal ‘rule’ across languages to determine when the grammatical inflections of indicative and subjunctive mood may or may not be used. These inflections are often used to divide the semantic notions of fact from non-fact, but languages vary in deciding what is actual and what is not (see, for example, Bybee et al. () and Givón ()). In English, the term ‘modality’ may extend beyond the semantics of simply unreality and non-fact to refer to the broader notions of stance, evaluation, source of evidence, as well as the meanings covered by the modal verbs, such as ability, possibility, necessity, and future time reference. However, in all such cases, the grammatical form is not only used to express literal notions of ability, possibility, and similar. It is being used to infer the meanings of non-factivity, which are essential to all subjunctive modal expressions.1 In the present chapter, the approach will be to focus primarily on the semantics and categorization of mood and the modal verbs in English (section .). Section . will provide an account of the diachronic development of the modal verbs in English, comparing the standard grammaticalization studies with competing alternatives in other theoretical domains.2 The section will also discuss the development of marginal modals, or semi-modals. Section . will focus on the relatively recent notion of subjectivity and subjectification in relation to modal meanings, while section . summarizes the chapter.
. Categorization
..................................................................................................................................
1 For example, in a modal expression such as He might be at home, the absence of factual knowledge about the whereabouts of the subject is inferable from the use of the modal might.
2 Grammaticalization studies deal with the study of the diachronic evolution of a grammatical form from a lexical source. For a basic survey reference, see Hopper and Traugott () (see also Hundt, this volume).
.. Mood Mood has often been defined as the grammatical inflectional expression of modality (e.g., Palmer , Bybee et al. ), and in many accounts, mood is linked mainly to
the category of the indicative, the subjunctive, and the imperative. Palmer (: ), however, emphasizes that ‘“mood” is only one type of grammatical sub-category within a wider grammatical category’ with the term ‘modality’ being used for the wider category, which comprises mood and the modal system, i.e. the system of modal verbs in English.3 He goes on to maintain that mood entails a binary system—indicative or subjunctive, realis or irrealis, with the indicative (realis) being described as non-modal (e.g., (a)) and the subjunctive (irrealis) as modal (e.g., (b)). Allan () identifies mood as a clause type. Clause types are recognized as one of the linguistic expressions of modality also by authors like Huddleston and Pullum (), for example. Allan distinguishes the optative-subjunctive (a restricted category in English—see e.g., () below), the declarative (e.g., a), the imperative (as in commands such as Be quiet!), and the interrogative (which deals with questions—see Chapter for more discussion on clause types). However, as noted above, although these clause types can all be described under the categorization of ‘mood’, the adjective ‘modal’ according to Palmer (: ) can only apply to irrealis clauses, e.g., (b) below: ()
a. They are in the office. (non-modal/realis) b. They may/must/’ll be in the office. (modal/irrealis)
Lyons (: ) describes mood in terms of the grammatical manifestation of subjective modality and other kinds of expressive meaning.4 He also maintains that the modal verbs in English have been taking over the functions of the subjunctive mood since Old English times. The remains of the subjunctive in English are found in a few frozen expressions: ()
a. If I were you,
and the complements of hypothetical predicates, such as wish: b. I wish it weren’t so dark and gloomy here. The subjunctive may also be attributed to the verbs of subordinate clauses in mandative contexts:5 c. They insisted that the old system be changed immediately.
3 Palmer’s () definitions are changed in his () account, which describes modality as a grammatical category rather than a semantic category.
4 Subjectivity is discussed at length in section ., essentially as the involvement of the speaker in the modal (or other) expression.
5 See Hundt, this volume, for more discussion on these types of subjunctives.
in which the bare infinitive verb form survives today, particularly in the more formal dialects of US English (Trudgill and Hannah ). Siemund (: ) maintains that in such uses an uninflected form of the verb is simply given a subjunctive interpretation.6 In the Romance languages such as Spanish, Italian, and French, an inflectional subjunctive mood is still actively employed, and the modal verb system is less grammaticalized than in English (in that English has more modal verbs). Perhaps a more restrictive definition of the subjunctive mood is related to the etymology of the term as associated with subordinate clauses (Palmer (), amongst others, discusses such a relation), but the defining characteristic of non-assertion which is also associated with subordinate clauses obviously links such environments to those of imperative and interrogative clauses. It is also close to the definition proposed by Givón () in comparing the subjunctive with notions of irrealis mood in other languages such as creoles. The realis/irrealis distinction is used by some languages to express functions similar to those expressed by the indicative/subjunctive mood categories in Indo-European languages, generally, the distinctions between the expression of actualized versus non-actualized events. Like Allan () and Huddleston and Pullum (), Aarts () also comes to the conclusion that mood is a clause type, in reference to the subjunctive clause type in English. He discusses the debates over whether a subjunctive clause is finite or nonfinite, and concludes that the subjunctive mood is found in a particular clause type which contains neither a finite nor non-finite verb. He describes it as a form of gradient construction that exhibits some of the properties of finiteness but not all; for example, finite clauses have person and number agreement, but subjunctives do not (: ). It could be argued that in Old English, the subjunctive clause type did once have person and number agreement, and that what we are left with today are the vestiges of an ancient system of marking mood, which is overlapped by the grammaticalization of the modal verb system in all but a few remaining functions, as noted above. But Aarts rightly notes that it is still possible to recognize a subjunctive clause type in today’s English (: ), and it must also be conceded that knowledge of the diachronic progress of the function expressed by the subjunctive does not help a great deal in describing or defining the feature with any precision for pedagogical purposes today. The expression of modality in English, though, is not restricted to clause types, or even modal verbs. Modal adjectives and even modal adverbs (such as probably, possibly—see, e.g., Nuyts , Traugott ), as well as non-referential nouns (Ziegeler ),7 have been discussed as exponents of modal meanings in English. Other studies which have dealt intensively with the expression of modality through adjectives include Van linden et al. (), who look at the grammaticalization of the
6 Roberts (: n) interpreted the bare form as resulting from the possibility that the complement contains an ‘empty’ or phonologically null modal, and modals must always appear with bare infinitives. Huddleston and Pullum () simply label the bare subjunctive form ‘the plain form’.
7 This category includes the subject nouns in generic constructions such as Cows eat grass, etc.
adjectives essential and crucial into deontic modal adjectives. Hoye (: ), in particular, questions the ‘apparent unassailability’ of the modal verbs in English, and raises the important point that it is not modal verbs in isolation that are shown in corpus studies to express the modality of the discourse, but modal verb-adverb collocations, which create modal synergy in combining two independent modal expressions; e.g., might just possibly catch on, almost certainly will be etc. (Hoye : ). Similar uses of reinforcing adverbs were discussed by Traugott () as contributing to the diachronic grammaticalization of epistemic functions in modal verbs in the history of English (see section .).
.. Modal verbs The modal verb category consists of the nine core modals, listed below as present and past forms of each other:

Present:   will     can      shall     may      must
Past:      would    could    should    might
Other forms include the semi-modals, such as have to, ought to, be supposed to, etc. (see e.g., Krug for a detailed study of semi-modals). It should be noted that the existence of present and past forms does not necessarily entail present and past time reference; English modal verbs have polyfunctional properties (as suggested by van der Auwera ).8 Polyfunctional properties, in turn, mean that the modal verbs can be used in various functions, including expressing hypothetical modality using past tense forms such as would and could. Such uses may have less to do with time reference and more with variations in the type of non-factive meanings they express (see, e.g., Ziegeler for a study on the development of such meanings out of past tense could, and also Spencer, Chapter , this volume for a discussion of the present/past tense distinction of English auxiliaries from a morphological perspective).
8 I am reminded by the editors of this volume that the present-past formal oppositions of the modal verbs are still active in some cases, especially with can, could, will, and would, and that they are treated in Huddleston and Pullum () as different forms of the same lexeme (see also Collins for a frequency study of preterite modal verbs). Polyfunctionality does not exclude such treatments, while not necessarily accepting that the verbs are polysemous. See also section . on the historical reasons for treating the modal verbs as polyfunctional; unfortunately, scope does not permit a more in-depth discussion of the monosemy/polysemy debate.
As auxiliaries, the core modal verbs in English also exhibit all of the Huddleston () so-called ‘NICE’ properties: they include the capacity to be directly negated, e.g.: ()
a. They mightn’t come tomorrow.
inversion in questions, e.g.:
b. Would this be the correct one?
code (or ellipsis), e.g.:
c. He can swim, can’t he?
and emphasis, e.g.:
d. This must be the correct one!
but unlike other auxiliaries the modals have no third person singular suffix, and as Palmer (: ) notes, there is no co-occurrence of two (core) modals together, at least in standard English varieties (though double modals are found in the English of the southern states of the US, Scotland, and Northern Ireland, as pointed out by Nagle (), and discussed in Verstraete ()). Other grammatical properties of core modals include (i) a lack of non-finite forms, e.g., *I am musting work late, and (ii) no to-infinitive forms, e.g., *to must do something (Huddleston and Pullum : –). The group of modal verbs in English may be divided into categories according to the various functions they serve, listed in Palmer () as dynamic, deontic, and epistemic functions (this does not necessarily imply that all the modal verbs serve all the functions listed). Dynamic modality will be discussed in the following section ..; section .. will discuss deontic modality within the classification of other non-epistemic types. Epistemic modality is the subject of section ..
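Taken together, the NICE properties and the further restrictions just mentioned amount to a checklist separating the core modals from lexical verbs. The toy representation below is our own illustration of that checklist, not a formalization drawn from Huddleston () or Palmer (); the property labels are hypothetical.

# Toy checklist (Python) of the auxiliary/modal diagnostics listed above.
# 'NICE' = Negation, Inversion, Code (ellipsis), Emphasis; the remaining
# entries record the extra restrictions on core modals. Illustrative only.

CORE_MODAL_PROFILE = {
    "direct negation":           True,   # They mightn't come tomorrow.
    "inversion in questions":    True,   # Would this be the correct one?
    "code (ellipsis)":           True,   # He can swim, can't he?
    "emphatic use":              True,   # This MUST be the correct one!
    "third person -s":           False,  # no *she cans
    "non-finite forms":          False,  # *I am musting work late
    "to-infinitive":             False,  # *to must do something
    "stacks with another modal": False,  # excluded in standard varieties
}

def matches_core_modal_profile(profile):
    """True if a verb's recorded behaviour matches the core modal profile."""
    return all(profile.get(k) == v for k, v in CORE_MODAL_PROFILE.items())

print(matches_core_modal_profile(CORE_MODAL_PROFILE))  # True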
.. Dynamic modality In their dynamic uses modal verbs usually refer to the characteristics and capacities of their subjects, e.g., John can speak Italian (Palmer : ). However, they may just as easily present gnomic facts about their subjects, as pointed out by Dahl () and Bybee (), who both refer to timeless truths in their reference to such modality, e.g.: ()
The arctic hare will turn white in winter. (Bybee : )
It could be argued that timeless truths cannot be defined as modal, since they refer to facts, not to non-factivity. However, at the same time, it could also be argued that the
facts they refer to are modal in that they lack an anchorage in time and space (Ziegeler a), since expressions of timeless truth refer either to non-referential subjects or to non-referential events (e.g. ()); i.e., those without reference to actualization. The most common dynamic modal verbs are can and will, though other modals are also found (Palmer (: ) mentions a dynamic may). Some accounts have labelled the use of will in such functions as generic and therefore not modal, e.g., Salkie (), (though without actually determining what grammatical category is represented by genericity). The use of will as a tense-marker rather than a modal verb is discussed further in Depraetere and Tsangalidis, this volume. Gisborne () observes that dynamic modality differs from other types of modality in English because it lacks subjectivity (see section .), it is restricted to two modals (can and will, according to Gisborne), and importantly, is part of the propositional content of the clause rather than an operator having scope over it (many accounts suggest that an operator analysis is the appropriate one for deontic and epistemic modals). Gisborne () also cites Foolen (), Papafragou (: ), and Palmer (: ) who claim that dynamic modals are not modals at all. Gisborne (), however, attributes their doubts to the fact that such modals are retentions from earlier stages of diachronic grammaticalization (see section .), and therefore display a historical ‘lag’ by comparison to the other modals. Ziegeler () distinguishes a category of generic modals, equivalent roughly to the dynamic uses of can and will of Palmer (, , ), but including also examples of other modals used to denote what Palmer (: ) calls existential modality, a category to which he gives little attention and which is illustrated in () and () below: () Lions can be dangerous. (Leech : ) ()
. . . the lamellae may arise de nuovo from the middle of the cell and migrate to the periphery. (Huddleston : –)
Palmer’s existential modals were relabelled generic in Ziegeler () in order to account for the vast range of uses found in the earlier diachronic stages of modal verbs such as will and could, and semi-modals such as be supposed to, be able to, and have to, which, it could be argued, represented their earliest function as grammaticalized auxiliaries. What the class of generic modals have in common is their ability to characterize their subjects, yielding meanings which are omnitemporal and impervious to time reference. These functions are important for synchronic classification, as they give rise to the meanings of prediction: what is observed to be an omnitemporal fact is also predictable in the future. In describing the origins of modal senses in such verbs, it could therefore be predicted that propositions expressing facts which do not refer to any particular event moment in time or space are the most obvious source domains for the creation of future-projecting functions of modal verbs and permit predictive functions to arise diachronically. The diachronic development of modal verbs will be discussed further in section ..
.. Other non-epistemic modality types Early descriptive accounts of modal verbs in English divided the categories into root and epistemic modality, e.g., Coates (), who referred to root modality as encompassing the semantic domains of obligation and permission (as had Sweetser ), but also including Palmer’s dynamic class under the same umbrella, i.e., anything that was not epistemic. For the purposes of the present chapter, the term ‘non-epistemic’ will be used to refer to root modality including dynamic as well as deontic modality. Deontic modal domains have more frequently been isolated as the semantic domains of obligation and permission, the term ‘deontic’ coming from the Greek form deon meaning ‘be necessary, be appropriate’, and they include examples such as: ()
a. You must remember this. (obligation)
b. You have to remember this. (obligation)
c. We may stay here as long as we like. (permission)
d. You can’t drive a car under the age of . (denied permission)
Common to all non-epistemic modalities is the fact that, for the most part, they refer to situations which are ‘goal-centred’ (Frawley : ), perhaps explained better as ‘pre-cursors or antecedents to action’ (James : ) (e.g., in (a), the obligation of remembering something is the goal). The latter account includes even modal meanings of ability, a meaning which could be argued to reflect a time-stable characteristic of the subject, and therefore classifiable as dynamic. However, even ability is a pre-cursor to action, which is probably why it has been included by Bybee et al. () as an ‘agent-oriented’ sense. The term ‘agent-oriented’ has also been used for different types of non-epistemic modality, described as ‘the existence of internal and external conditions on an agent with respect to the completion of the action expressed in the main predicate’ (Bybee et al. : ), and is used to describe meanings such as ability, necessity, obligation, and desire. However, it is somewhat misleading as a term for non-epistemic types, as actions which have not yet taken place cannot be assigned an agent (the term ‘potentially agent-oriented’ might serve better), e.g., in You may leave now, the potential agent-subject, you, is not yet an agent of the act of leaving as the leaving has not yet taken place, and therefore cannot have an agent beyond the mere prediction that you will be the agent once the leaving has taken place. Given that such modal meanings all refer to some future action, it was considered appropriate in studies such as Ziegeler (, ) to refer to all non-epistemic (including future will) categories as ‘future-projecting’. This categorization also included the past tense forms of the nine core modals, as long as they could express projection into the future from a past reference point (e.g., When I was younger, I could walk five miles). Givón (: ) had noted that even non-epistemic modals were not exempt from the influence of speaker knowledge, belief, or evaluation of a situation: the utterance of a deontic modal expression carries the presupposition that what is expressed in the complement of the modal has not yet taken place. Heine (: )
OUP CORRECTED PROOF – FINAL, 22/10/2019, SPi
had indicated that German agent-oriented modalities had always conveyed the conceptual property referred to as L, standing for the later time reference of the proposition predicated of the modal, and Traugott and Dasher (: ) also emphasized the posterior time reference of deontic modality, as summarized in Ziegeler (: ). For example, in () both examples indicate to the addressee that what is referred to by the complement of the modal verb (i.e. the going home event, or the speaking Italian event) has not yet taken place at speaker-time, and it is this characteristic that distinguishes clearly the non-epistemic categories of modal verbs from the epistemic ones: ()
a. She should go home. (Givón : ) b. You can speak Italian (if you wish).
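The classificatory criterion at work here (non-epistemic uses project the complement situation into the future relative to the time of speaking, while epistemic uses evaluate present knowledge) can be put as a rough rule of thumb. The sketch below is a deliberate simplification using hypothetical labels; as the next subsection notes, genuine examples often need context to disambiguate.

# Rough illustration (Python) of the time-reference criterion discussed
# above: non-epistemic (deontic/dynamic) uses are future-projecting,
# epistemic uses evaluate present speaker knowledge. Purely a sketch.

def modal_category(complement_posterior: bool, evaluates_truth: bool) -> str:
    """complement_posterior: the complement situation has not yet taken
    place at speaker-time; evaluates_truth: the speaker judges the truth
    of the proposition on the basis of present knowledge."""
    if evaluates_truth and not complement_posterior:
        return "epistemic"                          # That must be true.
    if complement_posterior:
        return "non-epistemic (future-projecting)"  # She should go home.
    return "indeterminate without further context"

print(modal_category(complement_posterior=True, evaluates_truth=False))
print(modal_category(complement_posterior=False, evaluates_truth=True))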
.. Epistemic modality Epistemic modal verbs have been described as referring to the speaker’s judgement on the truth of an expression (Palmer : ); the word epistemic comes from the Greek for ‘knowledge’. While non-epistemic modal verbs have always presented classificatory problems in previous accounts, there has been little argument over the categorizing of epistemic modal verbs. In most cases, they can be seen to be co-occurring with a stative verb, as with must and should in That must be true/That should be what I’m looking for. The categorization of non-epistemics as future-projecting allows for the objective criterion of time reference to determine classificatory definitions, so that epistemic modal verbs, on the other hand, can be considered as referring to deductions and evaluations made about present speaker knowledge. Although classification does not present so many problems for epistemic modality as for non-epistemic, it is not always so easy to identify an epistemic use of a modal verb out of context, and ambiguities can sometimes occur (e.g., (a) could be taken as epistemic: ‘you must remember this—you were there at the time!’). Nevertheless, some contexts nearly always receive an epistemic interpretation, e.g., the progressive aspect is usually associated with epistemic modality rather than non-epistemic: ()
Russia may be meddling in US politics.9
Epistemic meanings are also associated with the modal perfect: ()
Stonehenge may have been first erected in Wales, evidence suggests.10
9 https://www.theguardian.com/commentisfree//aug//russia-putin-trump-us-politics-sanderssupporters-response (last accessed April ).
10 https://www.theguardian.com/uk-news//dec//stonehenge-first-erected-in-wales-secondhandmonument (last accessed April ).
All the modal verbs can function with epistemic meanings; even future will has often been described as epistemic in its function of expressing prediction (e.g., Bybee et al. : –). Palmer (: ), however, had difficulties with associating future will with epistemic uses, in particular because of the suppletion of will by shall in the first person: shall could never be used epistemically in the same way, e.g., I shall/will be finishing soon expresses a future function, not an epistemic one, and the epistemic use of will in That will be the postman cannot be alternatively expressed as That shall be the postman and preserve the same meaning. A description of epistemic uses as pertaining to evaluations and beliefs about current circumstances (including expressions of deductive reasoning) would also allow little scope for the use of future will as an epistemic modal. Even classic examples such as That will be the postman are more likely to express a deductive inference made in the present to be confirmed in the future rather than a clear predictive statement. According to Langacker, in using epistemic modals the speaker is striving for control over the knowledge of the situation, or striving to validate a proposition by assessing it as conforming to reality (: ). Langacker’s () classifications of modal verbs also replicate the two-part divisions of Coates and others, and what is common to most accounts is that the categorization of epistemic modals is fairly undisputed, while non-epistemic types allow for greater variation in their description, with consideration often given to the nature of the source of the modality.
.. Modal source The nature of the modal source is distinguished in accounts such as van der Auwera and Plungian (), who describe non-epistemic modal classifications in terms of participants. Their categorization does not ignore their own claims for the logical oppositions of possibility and necessity as intrinsic to modal meaning; however, they distinguish between participant-internal modality referring to the possibility and necessity meanings internal to a participant, see (b) below, which expresses possibility through ability (elsewhere such modal uses are described as dynamic or even ‘propensity’ meanings, e.g., in Huddleston and Pullum ()), and meanings referring to circumstances external to a participant that enable the possibility or necessity of a certain state of affairs, see for example (a) below (van der Auwera and Plungian : ): ()
a. To get to the station, you can take bus . b. Boris can get by with sleeping five hours a night.
Again, the distinctions are not formally marked in English, and it is only the semantic and pragmatic context in which the construction is uttered that determines which applies.11 This is made particularly evident in the classification of deontic modality as a
11 See also Depraetere and Reed () for an alternative account of the classification of root possibility.
sub-class of participant-external modality in which the speaker (or some social or ethical norm, according to van der Auwera and Plungian : ) is understood as the source of the modality. But what, it may be asked, linguistically distinguishes a social or ethical norm from a scientific norm as in (), in terms of the modal classification system? There is no difference in the grammatical use of the modals, only in the pragmatic understanding of their functions. ()
a. The pollen may be taken from the stamens of one rose and transferred to the stigma of another. b. But to reach an orbit an object must accelerate to a speed of about , miles per hour (, kilometers per hour, called satellite speed or orbital velocity) in a horizontal direction; and it must reach an altitude of more than miles ( kilometers), in order to be clear of the atmosphere.
Nordlinger and Traugott (), on the other hand, posit a class of ‘wide-scope’ deontics to allow for deontic passives, for example, in which the source of the modality is not necessarily the speaker, and often cannot be specified at all; see, for example, (a) from Nordlinger and Traugott (: ), citing Coates (: ). Verstraete (: ) points out similar cases and suggests that the necessity expressed by the modal, e.g., must in (b), cannot in any way be assigned to the speaker, noting that its use might be labelled dynamic instead. It could be argued that the speaker is only endorsing the necessity understood as external to the situation of utterance, but this is not marked in the grammatical expression any more than in a deontic use involving speaker commitment; in fact, there is nothing in the grammatical morphology of a deontic modal expression which entails speaker commitment at all. Depraetere and Verhulst () also discuss the difficulties of locating the source of the modality in similar examples, in their comprehensive study of the necessity modals must and have to.
... Performativity Verstraete () discusses the performative nature of some modal verbs in bringing about ‘a particular position of commitment with respect to the propositional content of the utterance’ (: ). The term ‘performative’ has often been used to describe speech acts as a general category. Verstraete, however, is not likening the role of modal verbs to the nature of performativity associated with the illocutionary force of an utterance, but to the expression of subjectivity in modality (this is discussed in section . further below). In doing so, he expands the conventional definition of the term ‘performative’ generally used in relation to speech acts. Another use of the term ‘performative’ in describing modal verbs has been adopted by Palmer (: ), with respect to the category of deontic modal verbs. Deontic modal verbs are often contrasted with epistemic modal verbs insofar as they are used to give
direction to the interlocutor. Palmer associates all deontic modals with the function of laying obligation by the speaker, e.g.: ()
They must come in now.
Palmer (: ) relates deontic modality to performativity by the fact that it is directive, and predicates events which are controlled by circumstances external to the subject. However, a deontic modal utterance need not always involve the laying of an obligation by the speaker, who may simply be reporting the obligation laid down by another speaker (as is possible in ()). Depraetere and Verhulst () discuss the various possible ways in which obligation modality originates in sources other than the speaker herself.
. Grammaticalization
12 See also Chapter for more details on the diachrony of modality.
13 Plank used the term ‘grammaticization’ in the same way as later researchers used ‘grammaticalization’.
.................................................................................................................................. One of the earliest studies to focus on the diachronic development of modal verbs was Lightfoot’s () generative study of catastrophic or unpredictable change, which determined that not only the modals, but the entire category of auxiliary verbs suddenly emerged in the sixteenth century; see Lightfoot () for instance.12 More functionally-oriented studies were published in the s, including Plank (), a critique of Lightfoot (), who proclaimed the development of the English modals as a ‘paradigm case of grammaticization’ (: ).13 Grammatic(al)ization involves the diachronic development of grammatical markers out of lexical items, and their further development to become even more grammatical (this may simply mean a wider grammatical distribution). The original lexical sources of the English modal verbs in Old English were what are now known as the ‘preterite-present’ forms of verbs; i.e., a class of verbs that expressed completed action which gave rise to a result state in the present (see, e.g., Traugott and Dasher : ). An example of a pre-core modal illustrated by Traugott and Dasher () is that of mot-, the source for the present-day modal verb must. Its original meaning was lexical, with meanings of ability or measurement (: ). Over time, the deontic, non-epistemic senses of first permission, and then obligation, emerged, and it was not until much later that epistemic meanings became common. All the modal verbs known today can be traced to similar lexical origins. After Plank () came the pioneering studies of Bybee () and Bybee and Pagliuca (, ), investigating more deeply the role of grammaticalization in the diachronic development of the English modals. Bybee and Pagliuca (: ) put
forward the view that the epistemic functions of the modals evolved uni-directionally from the non-epistemic ones. Thirty years on, there has been little disagreement regarding the diachronic link between non-epistemic modality and epistemic modality, apart from a few studies which have discussed the possible simultaneous emergence of both non-epistemic and epistemic functions in Old English (e.g., Warner and citing Denison : ). However, even many of Warner’s questionable () examples were described by him as neutralizing the epistemic-dynamic distinction, and often requiring reinforcement with adverbs expressing the same meaning, e.g. (: ):
() Eaðe mæg gewurden þæt þu wite þæt ic nat, þu þe þar andweard wære.
easily may be that you know that I not you who there present were
‘It easily/possibly may be that you know what I do-not-know, you who were present there’
The Old English ‘Apollonius of Tyre’, . (ed. P. Goolden, Oxford University Press, )
Warner notes that eaðe was used to mean ‘possibly’ or ‘perhaps’ as well as ‘easily’, thus reinforcing the epistemic meaning of mæg ‘may’ in (). It can be seen then that collocations involving adverbial forms with modal verbs were not uncommon even early in the history of English. Warner also illustrates examples of deontic may used in Old English expressing subjectivity.14 What he labels future epistemic must (: ) is exemplified in ():
Ealle we moton sweltan.
all we must die
‘We will all necessarily die’
The Old English Version of the Heptateuch, Exodus . (ed. S. J. Crawford, EETS )
() is discussed also by Traugott and Dasher () as an example of an intermediate stage between deontic necessity and inevitability, hence expressing epistemic truth pertaining to the future. It should be noted, however, that the lexical aspect of the verb sweltan ‘die’ is that of an Achievement in the classification of Vendler (), a class of verbs which refer to punctual events, but events which happen to the subject rather than being within the subject’s control. As such, they make an ideal aspectual environment in which to form a transition from deontic to epistemic functions: there is an absence of the subject control deemed by Langacker () as necessary for his non-epistemic class of modals, and yet the aspect of the main verb is not stative, as would normally be
14 See also Taylor’s chapter (this volume) on cognitive linguistic approaches.
expected with the loss of subject control in epistemic functions. Similar transitory environments may also have included the aspectually hybrid contexts of the progressive aspect (which uses a dynamic main verb in a stative or imperfective grammatical aspect) and passives, also implying dynamic change through the expression of a stative situation.
.. Metaphor and metonymy Other accounts which have questioned the deontic-to-epistemic route have included Narrog (a), who describes () as an example of wide-scope deontic modality. Narrog’s (a) assumptions that epistemic meanings did not always emerge from deontic ones are based on data from Japanese; it is difficult to conclude that the same pathways may be found in every language. Nuyts (: ) also questions the unidirectionality of modal development, suggesting that dynamic modals formed the source for both deontic and epistemic uses in parallel, but his data mainly come from Dutch. In the case of the English modals, though, the debates have questioned not so much the order of development of the functions (non-epistemic before epistemic), as the way in which such changes took place. Sweetser’s () seminal study put forward the cognitive mechanism of metaphor as the motivation for the development of epistemic from non-epistemic (her ‘root’) meanings in the modals, suggesting that the imagery of forces and barriers could be applied metaphorically to the modal meanings of possibility and necessity, with necessity seen as the imposition of a barrier, and possibility as the lifting of such opposing forces. The metaphoric interpretation allows for the application of forces and barriers in the concrete, socio-physical world of the subject (deontic modality) to become analogized into the more abstract, psychological belief world of the speaker (epistemic modality) in which the forces and barriers control the speaker’s judgements on a given situation. For example in (a) below the subject is not barred from going by any authority, while in (b) the speaker is not barred by her premises from drawing the conclusion that John is there (Sweetser : ). ()
a. John may go. (deontic) b. John may be there. (epistemic)
Note that in the shift from deontic to epistemic meanings the metaphor of lifting a barrier is transferred from the subject of the sentence in (a) to the speaker in (b). However, although the analysis could be reconstructed plausibly in terms of a cognitive schema in today’s English, it is less easy to find detailed evidence for its actuation in terms of a historical process, since it lacks the element of gradualness and semantic continuity necessary to establish more plausible diachronic theories. Thus, while the unidirectional order of such developments is usually accepted, Sweetser’s () metaphorical interpretation of them is challenged by later studies
which attempt to emphasize the gradualness of historical change using a different cognitive mechanism, that of metonymy and invited inferences. Traugott and Dasher’s () Invited Inferences Theory of Semantic Change (IITSC), builds upon an idea by Levinson () and an earlier study by Traugott () and invokes instead a mechanism in which change is seen as incremental and taking place by the slow, progressive conventionalization of conversational implicatures over time within the same conceptual domain, rather than the sudden ‘leaps’ from one conceptual domain to another, unrelated one, which are characteristic of metaphor. In this way, examples like () could be better explained as sharing meaning aspects related to both earlier and later stages of development, hence in the ambiguity of the modal function we find the semantic continuity necessary to explain the often unconscious micro-shifts and reanalyses of meaning which contribute to semantic change in any diachronic situation. The presence of functionally ambiguous examples in the diachronic data also allows for what is known in grammaticalization accounts as ‘bridging contexts’ (Heine ), or contexts which most readily favour the change from a source item to a target item. For example, in the development of the semi-modal of obligation, have to, the most likely environments for the meanings of obligation to develop out of the earlier, possession meanings are those in which both possession and obligation could readily be interpreted from the same context, but also those in which the earlier meanings of possession are less plausible in their literal sense (cf. Ziegeler : ). This is illustrated in () below: ()
we sall pray especially for þe person or for þe vikar of þis kirke þat has ʒoure saules for to kepe. ‘We shall pray especially for the person or for the vicar of this church that has your souls to keep’. (–. Bidding Prayer III in Lay Folks Mass Book, , example from van der Gaaf )
The bridging context of a customary duty, along with the knowledge that the vicar could not literally possess and keep people’s souls, allows for the possession meanings in have (for) to to be reinterpreted as obligation, and the complement of have to can now be understood as the entire infinitival clause.15 The pre-posing of the direct object (ʒoure saules) does not affect the reinterpretation, as the vicar of the church could be said to be in possession of a duty, not the actual souls of the church members (Ziegeler : ). The mechanism of metonymy in this way allows for such contexts, since meaning shifts can already begin to occur in the source context itself (here, possession) as subtle transfers of semantic focus (possession of an object > possession of a duty). The distant conceptual domains of metaphoric shifts cannot allow for the same semantic contiguity familiar to diachronic changes in grammaticalization studies.
15 The use of the infinitive marker for to also contributes to the reinterpretation since it implies a purposive sense in the possessive expression.
In this way, it can readily be seen in certain selected examples from the history of English that specific changes were already in progress, as in the following example from Middle English (Traugott and Dasher : ): ()
Ho-so hath with him godes grace: is dede mot nede beo guod.
who-so-ever has with him god’s grace his deed must necessarily be good
‘He who has god’s grace necessarily is required to be/we can conclude is good.’
Life of St. Edmund, c.
() again illustrates the type of ‘bridging’ context in which changes between deontic and epistemic functions must have taken place: the use of mot ‘must’ can be understood as expressing either function, deontic (‘is required to be’) or epistemic (‘we can conclude’), in spite of the fact that it is followed by a stative verb which would most likely render an epistemic meaning. The modality is further reinforced by the use of nede ‘necessarily’, a modal adverb. Even in today’s English, be in the sense of ‘behave’ is understood as a temporary stative situation over which the subject may exercise control. It was such contexts of ambiguity that enabled the almost invisible transitions of meaning of the modals to later epistemic uses.
.. Other explanations
Semantic motivation has not always been the main approach adopted in explaining the diachronic changes that took place in the modal verbs, though, and it is worth also considering the syntactic approaches in accounts such as Fischer () and Krug (). Fischer (), in particular, provides studies which compare English with Germanic languages such as Dutch and German, in which the syntactic changes from SOV to SVO word order have not yet progressed to completion. According to Fischer, the reason for the more rapid auxiliation of the English modals is that the changes to SVO word order, which were complete by (according to Hopper and Traugott ), contributed to their more expansive distributional range. In English the changes of word order would ensure that the modal verb and the infinitive often fell together syntactically, which in turn promoted the semantic bleaching of the modal element as it grammaticalized to become an auxiliary. Similar hypotheses are proposed by Krug (), who provides a major study of the emerging semi-modals, such as have/have got to, want to/wanna, need (to), dare (to), and ought to, using data from the British National Corpus and the ARCHER corpus. Krug also evaluates the syntactic alignment of the modal and the infinitive as contributing to the developing auxiliary role of the modal verb in examples from Shakespeare (Early Modern English); this is especially visible in cases where the
developing semi-modal appears in a relative clause with the fronted relative pronoun standing for the object of the modal verb (Krug : , cited in Ziegeler : ): ()
Come, let me see what task I have to do. Titus Andronicus III, I,
Such examples are representative of a pivotal construction using have to in which the older, possessive meanings of the semi-modal form give way to the later meanings of obligation, ambiguously, in the same construction. Ziegeler () questions the plausibility of ambiguity in such examples as permitting the switch context for the developing modal meanings, since it presupposes that both meanings were already available at the time and that either could be interpreted from the same example; instead the development of obligation meanings is seen as contingent on the optimal lexical context in which the form appears. As we have seen in (), there is already a sense of social obligation present in the context even without have to. The studies which look at string frequency and adjacency as a catalyst for the development of modal meanings from their original lexical source verbs (in most cases, the Old English preterite-presents) also include Fleischman’s () analysis of the Romance future suffixes, cited by Fischer (: –). However, less attention is given in such syntagmatic explanations to the development of epistemic modality from non-epistemic. Such changes can be accounted for more readily by investigating what can be substituted for a particular functional role in a sentence string. Generative studies also appear to have failed to provide the same convincing explanation for the shifts that enabled epistemic meanings to begin to develop from non-epistemic ones, while offering instead an alternative means of description of the stages represented in the grammaticalization of the modal verbs from their lexical sources, usually verbs expressing ability, volition, obligation, and power (e.g., Roberts and Roberts and Roussou , cited in Traugott ). Traugott (: –) discusses Roberts and Roussou’s () description of grammaticalization in minimalist terms as the movement of lexical items (i.e. the pre-modals) upwards in the structure to occupy positions as functional heads, which they associate with the well-known process of ‘bleaching’ in grammaticalization, or loss of non-logical meaning. She suggests, at the same time, that Roberts and Roussou’s approach to bleaching is not able to account for further shifts within the development of a modal once it achieves grammatical status (: ). Van Kemenade () summarizes similar approaches, and describes syntactically the ‘movements’ which took place at the time the modal verbs were grammaticalizing from their original lexical sources, suggesting that originally both main verbs and pre-auxiliaries were found in the functional head position of ‘Mood’ (: ). The loss of the subjunctive inflections on main verbs was accompanied by the emergence of the early modal verbs, which is well-known in grammaticalization theory simply as a functional renewal, but van Kemenade () discusses the loss of subjunctive morphology accompanied by the fixture of the modal verbs as derived from the position ‘Mood’ (: ). In spite of their elegant
descriptions, none of these studies begins to explain why such movements occurred, and what they had to do with speakers’ actual manipulations of meanings at the time of the movement. Common amongst all such studies, including Lightfoot’s first () account, is the need to maintain that the actuation of such changes occurs across entire generations of speakers, so that it is always the child learners of a particular generation who reinterpret the functions of the former lexical sources of the modal verbs as auxiliaries. Such explanations assume that children are capable of highly sophisticated pragmatic inferencing procedures, which many later researchers found implausible (cf. Slobin ), and lacking empirical evidence from historical data. Little attention, again, appears to be devoted to explaining the functional shifts from root meanings to epistemic. Many such accounts are heavily formalized and unconcerned with the need to match the diachronic changes with the evidence of semantic continuity. In the beginning of this chapter it was emphasized that mood and modality cannot be studied outside of a semantic approach. The evidence from the study of the diachrony of the modal verbs and their grammaticalization paths may seem to indicate that any approach should regard the study of modality as a primarily grammar-driven task. However, in spite of the fact that the modal verbs are ‘grammaticalized’ in English, and mood as a category no longer has much relevance as an inflectional category, the presence of semantic constraints on the present-day grammatical distribution of modal verbs is still dominant. Note the infelicity of (), for example: () ??I wish you would be tall. (Ziegeler ) () illustrates what is known as lexical retention in grammaticalization. The modal verb would has retained some of its volitional meaning and therefore cannot be used when the subject has no control over the state of being tall. This represents a stark reminder of the impact of lexical source meanings on grammatical distribution at later stages of development. In this way it can be seen that grammatical distributional constraints are governed to a certain extent by the semantic origins of the modal verb, and that a grammatical study of modality in today’s English cannot be undertaken without considerable reference to an accompanying semantic and historical perspective.
. Subjectivity
Another reason for rejecting a syntactic approach to the diachronic study of modality is the existence of the notion of subjectivity in modality. This feature is recognized in the contexts in which it is found, but is not seen to correspond with any formal encoding, at least in the English modal system, whereas syntactic approaches rely on explicit
form-function correspondences. It had been long observed by Palmer that epistemic modals possess the characteristic of being subjective (e.g. : ), due to their expression of degrees of speaker commitment, but as Nuyts (: ) notes, Lyons () had also used the term ‘subjective’ to discuss the meanings of deontic modals, comparing them with objective uses occurring ambiguously in the same (often-quoted) example: () Alfred may be unmarried. As Nuyts (: ) points out, Lyons () claims that examples like () can be interpreted as having either a subjective or an objective modality. If the sentence is understood as referring to the speaker expressing his or her own uncertainty about the marital status of Alfred, then, according to Lyons, it is interpreted as having subjective modality. If, on the other hand, it is understood as the speaker simply voicing an opinion based on the probability of a computable fact, then, according to Lyons, it is interpreted as having an objective modality. Such an account is similar to Palmer’s (: ) explanation for subjectivity, in that even epistemic possibility can allow for objective, rather than subjective, interpretations, as long as the speaker is not expressing his or her own opinion. Whatever the case may be, such isolated examples are semantically underspecified when taken out of context, and often ambiguous. Nuyts, however, justifiably notes that Lyons was questioning the source of evidence in offering such possibilities, i.e., that the distinction between subjectivity and objectivity is dependent on who was responsible for the modal evaluation (Nuyts : ). Nuyts, instead, redefines the objective modality of Lyons () as a form of ‘intersubjectivity’— a modal evaluation which can be shared with others, to be contrasted with a subjective modal evaluation, belonging only to the speaker.16 Narrog (b) specifically points out that subjectivity is not necessarily a characteristic of modality, but the earlier work by Traugott () and Traugott and Dasher () seems to belie such assumptions, in that the use of epistemic expressions in particular is unavoidably subjective, and reflects the gradual subjectification of the modal meanings over time, which are ‘ . . . increasingly based in the speaker’s subjective belief state/attitude toward the proposition’ (Traugott : ). Traugott’s earlier study is focused on the general emergence of epistemic meaning, and not specifically on epistemic modality. The question of the presence of subjectivity in modal meanings has often been misunderstood, having been described variously in different accounts. Ziegeler (b) and Grossman and Polis (), for example, refer to the weakening of selection restrictions between the subject and the modal verb, or the loss of relations between the verb and subject; the latter account also relates the spread of 16
16 Intersubjectivity has been discussed further in the recent literature by others such as Traugott (), who describes it differently as the speaker’s accommodation of the addressee’s self-image, ‘face’, beliefs, and attitudes, and there have been further expansions of the term, but these are not necessarily related to the expression of modality in particular.
speaker-oriented senses to the loss of subject-oriented ones. The loss of subject control means that the only remaining source for the control can now be the speaker herself. Note, for instance, the following typical example from a Google site: ()
there may be instances where years of military training cannot prepare oneself 17
In such examples, there is no possibility of a deontic interpretation as the subject of may is pleonastic there, a dummy subject which cannot be the target of a binding social obligation. It is impossible for the subject there to control the instances where years of military training cannot prepare oneself. Thus, in the reduction of the relations between the verb and the subject, the original (social) obligation senses implying root possibility become reinterpreted as epistemic possibility to allow for the statement to be true. This involves a shifting of control from the sentence subject to the speaker’s domain of knowledge, since there is nowhere else for the subject control to go—it is thus inferred as belonging to the speaker. The difference between Traugott’s () and Traugott and Dasher’s () approach to subjectivity and that of Langacker () is that in the latter account, subjectivity is not seen as the product of a diachronic process, whereas in the former accounts it is seen as the gradual accretion over time of speaker-oriented meaning. Subjectification, according to Traugott (: ), can be observed in the development of inferences based in the speaker’s world of reasoning. In support she cites examples of not just modal verbs such as must, but also modal adverbs like probably, which once meant ‘in a provable manner’, but developed to embrace the speaker’s perspective in the sense ‘presumably’. Another of Traugott’s examples is the modal discourse marker actually, which not only provides subjective inferences in its present-day use, but also intersubjective ones, e.g., when interpreted along the lines of ‘I am telling you this in confidence’, using a definition of intersubjectivity that differs slightly from that of Nuyts () mentioned above.18 Thus, while in examples like () the speaker’s judgement or beliefs about the situation are expressed using the epistemic modal may, the range of linguistic items that can be described as diachronically subjectified is not in the least limited to the categories of modal verbs and semi-modals alone.
. Conclusion
Any study of mood and modality in English is unavoidably a semantic study, though researchers in the past have attempted to group the modal verbs with the study of other
17 http://uk.businessinsider.com/this-may-be-mattis-one-mistake--?r=US&IR=T (last accessed March ).
18 Traugott’s () definitions referred to the accommodation of the addressee’s evaluations in the speech event, e.g., Actually, we’re getting married.
auxiliary forms in English. In the English language, the class of core modal verbs is now seen to serve the principal functions that were once carried out by the subjunctive mood in Old English. This once formed part of a binary category indicative/subjunctive, used to distinguish meanings of assertion and non-assertion, or non-factuality in the clause. Although mood has since been associated with a clause type (Allan , Aarts ), it is no longer encoded as an inflectional category in present-day English, having yielded its functions to the modal verbs. The principal problems of studying the class of modal verbs in English relate mainly to methods of categorization, with the difficulties arising from the variable stages of grammaticalization at which the modals can be found today. Grammaticalization is accepted almost universally as the most likely explanation for their development from verbs of lexical origin in Old English and for their replacing the now virtually obsolete English subjunctive. However, the categorization of the modal verbs, including the semi-modals, varies. Some approaches focus on the participants and the source of the modality (e.g., van der Auwera and Plungian ), others on the typological contrasts of sentence roles (if, for example, we take into account Bybee et al.’s () classifications of agent-oriented modality). Yet other approaches take temporal reference and aspectual considerations to be central to the categorization of modal meanings (e.g., Ziegeler ). However, all such accounts struggle with the classification of non-epistemic rather than epistemic categories. Perhaps the classification of epistemic modality is facilitated by the understanding of subjectivity in the epistemic modal, an aspect of pragmatic meaning related to the gradual transfer of control from the modal subject to the speaker. Co-occurrence with stative rather than non-stative main verbs seems to be another possible means of differentiation, but like all categories, the boundaries are not always so discrete and there are often exceptions. The study of English modality will continue to present such challenges for research for many years to come.
Acknowledgements
Acknowledgements are gratefully extended to the editors, in particular to Geri Popova, for their tireless efforts in helping to perfect this chapter. Any remaining inconsistencies are naturally my own.
. Introduction
Grammar affords us various means of signalling the relationship between entities and events. Consider the two examples in ().
() a. He was Irish and she was English.
   b. He believed that she was English.
In both (a) and (b) there are two clauses, both of which contain a finite verb in the past tense. The two clauses in (a) are linked explicitly by and. The two are of equal status and each could stand on its own two feet, as it were. One could substitute a full stop for and without materially altering the truth conditions of the two predications. The clauses are said to be coordinated, and is said to be a coordinator, and the relationship between the two clauses is called coordination. The two clauses in (b), on the other hand, are not of equal status. If one were to substitute a full stop for that, one would be left with two unrelated predications, the first of which is clearly incomplete. The second clause encodes the object of belief of the referent of He, the subject of the first verb. It is thus parallel to a nominal object, as in He believed her story. Like this nominal object, the object in the form of a clause in (b) is said to be subordinate to the verb believed. The linking element that is called a subordinating conjunction or subordinator, and the relationship between the two clauses, one of which envelops the other, is labelled subordination (the enveloped clause is said to be embedded in the larger structure). It is the two relationships coordination and subordination that are the topic of this chapter.
The example of coordination in (a) consists of two finite clauses, joined together by and. It is not just clauses that may be joined in this way. We can join together phrases, as in the two preposition phrases [up the ladder] and [down the stairs], and the two noun phrases in [unpopular men] and [popular women]. We can join together words, as in popular [men and women]. We can even join together parts of words, in particular prefixes, as in pre- and post-natal. The important point is that the two items to be joined together are similar in form and function. In other words, we generally join like with like (though see .. for some exceptions). The example of subordination in (b) consists of one clause which is subordinate to the verb in another clause. We also find subordinate relations in phrases, as in (a) and (b).1 () a. the girl who lives upstairs b. the girl living upstairs In predications containing one of these phrases, such as The girl living upstairs is studying law, one can omit the subordinate clause but not the noun and still be left with a meaningful sentence. Living upstairs is therefore considered subordinate to girl, which may be labelled the ‘head’ of the phrase.2 The term ‘subordination’ is sometimes used in the broad sense of a dependency relationship and sometimes with a narrower focus on clauses as dependent elements. Subordination in the broader sense of dependency is a pervasive grammatical phenomenon, and subordinate (or dependent) elements within clausal and phrasal structures can take many forms. For instance, a noun phrase functioning as object is a subordinate element within a clause, as mentioned above; adjectives and preposition phrases can be subordinate elements within noun phrases (as in the young girl, the girl in the photo). Such dependency relationships are discussed in various other chapters in this volume.3 The discussion of subordination in this chapter will therefore focus mainly on subordinate clauses. The chapter is structured as follows. Section . discusses subordination, focusing specifically on subordinate clauses; section . deals with coordination, both on the phrasal and clausal level; and section . presents grey areas between subordination and coordination. Section . contains a brief summary and conclusion.
1 Note that in some formal approaches to grammar clauses are treated as phrases (see Borsley, section .., this volume).
2 For more on phrasal heads, see Borsley section ., Huddleston and Pullum section ., Keizer section .., and Taylor section .., all this volume. Note that some approaches would analyse the in the girl as the head of the phrase (see Keizer section ., this volume). Such an approach has no material consequences for our analysis of subordinate clauses.
3 See Footnotes 2 and 4 for references to relevant material in other chapters.
. Subordination
There are two main types of subordination, modification and complementation. These are distinguished in .. in relation to subordinate clauses, though the distinction applies more generally to other types of dependent, as discussed in other chapters.4 Sections .. and .. present the forms and functions of finite and non-finite subordinate clauses respectively.5
.. Modification versus complementation
In this section I introduce the two main functions of subordinate clauses, as modifiers and complements. As a first illustration of the difference, consider the two subordinate clauses in ():
() She said that he was incompetent because she wanted the job herself.
In () one can omit the italicized because clause and still be left with a complete predication. The because clause is optional and functions as a modifier within the main clause. Units that function as modifier within a clause are also called adjuncts. The that clause, on the other hand, is licensed by the verb said for the latter to make sense in context, and is called a complement. Modifiers occur more freely within structures, while a complement depends on the presence of a head that licenses it, i.e. permits it to occur. In other words, ‘x licenses y’ is the equivalent of ‘x can take y as a complement’ and, on the other side of the coin, ‘y can only occur as a complement with a certain type of x’. For instance, some verbs, like say, allow (license) clausal complements while others, like speak, do not (*She spoke that he was incompetent). Complements, unlike adjuncts, are often obligatory, since they are often not only permitted by the head but required by it. While there are many constructions that are clear instances of modification and complementation, the borderline between the two forms of dependency may at times be difficult to draw.6 In other words, one may have identified the head and the subordinate item in a construction, but be unsure as to whether this construction instantiates modification or complementation. The fact that there are instances that are hard to classify as one or the other suggests either that our system of categorization is flawed or that both categories of relationship are prototype-based rather than Aristotelian in nature (see Aarts et al. : – and Aarts a).
4 Further details of phrasal complementation and modification may be found in Borsley section . and Keizer section ., both this volume, and of clausal complementation and modification in Duffley section . and Herbst sections . and ., both this volume. For an exhaustive description, the reader is referred to the major reference grammars of Quirk et al. () and Huddleston and Pullum (), as well as the earlier grammars of Poutsma (–), Kruisinga (–), and Jespersen (–).
5 For the distinction between finite and non-finite predications, see Depraetere and Tsangalidis, this volume.
6 One structure that has given rise to considerable debate in the literature consists of what are sometimes referred to as comment clauses, such as I think in I think he gave her flowers. For various analyses of these, see Thompson (), Newmeyer (), and Kaltenböck ().
.. Finite subordinate clauses
Finite subordinate clauses resemble main clauses in that they contain a tensed verb. This means that they can be related in time to the point of utterance. We can say that they are anchored, or ‘grounded’, in the universe of discourse (Lyons : , Langacker a: , Taylor section .., this volume). This grounding may be explicitly coded on the verb in the form of past tense morphemes or third person present singular -s. It is not coded explicitly on other persons in the present, except in the case of the verb be. The main difference between main clauses and subordinate clauses, both of which can contain tensed verbs, is that subordinate clauses generally cannot stand on their own but require some element to be subordinate to, i.e. a head of some sort, which may be a verb, a noun, an adjective, an adverb, a preposition or indeed a whole clause. Table . contains an overview of the main functions of finite subordinate clauses.7

Table . Main functions of finite subordinate clauses

Level  | Relation   | Function       | Example
Clause | Complement | Subject        | That he is always late irritates them.
       |            | Object         | They complain that he is always late.8
       |            | Predicative    | The problem is that he is always late.
       | Modifier   | Adjunct        | They were already irritated when he arrived.
       |            |                | They were irritated because he was always late.
Phrase | Complement | of Preposition | They were arguing about who was at fault.
       |            | of Noun        | The assertion that he is always late is false.
       |            | of Adjective   | They were certain that he would be late.
       | Modifier   | of Noun        | The student who was always late failed his finals.
7 Various grammars adopt different approaches to the classification of subordinate clauses. For instance, Quirk et al. () favour a functional analysis, dividing subordinate clauses into ‘noun clauses’, ‘adjective clauses’, and ‘adverb clauses’, according to whether the role they play resembles more that of a noun, an adjective, or an adverb. Huddleston and Pullum (), on the other hand, favour classifying them on the basis of their internal structure into ‘content clauses’, ‘relative clauses’, and ‘comparative clauses’ (see Huddleston and Pullum, section ., this volume).
8 Huddleston and Pullum (: ff.) prefer not to extend the label ‘object’ to include such finite subordinate clauses, arguing that they behave quite differently from NP objects (although they make an
According to Table ., clausal complements of verbs may function as subjects, objects, and predicatives.9 In Huddleston and Pullum (: ) subjects are classified as complements, but not ones of ‘equal status’ with others. The subject is considered to be an ‘external complement (outside the VP)’. It differs from objects which are said to be ‘internal complements’, and which are licensed by some verbs but not others. Finally it should be noted that when the clausal complement is a subject, it is often extraposed, with It irritates them that he is always late being preferred to That he is always late irritates them (see Kaltenböck, this volume, for a discussion of extraposition). The subordinate clause in They were irritated because he was always late is analysed in Table . as containing a subordinator or subordinating conjunction because, which serves a similar function to when in They were already irritated when he arrived. We find this sort of analysis in most traditional grammars, including Quirk et al. (). In an alternative analysis, found in Huddleston and Pullum (), because is classified as a preposition taking a finite clausal complement.10 Many of the examples in Table . contain the subordinator that. However, it is by no means the case that all subordinate clauses contain a subordinator. Thus the main and subordinate clauses in () comprise the exact same string of lexemes as the last two main clauses in (). () He asked her the time. She told him. He had to leave. () She told him he had to leave. () She told him that he had to leave. In () the implied direct object of the verb tell in the ditransitive construction is understood to be the current time. She told him and He had to leave constitute two separate tone units, signalled in writing by punctuation and capitalization. In (), on the other hand, which consists of just one tone unit and is written as a single sentence, the explicit direct object of tell is the clause he had to leave. Its occurrence in this slot in the ditransitive construction confirms its subordinate status. Should we wish to make this status even more explicit, we can do so by inserting that, as in (). Addressees are aided in their interpretation of the status of post-verbal subordinate clauses, as in (), by their familiarity with the construction types in which the verbs in
exception for examples like That he lost his temper I find odd, containing a preposed object in a complex-transitive clause).
9 Not all grammarians would accept that subjects should be classified as complements. Approaches that do not recognize subjects as complements include Generative Grammar (Siewierska : ) and Systemic Functional Grammar (Fawcett : ). Approaches that include subjects as complements of equal status with other obligatory elements in a construction include Cognitive Grammar (Langacker a: , Taylor section .., this volume), Role and Reference Grammar (Van Valin and LaPolla : ), and Word Grammar (Hudson a: ).
10 For more on items such as because, variously classified as subordinators and prepositions, see Aarts section .., Huddleston and Pullum section ., and Taylor section .., all this volume.
question typically occur. No such clues are available for finite clauses in subject position, as in () and (). () That he succeeded in this venture came as a surprise to her. () How he always succeeded in ventures like this amazed her. The subordinate status of finite clauses in subject position has to be signalled explicitly, in order to avoid the addressee interpreting them as main clauses. In () this signal takes the form of the presence of That, without which an addressee would take succeeded to be a main clause predicate. In () the subordinate status of the clause is signalled by the occurrence of its subject he in the position immediately following How, with no intervening verb. The fact that this subject is not preceded by a verb is sufficient for addressees to exclude the alternative interpretation that they are faced with a question. In addition to who and how, various other wh-forms, when followed by elements with the requisite Subject–Verb constituent order, can introduce subordinate clauses, as in ()–(). Other common markers of subordination when a finite clause functions as a complement are if/whether as in (). () The doctor asked him why he did not exercise. () The doctor told him where he should exercise. () The doctor told him when he should exercise. () The doctor asked him if/whether he took any exercise. There is a much wider range of possibilities in the case of modifying subordinate clauses functioning as adjuncts, which may encode various types of relationship to main predications, such as temporal (), conditional (), concessive (), and reason (), among others. () He liked to go up to London when(ever) he had the train fare. () He liked to go up to London if he could afford the train fare. () He liked to go up to London though he was often too tired to travel. () He liked to go up to London because he enjoyed visiting museums. Note again that some scholars (e.g., Huddleston and Pullum ) would analyse the modifiers in one or more of these examples as preposition phrases in which the prepositions take clausal complements. In actual usage adjunct subordinate clauses (or preposition complement clauses) will often contain pro-forms or ellipsis, as in () and (). () He liked to go up to London when(ever) he could. () He liked to go up to London if he could afford to/it.
When a subordinate clause is employed to modify a noun, as in the final example of Table ., it is called a relative clause. Relative clauses differ from subordinate clauses functioning as complements of nouns, in that they may be used to modify any noun, whereas only a minority of nouns take complements. They are also distinctive in establishing an interpretive link between the head noun and an element of the clause. In () there is a ‘gap’ in the object position after wrote, the object of which is understood to be the essays which the relative clause modifies. This missing element may be overtly expressed by a fronted relative pronoun, or left unexpressed. In () the gap is in the complement position of the preposition for, understood as the lecturer, and again overt expression of this element by a fronted relative pronoun is optional.11 () the essays (which/that) the students wrote () the lecturer (who/that) the students wrote the essays for In some instances overt marking of relativization is obligatory, most notably when the subject in the relative clause is understood as being coreferential with the head noun (the noun being modified), as in () and (). () The students who/that finished their essays late were penalized. () The students, who all finished their essays on time, were awarded pass grades. In these examples the overt marking cannot be omitted for reasons similar to those that apply to complement clauses in subject position, as in () and (). The function of a relative clause such as the one in () is to pin down the referent of the head noun The students. Because of this restricting function, it is commonly known as a restrictive relative clause. In () the relative clause merely adds some extra information about the head noun. It is called non-restrictive.12 Relative pronouns are always required in non-restrictive relative clauses, irrespective of the function of the head noun. Compare in this respect the non-restrictive clause in () to the restrictive one in (). () I spoke to her husband, who I met at the party. () I spoke to the neighbour (who/that) I met at the party. When a restrictive relative clause contains a subject that is not coreferential with the head being modified, as in (), we can choose not to re-encode the referent of the head by means of a relative pronoun. 11
11 Some grammars analyse that in examples like (19) and (20) not as a relative pronoun which provides overt expression of an otherwise missing element but as a subordinator which simply marks subordination (e.g., Huddleston and Pullum ).
12 Huddleston and Pullum (: ff.) prefer to distinguish ‘integrated’ and ‘supplementary’ relative clauses, noting instances where the information is presented as integral to the message but does not serve to narrow down the referent (e.g., He has a wife he can rely on).
To sum up, in this section we have distinguished between main clauses and finite subordinate clauses. We have seen that finite clauses may function as both complements and modifiers on the clause and phrase level. Various means of signalling subordination have been described. Finally, we distinguished between restrictive and non-restrictive relative clauses.
.. Non-finite subordinate clauses Non-finite clauses are almost always subordinate.13 There are four main forms of nonfinite clause, illustrated by ()–(). () She helped clean the bathroom. (bare infinitive clause) () She tried to clean the bathroom. (to-infinitive clause) () She enjoyed cleaning the bathroom. (-ing clause) () The bathroom located on the ground floor was cleaned. (-ed clause) In some analyses the verb inside the -ing clause in () would be called a gerund, or gerund-participle (see Aarts section .., this volume, for a discussion of the merits of various classifications). The -ed clause in () is also called a past participle clause. This clause, which may be seen as a reduced relative clause (as in the bathroom (which was) located . . . ), functions as a modifier in a noun phrase. The underlined strings in ()–() function as complements of the higher-clause verb. Table . presents an overview of the main functions of non-finite subordinate clauses, analysing them as complements and modifiers in the same manner as the finite clauses in Table .. Note that the analysis of some non-finite clauses functioning as post-verbal complements is controversial; this is further discussed below. Some of the examples of clause level complements in Table . contain an NP between the matrix verb and the complement clause predicate. In most, but not all, of the examples, this NP is analysed in the table as part of the non-finite clause. However, the exact status of these NPs is a matter of considerable dispute (see Borsley section ., this volume). Indeed, the analysis of sentences containing NPs in this position involves various questions not posed by sentences containing just one verb. Consider in this respect ()–(), which may be compared to ()–(). () They liked his policies. () They asked him some questions. () They elected him president. () They liked him to put his policies into practice. 13
13 One exception consists of exclamations such as Oh, to be in England.
Table . Main functions of non-finite subordinate clauses

Level  | Relation   | Function       | Example
Clause | Complement | Subject        | (Him/his) arriving late irritates them.
       |            |                | (For him) to arrive late would be disastrous.
       |            | Object14       | They dislike (him/his) always being late.
       |            |                | They like (him) to arrive on time.
       |            |                | She forced him to arrive late.
       |            |                | She wanted a taxi booked for eight o’clock.
       |            | Predicative    | The problem is (him) arriving late.
       |            |                | The question is to arrive late or not.
       | Modifier   | Adjunct        | They took a taxi to arrive on time.
       |            |                | They were overjoyed, arriving on time for once.
       |            |                | Given the chance, they always arrived on time.
Phrase | Complement | of Preposition | They were irritated because of his arriving late.
       |            | of Noun        | He grasped the opportunity to arrive on time.
       |            | of Adjective   | They were certain to arrive late.
       | Modifier   | of Noun        | The student arriving late failed his finals.
       |            |                | The time to arrive is printed on the ticket.
       |            |                | The time stipulated on the ticket is fixed.
() They asked him to put his policies into practice. () They elected him to put his policies into practice. In () his policies is the direct object of liked. In () him is the indirect object and some questions the direct object of asked. In () him is the direct object of elected and president is an object complement/predicative, i.e. the function of president is predicated of the object him. When we replace the final NPs in ()–() with non-finite clauses, as in ()–(), the picture becomes more blurred, especially in the case of those constructions containing three arguments. The analysis of () is relatively straightforward. Since there is no implication of the(ir) liking him, the latter is analysed as forming part of the object clause.15 In () and (), on the other hand, there is no doubt that it is the referent of him that is asked and elected. Him is therefore taken to be an object in both cases. It is however problematic to maintain the distinction between ditransitive and complex-transitive constructions (to adopt Quirk et al.’s terminology)
14 Note that Huddleston and Pullum (: ) do not in general employ the term ‘object’ for non-finite clausal complements in sentences, arguing that they comprise a distinct kind of complement. However, they make an exception for the first post-verbal constituent in sentences containing object predicatives, such as She considers consulting mobile phones at table very rude.
15 Huddleston and Pullum (: –), however, argue against such an analysis on syntactic grounds, seeing it as an instance of mismatch between semantics and syntax.
when the final constituent is a clause, since him is in both cases the understood subject of the verb in the complement clause. Quirk et al. (: –) address this problem by positing a cline of verb classes according to a small handful of criteria. Huddleston and Pullum () adopt a very different approach to non-finite complement clauses than that of Quirk et al. (). In particular they reject the parallel drawn above between clauses with only nominal complements, as in ()–(), and those with non-finite complements, as in ()–(). They argue that the material differences between the two types, the most important of which is the blurring of the distinction between objects and predicatives, are so great as to preclude their being usefully subsumed into the same set of categories. Instead, they propose to group together constructions containing post-verbal non-finite complements and constructions containing auxiliary verbs. Thus he liked to clean the pool and he liked cleaning the pool would be grouped with he ought to clean the pool and he was cleaning the pool rather than with he liked pools. Conflating these two construction types necessitates the analysis of auxiliary verbs as head of the verb phrase rather than dependents of the following main verb. The correct analysis of auxiliaries has been much debated in the literature. Huddleston and Pullum present various arguments in favour of auxiliaries as heads (: –; section ., this volume). They label all constructions with post-verbal non-finite complements as catenative constructions, of which there are two main subtypes. They distinguish between these according to whether there is an NP between the catenative verb (i.e. the auxiliary or matrix verb) and the verb in the catenative complement. Constructions without such an intervening NP are called simple catenative constructions. Constructions with such an NP are called complex catenative constructions. Non-finite constructions like those in Table . are sometimes said to be ‘deranked’ or ‘desententialized’ compared to finite constructions (Lehmann , Cristofaro : , Croft : ). Also said to be deranked are subordinate clauses in which the verb is in the subjunctive as in () and (), which may be compared to () and (). () The doctor recommended that he get more exercise. () The doctor recommended he get more exercise. () The doctor recommended him to get more exercise. () The doctor recommended getting more exercise. In all four examples ()–() the underlined clauses are complements of the verb recommend, and follow the verb just as a nominal object normally does (compare The doctor recommended exercise). In addition all contain explicit marking of the subordinative relationship: that plus the subjunctive form of get in (), subjunctive get without that in (), the to-infinitive in (), and the -ing form in (). Example () differs from the other three in that the subject of getting is implicit, in the sense that it cannot be identified with any other element in the sentence. We understand it to refer to the doctor’s addressee because the doctor is a definite nominal (if we replace the
doctor with the indefinite doctors, we would understand the subject of getting as people in general). According to Cristofaro (: ), deranking takes two main forms. That is, there are two main differences between verb forms that can occur in main clauses and verb forms that are restricted to subordinate clauses. The first difference is the lack on the part of the latter of one or more distinctions such as tense, aspect, and person. It is this element of subtraction of meaningful components that is behind the term ‘deranking’. The second difference involves the addition of components in the form of marking not allowed in main clause verbs, such as the to of the to-infinitive and the -ing form. Examples ()–() all exhibit the loss of tense. Examples () and () both exhibit the second difference, in the form of the presence of to in the former and -ing in the latter. Unlike finite clauses, non-finite subordinate clauses, which lack a grounding predication of their own (i.e. which are not tensed), depend on the verb in the main clause for their location vis-à-vis the point of utterance.16 Thus, when occurring as a complement, the to-infinitive may be interpreted as coding a general validity or a forwardlooking predication, depending on the type of verb it complements (Egan : ). () She loves to go to London on her weekends off. () She loved to go to London on her weekends off. () He believes her to live in London. () He believed her to live in London. () She plans to go to London on her weekends off. () She planned to go to London on her weekends off. With attitudinal verbs, like love in (), the predication in the subordinate clause has general validity, in the sense that it is likely to be realized by the subject whenever circumstances allow, in the case of (), or allowed in the case of (). Note that the form of the subordinate clause is the same in both examples. The temporal location of the activities in them is dependent on the tense of the matrix verbs. With a verb expressing a judgement, such as believe, a to-infinitive complement codes a situation taken to pertain at the same time as the main verb. In () her living in London is posited as being true at present, in () it is posited as having been true in the past (although it may, of course, also be true in the present).17 With the majority of verbs,
16 For a somewhat different interpretation of the temporal integration of non-finite subordinate clauses see Duffley, sections . and ., this volume.
17 It is also possible to use a perfect form of the infinitive to indicate that the situation in the complement clause pertained prior to the time of the matrix verb, especially in constructions containing seem, appear, and verbs expressing judgements, as in He believed her to have lived in London (see Depraetere and Tsangalidis, this volume). As shown by Bowie and Wallis (), the construction with the infinitival perfect has been in decline for several centuries. It is now archaic with some matrix verbs, such as remember in the pattern I remembered to have seen a photograph of her.
however, a predication in a subordinate clause coded by a to-infinitive is understood to take place, if at all, at a time posterior to that of the main verb. This is the case for mental process verbs like intend and plan, as in () and (), communication verbs like advise and persuade, enablement verbs like allow and help, causation verbs like force and compel, effort verbs like attempt and try, and aspect verbs like begin and start. In the case of some of these verbs there may be some temporal overlap between the matrix verb situation and the situation in the complement clause, but the complement situation may never precede the main verb situation in the case of non-perfective to-infinitives. The bare infinitive resembles the to-infinitive in occurring in forward-looking constructions, as in (), in which the rewriting cannot precede the imposition of force, and same-time predications, as in (), in which the hearing and the tolling are understood to happen at one and the same time. () He made her rewrite her essay. () She heard the bell toll. The -ing form of the verb resembles the bare infinitive in occurring in forwardlooking () and same-time () predications. In addition it occurs with backwardlooking main verbs, as in (), and can also encode mere objects of contemplation, as in (). () She intended breaking the speed limit. () She enjoyed breaking the speed limit. () He denied breaking the speed limit. () She contemplated breaking the speed limit. We saw in .. that when a finite clause functions as a subject it must contain a subordinator, or a wh-word. Similarly, infinitives in subject position require to, as in (), or, in cases where the subject of the to-infinitive clause is mentioned in the clause itself, for . . . to, as in (). -Ing forms, as in (), do not require any mark of subordination apart from the -ing ending itself. () To succeed in this venture was of paramount concern to him. () For him to succeed in this venture came as a complete surprise to her. () Succeeding in this venture was his primary goal at that time. Moreover, when the identity of the subject of a non-finite complement clause is different to that of a matrix verb, as in ()–(), it must be stated explicitly in the construction. () She liked him to visit her mother. () She asked him to visit her mother.
In () and () him has accusative or objective case, but also serves the function of logical subject (trajector in Cognitive Grammar terminology: see Taylor .., this volume) of the verb in the complement clause. Indeed it is the actual, and not just the understood, subject of the complement verb in (), despite its accusative form. One exception to the requirement that the subject be coded explicitly is in the case of communication constructions where the subject is understood as either a specific addressee, as in () (The doctor recommended getting more exercise), or a generic addressee (Doctors recommend getting more exercise). In what are known as same-subject (or simple catenative) constructions, i.e. constructions in which the subject of the matrix verb is also understood to be the subject of the verb in the complement, as in () and (), this subject is not encoded twice. () She liked to visit her mother. () She liked visiting her mother. Many grammarians consider the subject of the non-finite subordinate clauses to be unexpressed but understood in sentences like () and (). Jespersen, for instance, considers the subject of infinitives in sentences like () to be latent or implied. He maintains that in such cases ‘the latent S is obvious because it is the same as the subject [ . . . ] of the (main) sentence’ (Jespersen –, V: ). Formalist grammars label the understood subject of the verb in complement clauses such as these PRO (‘big PRO’; short for ‘pronominal’). The construction is said to exhibit ‘subject control’ with the understood subject of the modifying clause being ‘controlled’ by, i.e. understood to be identical to, the subject of the main clause (see Duffley section ., this volume). An alternative analysis is that She in () and () actually is the subject of both verbs like and visit (Matthews : , Hudson : , Egan : ). There are several tests for subjecthood in finite clauses, such as verb agreement and auxiliary inversion in interrogatives. These tests cannot be applied in the case of constructions containing deranked verbs. We find the same lack of repetition of subjects in clause-level modifiers or adjuncts, as in (). () She slept, to get her strength back. In (), the subject (or logical subject/trajector) of to get, which is coreferential with the subject of the main clause, is not encoded a second time. Again this understood subject would be labelled ‘[PRO]’ in formal representations. To sum up, we have seen in this section that, like finite clauses, non-finite clauses may function as both complements and modifiers on the clause and phrase level. Lacking primary tense, non-finite clauses are largely dependent on the main clause predicate for their temporal interpretation. Various means of signalling subordination have been described. Finally, we have seen that understood subjects in subordinate clauses may be coreferential with various nominals in the main clause.
. Coordination
The term ‘coordination’ denotes a relationship between two or more items, neither of which is subordinate to the other.18 In this section I will concentrate on structures containing just two coordinates. There are generally taken to be three basic coordinating conjunctions in English, conjunctive and, disjunctive or, and contrastive/adversative but. I introduce briefly coordination on the phrase level in .. and on the clause level in .., before considering some more problematic aspects of coordination in ...
.. Coordination at the phrase level
At the phrase level we find coordination of various types of heads and dependents (modifiers/complements). Table . contains just a small illustrative sample of some of these possibilities. All of the examples in Table . contain the conjunction and.

Table . Coordination in and of phrases

                    | Bare heads                    | Heads with dependents
Noun phrases        | young [men and women]19       | [young men] and [older women]; [many young men] and [some older women]
Adjective phrases   | extremely [bright and eager]  | [quite bright] and [extremely eager]
Adverb phrases      | very [slowly and stealthily]  | [quite slowly] and [very stealthily]
Preposition phrases | [up and down] the hills       | [a bit up] and [right down] the hills; [in Britain] and [on the continent]
Verb phrases        | [washed and dried] the dishes | [slowly washed] and [thoroughly dried] the dishes; [cleared the table] and [washed the dishes]

We could replace and with or in the phrases in the table, indicating that just one or other of the two elements,
[washed and dried] the dishes [slowly washed] and [thoroughly dried] the dishes [cleared the table] and [washed the dishes]
18 Note that in some formal approaches the coordinated items are considered to be subordinate to the coordinator, which is then considered to be the head of the construction. Other approaches analyse such constructions as having two (or more) heads, or as having none (see Herbst section .. and Borsley section ., both this volume). 19 On an abstract theoretical level the exact constituent structure of a string like young men and women may be unspecified. In a given utterance context it could be ambiguous, allowing for the interpretation [young men] and women. In practice the context will normally indicate the intended interpretation.
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
but not both, is included in the predication containing the disjunction.20 It would not be equally felicitous to replace and with but in the five types of coordination in the table, since but requires the presence of contrasting predicative elements. If, however, we were to replace bright with dim in the adjective phrases, but gives a more felicitous utterance than and. But may also be used to indicate contrasts between the other sorts of phrases in Table ., noun phrases, as in (), adverb phrases, as in (), preposition phrases, as in (), and verb phrases, as in (). () [young men] but [even younger women] () [quite quickly] but [very stealthily] () [up the ladder] but [down the stairs] () [slowly washed] but [quickly dried] the dishes
.. Coordination at the clause level On the clause level, subjects, objects, both direct and indirect, predicatives and adjuncts may all comprise conjoined items. In addition clauses themselves may be coordinated with other clauses, as in ()–(). () [He cooked dinner] and/but [she put the children to bed]. () They expected [him to cook dinner] and/but [her to put the children to bed]. () They hoped [he would cook dinner] and/but [she would put the children to bed]. In all three examples ()–() we can employ the coordinate and to join the clauses, whether main clauses as in () or subordinate clauses as in () and (). We can also employ but in order to contrast the activities. Using or would sound odd in these examples since there is no reason to think that the two conjoined activities described should not both be realized, involving as they do two different agents. However, it would be natural in an example involving only one agent such as They hoped he would cook dinner or put the children to bed. In examples of this kind, where the agent of the activity in the second conjunct is identical to the agent in the first conjunct, it is common to employ ellipsis. In this instance we have ellipsis in the second conjunct not only of the subject he, but also the modal would. Similarly, if the second conjunct in a sentence like () encodes a similar proposition to the first one, we generally employ pro-forms or ellipsis, as in He cooked dinner and [so did she/she did too].
20
There are also some contexts in which or is not genuinely disjunctive, where it actually implies and. Huddleston and Pullum (: ) give the example They are obtainable at Coles or Woolworths, which means that they are obtainable at both. The reason behind the use of or in such contexts is that one has a choice of shops to visit to obtain them.
When it is a verb that is subject to ellipsis, as in (), we cannot illustrate the relationship between the clausal conjuncts by linear bracketing, as in examples ()–(). () He washed the kitchen and she the bathroom. Example () differs from ()–() in that it only contains one verb, which has two sets of complements, each consisting of a subject and an object. What is conjoined is the action of washing the kitchen by him and the bathroom by her. We can, however, show it as in (). () washed {[he, the kitchen] and [she, the bathroom]}. The process whereby the language user omits to repeat the predicator in cases like () is generally referred to as ‘gapping’ in the literature, because of a perceived ‘gap’, where one would normally expect to find a verb, between the subject and object in the second conjunct (see Huddleston and Pullum : ff.).
.. Non-prototypical coordination Tokens displaying non-prototypical coordination may instantiate constructions containing no coordinators or less prototypical coordinators, or they may contain less prototypical coordinates. Since I am discussing less common phenomena in this section, I will for the most part eschew made-up examples in favour of corpus data, taken from the Corpus of Contemporary American English (COCA; Davies –) and the British National Corpus (BNC). If a string of words contains a coordinator such as and we can be sure that it is related to some other string. In the absence of a coordinator, how do we decide whether two constituents are related? In some cases we may rely on word order or constituent order to signal coordination, as in the case of the two predicative noun phrases in (). () “ . . . He was a good man, an honest man,” said Logan. (Atlanta Journal Constitution , from COCA) Although it would be perfectly possible to add an and between the two noun phrases in (), this is not necessary to signal the coordination. Coordination with an explicit coordinator is called ‘syndetic’, coordination without is called ‘asyndetic’. When it comes to syndetic coordination, Quirk et al. have posited a cline from subordinate to coordinate relationships and proposed various criteria for situating coordinators and subordinators along that cline (Quirk et al. : , Aarts : ). Closest to the three conjunctions on this cline is the conjunct yet, whose distribution is in many respects similar to that of but, as in () and ().
() She was exhausted, but/yet satisfied. () She was running late, but/yet still made it on time. One important difference between but and yet is that one can insert and before yet but not before but. Like determiners, conjunctions are taken to be mutually exclusive. Even the common combination and/or means ‘and or or’ not ‘and and or’. Another conjunct which functions like a coordinating conjunction in many respects is so (Quirk et al. : , Huddleston and Pullum : ), as in () and (). () It’s their big high school test, so all of high school is devoted to that test. (Delta Kappa Gamma Bulletin , from COCA) () I felt alone and missed my mother, so went creeping back into the hall adjoining the dining room. (Red Cedar Review , from COCA) Plus is in some respects even more like a conjunction in that it cannot be preceded by and. It is semantically incompatible with or and but. However, plus, unlike and, always signals more of the same sort of thing. She went upstairs, plus she washed her hair sounds distinctly odd, since these actions are too disparate to allow easily of the sort of additional interpretation coded by plus. Moving from less prototypical coordinators to less prototypical coordinates, we have already noted that coordinators generally link like with like. Huddleston and Pullum (: ) state that ‘In the great majority of cases, coordinates belong to the same syntactic category, but a difference of category is generally tolerated where there is a likeness of function’. They point out that the coordination of different syntactic categories is perfectly acceptable in predicatives. For instance, in () and (), there are noun phrases coordinated with adjective phrases. () She was a happy person, a loving person, caring.
(/ , from COCA)
() My mom was really beautiful and a great person, a nice person. (NBC , from COCA) In (), which displays asyndetic coordination, the noun phrases precede the adjective. In (), which contains both syndetic and asyndetic coordination, the adjective phrase comes first. In the predicative construction in () and () was functions as a copula in relation to all the coordinates. However the verb be has two additional functions, as an auxiliary in both progressive and passive constructions. In the examples of syndetic coordination ()–(), in which the coordinates belong to different categories, be plays different roles in relation to the coordinates. () She told me this later when she was drunk and I was sober and holding her hands and letting her talk. (Carver , from COCA)
() He was crying and lost and had come to the ninth grade lunch instead of the one for his grade. (Crucet , from COCA) () Do you know what it’s like for a parent to wish their child was abducted and being held. (Dateline NBC , from COCA) In () and () be serves both as a copula and a progressive auxiliary, in different order in the two examples. In () it serves as both a passive and progressive auxiliary. Despite these dual roles, the predication in () is readily comprehensible since being in turn functions as a passive auxiliary. Moreover the subject of both verb phrases, their child, plays one and the same thematic role, that of . Similarly, he is the of the predications of crying and lost in (). Example () differs, however, in that the subject I is with regard to sober, but with regard to holding and letting. The fact that () appears a perfectly felicitous utterance shows that thematic equivalence of subject is not a prerequisite for coordinating predications. Another sort of non-canonical coordination involves two forms from the same syntactic category which occur in contexts in which they would not have occurred individually. () Him and me share a tent out at the camp.
(Havill , from COCA)
() So me and him struggled to throw the thing out of the window. (BNC CDG ) Examples () and () contain coordinated heads on the phrase level, which together function as subjects on the clause level. Being plural, they naturally occur with a plural form of the verb in (). This type of coordination is called ‘extraordinary balanced coordination (EBC)’ by Johannessen (: ). It is balanced because both coordinates share the same (objective) form. It is extraordinary because, occurring as they do in subject position, one might expect them to be subjective in form, as in He and I shared a tent. As Johannessen (: ) puts it, ‘EBC is the case where a coordinated structure has different grammatical features from a simplex structure in the same surroundings’. However extraordinary the construction may seem, it is by no means uncommon, especially in spoken English. Rather more unusual is the form of coordination which Johannessen dubs ‘unbalanced coordination’, illustrated here by ()–(). () There’s no bad blood between him and I. (Washington Post , from COCA) () That’s very personal between she and him. (Mayor Marion Barry Trial , from COCA) () You can’t just have we and them.
(PBS_Newshour , from COCA)
()–() contain coordinate phrases in each of which one of the coordinates is subjective in form and the other objective. According to Johannessen (), only the type in () actually occurs in English. She writes ‘I have not encountered any example of unbalanced coordination in English in which both conjuncts were overtly case-marked pronouns, and where only the pronoun in the first conjunct had deviant case’ (Johannessen : ). To my mind, () and () seem equally (in)felicitous to (). The point is by no means moot, in view of Johannessen’s general thesis that in cases of unbalanced coordination in SVO languages, it is always the second coordinate that has deviant case.
. Coordination or subordination?
.................................................................................................................................. Given that a relationship between two strings has been established, it remains to determine whether that relationship is one of equals, instantiating coordination, or of unequals, instantiating subordination. Huddleston and Pullum (: ) point out that ‘while the central or prototypical cases of coordination and subordination are sharply distinct, there is no clear boundary between the peripheries of the constructions’. In some cases the presence of a coordinator or subordinator may point us in the direction of one or other relationship (though here we must be aware of the risk of circular reasoning). Thus the to of the to-infinitive always signals subordination, while the conjunction but always signals coordination. But even in the case of the most prototypical of all coordinators, and, we may still be in doubt. Here I limit the discussion to just three constructions, try and V, go and V, and and. () Metaphors have been used to try and capture the overall reality of education. (Education , from COCA) () We also try and talk to our teachers and try to find out things they are interested in learning so they will take the new knowledge back to their classroom. (Emily Lutrick , from COCA) () Khalid left his home this morning at about :, sent a text message to our office saying that his route was blocked and he would try and find another road. (PBS_Newshour, , from COCA) () They would try to find deserted areas with trees. (Jill Rosenberg , from COCA) The construction in ()–() seems at first sight very similar to the one in (), with try and V being equivalent to ‘try to V’. This similarity might incline one to conclude that the ‘and V’ string is subordinate to the verb try. Indeed Huddleston and Pullum (: ) reach this very conclusion, maintaining that ‘In spite of the and [ . . . ] this construction is subordinative, not coordinative: and introduces a non-finite complement of try’.
Certainly the try and V construction cannot code clausal coordination since the truth conditions of the second coordinate are not identical to those of the first. Example (), for instance, asserts that metaphor has been used to try something, not to capture something. Whether anything has in fact been captured is left in doubt, the predication in the putative ‘and V’ clause being non-factive. However, if this is a case of complementation, why should it be restricted to the base form of try, exemplified by the to-infinitive in (), the first person plural present in (), and the bare infinitive in ()? Why can we not say *he tries and goes or *he tries and go to mean he tries to go? The fact that we cannot do so indicates that the construction may actually instantiate phrasal rather than clausal coordination, in particular the coordination of verb phrase heads. The construction may be illustrated as in () which may be contrasted with the subordinative construction in (). () They will [try and attend] the meeting. () They will try [to attend the meeting]. Quirk et al. (: ) describe the construction as ‘pseudo-coordinative’. I consider it preferable to interpret it as a genuinely coordinative low-level construction in its own right, one in which the second coordinate refers to a proposition, the realization of which depends on the expenditure of effort signalled by the first coordinate. The second construction containing and that seems to straddle a strict subordination–coordination divide is the go and V construction, illustrated here in () and (). () But you had to go and make it complicated.
(Ken Liu , from COCA)
() Well who do you suppose went and married my sister? (NPR Fresh Air , from COCA) Unlike the try and V construction where the ‘and V’ string appears at first sight to resemble the complement of a matrix verb, in the go and V construction it is the ‘go and’ string that appears subordinate to the second verb. Another difference from the try and V construction is that both verbs in go and V may be grounded (tensed), as demonstrated by the past tense morphology of the verbs in (). Moreover, one can omit the ‘go and’ string without altering the truth conditions of the utterance. The semantic contribution of ‘go and’ is limited to the implication of irritation or surprise on the part of the speaker. According to Huddleston and Pullum (: ), ‘In this colloquial use of go, the verb has lost its motional meaning and has a purely emotive role’. The absence of any implication of a change in location is reminiscent of a similar lack in the ‘going to’ future construction, as in I am going to stay on here. And just as the older motion sense of ‘going to’ may still be encountered with a purposive to-infinitive adjunct, as in she is only going to please her mother, so the motion sense of ‘go and’, as in she went and pleased her mother, is still flourishing side by side with the non-motion sense. In the ‘go and’ construction, the verb go has been bleached of its motion sense and the coordinator and has been bleached of its coordinative sense, while the combination
of the two has acquired a new sense, not predictable from the two individual items. The order of the two, with and following immediately upon go, and admitting no intervening elements, has been fossilized. This fossilization, combined with the elements of semantic bleaching and pragmatic enrichment, is typical of grammaticalization in the sense of Hopper and Traugott (). We may conclude that ‘go and’ has grammaticalized as a subordinate marker of speaker attitude. The final and construction to be considered here is exemplified by ()–(). () Make your service articles focused, interested and helpful, and you’ll have your editors coming back for more. (Kelly James-Enger , from COCA) () “One move, buster, and I’ll shoot you with my .”. (Staton Rabin , from COCA) () One more kiss and you’ll drive me crazy. (NPR_FreshAir , from COCA) The construction in () contains two clauses, one imperative, the other declarative, joined by and.21 The initial constituents in () and () do not contain a verb. In all three the constituent before and encodes a condition for the realization of the situation encoded in the final clause. The obvious question is whether we are to assign subordinate status to the first constituent, considering it as a unit up to and including and. The consensus among linguists of both a functional and formal bent is to reject this hypothesis, arguing that the putative subordination is purely semantic, not syntactic (Bolinger : , Culicover and Jackendoff : ). Quirk et al. (: ) classify and conditionals as one of eight types of semantic relationship that may be connoted by and. Some of the other types are illustrated in ()–() with Quirk et al.’s own examples. () He heard an explosion and he (therefore) phoned the police. () I washed the dishes and (then) I dried them. () She tried hard and (yet) she failed. Constructions like () are labelled ‘’, constructions like () ‘’, and constructions like () ‘’ by Quirk et al. (: –). Applying the same sort of reasoning as in the case of and conditionals, one could argue that the first clauses in these three sentences are subordinate to the second ones, insofar as they could be paraphrased Because he heard an explosion, he phoned the police; After I washed the dishes, I dried them; and Although she tried hard, she failed. Quirk et al. (: ) describe more correspondences of this sort. However, they still maintain the distinction between the coordinative constructions and their semantically similar subordinative counterparts. 21
For a discussion of the pragmatic force of such constructions see König section ., this volume.
To sum up, two of the three and constructions in this section have been deemed coordinative, albeit on different levels, the try and V construction on the phrase level, and and on the clause level. The third construction, go and V, has been deemed subordinative, with go and having grammaticalized as a marker of speaker attitude. Other marginal uses of and are discussed in Quirk et al. (: –). The important point here is that and does not always code coordination, which of course is not to say that it, in itself, codes subordination.
. Summary
.................................................................................................................................. This chapter has described the forms and functions of the main types of subordinate clauses, as well as various types of coordination. We have seen that it is not always easy to distinguish coordination from subordination, nor indeed to distinguish between the various sub-types of subordination. We have also seen that grammarians differ in their analyses of some of these phenomena. Subordination and coordination are two very broad categories, and both comprise a wide variety of grammatical relationships. With respect to subordination, I have only looked at subordinate elements in the form of clauses, i.e. constructions containing verbs. We have seen that these verbs may be both finite and non-finite and that both forms of clause can function as complements and modifiers in phrases and clauses. We have also looked at various means of signalling subordination in both finite and non-finite clauses and noted that, while the former are independently grounded, their non-finite counterparts, lacking primary tense, are dependent on the main clause predicate for their temporal interpretation. As for coordination, we saw that both bare heads and heads with dependents may be coordinated and that, although we normally coordinate like with like, the items being coordinated do not necessarily have to be identical in form. I also described some constructions that present problems in classification as either subordinate or coordinate. That we should encounter such constructions is by no means surprising when we are operating at such a coarse-grained level of analysis involving just two categories of relationship. While it would obviously be impossible to engage in grammatical analysis without positing some type of classification, we need to bear in mind that superordinate classes of relationship, such as those labelled subordination and coordination, subsume a multitude of lower-class relationships that differ from one another in various ways.
Acknowledgements
I would like to thank Beck Sinar, internal reviewer Ash Asudeh, and an anonymous external reviewer for their incisive comments on the first version of this paper, and Jill Bowie and Bas Aarts for their very helpful advice on various questions that arose during the revision process.
......................................................................................................................
INFORMATION STRUCTURE
......................................................................................................................
. Introduction
.................................................................................................................................. When we use language, we usually do so to convey information.1 The information we convey is, however, not of equal informational value. Rather there is an ‘informational asymmetry’ (Prince a: ): we can expect some of the information to be already known to the hearer, while other information will be new. This basic distinction plays an important role in the lexico-grammatical expression of utterances, as speakers ‘tailor’ their utterances to meet the perceived mental states of the hearer. Arranging information lexico-grammatically to fit the communicative needs of the situation is generally subsumed under ‘information structure’, a term coined by Halliday (). Other terms used include ‘Functional Sentence Perspective’ (e.g., Firbas ), ‘information packaging’ (Chafe ), and ‘informatics’ (Vallduví ). A useful definition of information structure is given by Lambrecht (: ) as [t]hat component of sentence grammar in which propositions as conceptual representations of states of affairs are paired with lexicogrammatical structures in accordance with the mental states of interlocutors who use and interpret these structures as units of information in given discourse contexts.
In terms of its formal manifestation, information structure comprises the whole range of lexico-grammatical means, including prosody, morphological marking, syntactic structures, and ordering constituents in the sentence. Of particular interest to the study 1 The notion of conveying information of course subsumes not only assertions but also other speech acts such as requests, commands, promises (e.g., Gundel ). As noted by Lambrecht (: ), for instance, ‘by asking a question, a speaker may inform his addressee of his desire to know something’. A detailed discussion of different speech acts, however, goes beyond the scope of this chapter. For more information on speech acts see Chapter .
of information structure are word-order variations, so-called ‘information packaging constructions’, which express the same proposition in formally and pragmatically divergent ways. Consider, for example, the following simple event: the transfer of a book from John to Mary. This can be expressed linguistically in a number of different ways, including those in (). ()
a. John gave Mary a book.
b. John gave a book to Mary.
c. Mary was given a book by John.
d. It was a book that John gave to Mary.
e. What John gave (to) Mary was a book.
f. A book was what John gave to Mary.
g. John, he gave Mary a book.
To capture the specific functions of each constructional variant, the study of information structure makes use of categories, such as ‘given/new’ information, ‘presupposition’, ‘topic’, and ‘focus’. There is, however, no generally agreed set of categories, and the terms used, as well as their definitions, differ widely to the extent that some scholars have lamented the terminological profusion, and indeed confusion, prevalent in this field (Levinson : x). This chapter will therefore foreground areas of agreement, while still comparing different approaches and pointing out differences between them. Given the limitations of space, an overview of such a diverse field will, however, necessarily be incomplete. The chapter is organized as follows: section ., first of all, examines the notions of ‘given’ and ‘new’ information, more specifically presupposition and assertion (..) and the activation of discourse referents (..). Sections . and . discuss the concepts of topic and focus respectively. Section . highlights some principles that may affect the structural realization of information structure, end-focus (..), end-weight (..), and the strategy of information status construal (..). Section . then surveys major information packaging constructions, such as those illustrated in the examples above. The conclusion in section . briefly looks at the position of information structure in the study of grammar more generally.
. Given and new information
.................................................................................................................................. The notions of given and new have been discussed under a range of different labels: ‘given’ information has been referred to, for instance, as old, known, ground, anaphoric, context-dependent, activated, salient, or familiar information, and there is no uniform definition of the concept of givenness. In section .. we will look, first of all, at givenness and newness at the propositional level, i.e. presupposition and assertion. Section .. discusses the givenness and newness of discourse referents, more specifically their different degrees of (assumed) activation in the mind of the hearer.
.. Presupposition and assertion For Lambrecht () all information is stored in the form of propositions. Propositions are, however, not understood in a narrow logico-semantic sense as something that is true or false, but rather as ‘conceptual representations of states of affairs’ (Lambrecht : ). These propositions are structured in terms of a speaker’s assumptions about the hearer’s current state of mind. ‘Old’ information accordingly consists of all the propositions evoked by a sentence which the speaker assumes are already available in the hearer’s mind at the time of utterance, that is, propositions which the hearer is assumed to know. These are referred to as ‘presuppositions’ (or more precisely ‘pragmatic presuppositions’). ‘New’ information, on the other hand, is the proposition which the speaker thinks his/her sentence contributes to the hearer’s store of propositions. This is referred to as ‘assertion’. Lambrecht’s view of presupposition closely corresponds with that of other approaches although they use different labels, such as ‘common background belief ’ (Stalnaker ), ‘speaker presupposition’ (Kempson , Stalnaker ), ‘common ground’ (Stalnaker ), and ‘antecedent’ (Clark and Haviland : ). Such a propositional view of information has important implications for the identification of ‘new’ (asserted) information, as an uttered sentence cannot be neatly subdivided into old and new components. What is being asserted is not just the nonpresupposed proposition but a proposition (or propositions) which combines old and new components, since ‘what is new is normally new with respect to something that is already given’ (Lambrecht : ). In this sense assertion is relational with regard to presupposition. Take an example such as (): What is asserted (new) in B’s response is not strictly speaking the word wallet but the state of affairs that ‘John found a wallet in the street’, i.e. a whole proposition which relates the word wallet to some given proposition (‘John found something in the street’) to create a new proposition unknown to the hearer (capitals indicate a stressed word). () A: What did John find in the street? B: He found a WALLET in the street. This is different from approaches where an informational value may be attached to individual lexical and phrasal elements of the sentence, such as wallet in (). In a propositional view such as Lambrecht’s, the information expressed by a sentence is not neatly segmentable into sentence constituents with each being identified as either old or new. Propositions also play a role in other approaches to information structure but usually only in connection with a specific class of focus–presupposition constructions, such as it-clefts, wh-clefts, inversions, and preposings (e.g., Chomsky , Jackendoff , Culicover and Rochemont , Prince , Birner and Ward ), which will be discussed in section .. These structure the proposition they express into two parts: (i) a so-called ‘open proposition’, which typically conveys information that is already
assumed to be part of the common ground (Vallduví ), and (ii) the instantiation of a variable in the open proposition, which represents new information and is identified as the focus. In a sentence such as (), for instance, the open proposition is ‘I like X’ (which is equivalent to the presupposition ‘I like somebody’),2 with the focus being ‘X = Graham’. () It is GRAHAM I like. The concept of focus, however, differs from Lambrecht’s notion of assertion and will be discussed in more detail in section ..
.. Activation In the previous section we looked at givenness and newness on the clausal-propositional level. But information status is also—probably more commonly—associated with discourse referents (typically expressed by noun phrases), which we will investigate in the present section. Discourse referents, in Lambrecht’s view (: ), are entities, typically expressed by noun phrases, but they may also be propositions, if expressed by a subordinate clause or a pronoun. Birner and Ward (: ), by comparison, include in their analysis of sub-propositional information status not only entities but any element which is part of a larger proposition, including attributes and relations. Despite differences in identifying the actual carrier of information status there is some agreement in classifying possible degrees of activation of an information unit. Lambrecht (: ) proposes the taxonomy in Figure ., which builds on various
[Figure .: Lambrecht’s () assumed mental representation of discourse referents (alternative terms by Prince and Chafe in brackets). Unidentifiable (Brand-new) referents are either Unanchored or Anchored; Identifiable referents are Inactive (Unused), Accessible (Semi-active) (in turn Textually, Situationally, or Inferentially accessible), or Active (Given).]
2 Although the notions of open proposition and presupposition are often conflated, it has been argued (e.g., Dryer ) that only full propositions (e.g., I like somebody), not open ones (e.g., I like X), can be believed and, therefore, presupposed (cf. also Birner : ).
previous classification systems, notably those of Chafe (e.g., , , ) and Prince (a) (cf. also Kuno , Clark and Haviland , Allerton ).3 Lambrecht’s classification of seven possible states of the assumed mental representation of discourse referents involves two main categories: (i) identifiability, the speaker’s assumption whether the hearer ‘knows’ an entity or state of affairs, in the sense that it already has a representation in the hearer’s mind, and (ii) activation, the speaker’s assessment whether and to what degree an identifiable referent is ‘activated’, that is, in the consciousness of the hearer at the time of the utterance. This is because ‘[k]nowing something and thinking of something are different mental states’ (Lambrecht : ). Let us briefly look at the different categories in turn. Unidentifiable entities, or brand-new ones in Prince’s (a) terms, can be of two types: either unanchored, as in (a), or anchored, as in (b), where the NP I acts as the anchor that links the new expression a guy to a given discourse entity (Prince a: ). Inactive referents, or unused in Prince’s (a) terminology, are typically expressed by an accented lexical (i.e. non-pronominal) phrase, such as Chomsky in (c). Active referents, on the other hand, are fully given in the sense that the addressee is assumed to be thinking about them, often because the referent is given in the immediately preceding text or situation of discourse. They are typically, but not necessarily, coded pronominally with lack of prosodic prominence, such as we in example (d). Accessible or semi-active referents, finally, can be of three types: (i) textually accessible, where a previously active referent has become deactivated (after not having been mentioned for a while in the discourse), as in (e), (ii) situationally accessible, where a referent is accessible on account of its presence in the text-external world, as in (f), and (iii) inferentially accessible, where a referent can be inferred from a cognitive schema or frame, such as ‘house’ in (g). () a. I heard something TERRIBLE last night. (Brand-new unanchored) b. She talked to a GUY I work with (Brand-new anchored) c. Noam CHOMSKY gave a talk at the Royal Academy. (Inactive/unused) d. We went to Paris last weekend. (Active/given) e. I talked to a policeman in my street last night . . . [intervening stretch of text] . . . The policeman told me there had been a burglary. (Textually accessible) f. Those pictures are ugly. (Situationally accessible) g. We’ve bought a new house. The kitchen needs to be (Inferentially accessible) refurbished though. The inferential link of an entity to some previously activated referent, as in (g), has been explored in more detail by Ward (), Ward and Prince (), and Birner and Ward (). They argue that this link involves a contextually licensed so-called 3 For a somewhat different classification system compare Gundel et al.’s () Givenness Hierarchy for referring expressions, which foregrounds how the different categories are implicationally related to each other.
Table . Prince’s () model of hearer-status and discourse-status and correspondences with Lambrecht () and Prince (a)

             Discourse-new    Discourse-old
Hearer-new   Brand-new        (non-occurring)
Hearer-old   Unused           Accessible, active
‘partially ordered set’ relationship, which can take various forms. For example, in (g) above the kitchen is in a ‘part–whole’ relationship to the ‘trigger’ entity house, since kitchens are typically part of houses. Other types are, for instance, entity–attribute, set–subset, and equality relations. Birner and Ward’s (e.g., ) taxonomy of information status, however, differs from Lambrecht’s as they adopt Prince’s () classification system, which is a further development of her ‘assumed familiarity’ model. In her taxonomy, Prince distinguishes more clearly between hearer- and discourse-status of an entity with the categories of her original model reappearing in a matrix of crosscutting dichotomies: discourse old/new and hearer old/new (see Table .). Prince’s model has been modified by Birner (, inter alia) to the extent that inferentially accessible referents are located in the (previously empty) discourse-old/hearer-new category.
. Topic
.................................................................................................................................. The concept of ‘topic’ has probably received the most wide-ranging interpretations of all information structuring concepts. It usually figures in a pair of complementary notions, such as ‘topic–comment’, ‘topic–focus’, and also under the guise of ‘theme’ in a ‘theme–rheme’ structure (used mainly by the Prague School and Halliday).4 The most common characterization of Topic is in terms of ‘what the sentence is about’.5 This definition can be traced back to the notion of ‘psychological subject’ used by grammarians of the nineteenth century (e.g., von der Gabelentz : –), which in turn 4 For a useful overview of different approaches to the concept of topic see e.g., Gómez-González . I am disregarding here the notion of discourse-topic, which is much wider in that it covers also larger text passages. It has been defined in terms of immediate concern (e.g., Keenan-Ochs and Schieffelin : –) of a particular stretch of discourse, or in terms of what it is about (e.g., Brown and Yule : –). 5 A different view is expressed, for instance, by Halliday, whose term topic refers to one particular type of theme, which is defined as ‘the point of departure of the message’ (Halliday : ), i.e. the leftmost constituent of the sentence. The close theoretical association of theme with initial position has, however, received considerable criticism (e.g., Fries , Downing , Huddleston , ).
goes back to Aristotle. A classic definition in terms of aboutness is given by Hockett (: ): ‘the speaker announces a topic and then says something about it’. The notion of aboutness has been adopted by various other scholars (inter alia Strawson , Kuno , Gundel , Dik , Downing ) and has become the most pervasive of all accounts of topichood.6 Reinhart () compares this use of topic to a file card with a particular heading under which the comment is stored. Thus, sentence (a) expresses information to be stored under Adele, whereas in (b) it is to be stored under One of the most successful pop singers.7 ()
a. Adele is one of the most successful pop singers. b. One of the most successful pop singers is Adele.
As shown by (), aboutness topics are frequently associated with clause-initial position. But this is not necessarily so: compare (), where B’s response is about the children. () A: Where are the children? B: The nanny took them to the park.8 Although the concept of aboutness is notoriously difficult to identify, various tests have been proposed, such as the ‘as for’ test (As for Adele, she . . . ) and the ‘said about’ test (He said about Adele that . . . ) (e.g., Gundel , Reinhart b). They have, however, been criticized for only limited applicability (e.g., Prince , Ward ). Lambrecht (: e.g. ) is one of the linguists who uses ‘topic’ in terms of ‘aboutness’ but gives it a new ‘relational’ definition.9 It is relational in the sense that it pragmatically construes a relation between a referent and a proposition such that the proposition is interpreted as ‘being about’ that referent. Such an aboutness relation is highly context-dependent and cannot be identified for sentences in isolation. Compare the examples in (), where only (a) has a topic in Lambrecht’s sense. In (b), there is no topic in Lambrecht’s account (: ff.); rather, the entire proposition is in focus (note that in other accounts such all-new sentences have been analysed as having a topic that is its spatio-temporal index; e.g., Gundel ).
6 The concept of topic (together with focus) has also been extensively discussed in the domain of generative syntax and formal semantics/pragmatics. For useful overviews see e.g., Krifka , Büring , and Vallduví . 7 Note, however, that reverse specificational sentences such as (a) need not have the topic in initial position but may also have comment + topic structure (see Heycock and Kroch , den Dikken , and Hedberg and Fadden ). 8 I am grateful to the anonymous reviewer for this example. 9 A similar relational view of topic and focus has been proposed by Gundel (e.g., ), who distinguishes relational givenness–newness (i.e. topic–comment) from referential givenness–newness (see section .. above). Her account differs from Lambrecht’s, however, in that it is not explicitly proposition-based and in that topic is seen as a complementary notion to comment.
() a. (What did the guests do next?) The guests went to BED. b. (What happened?) The GUESTS went to BED. As an essentially pragmatically construed relation, the notion of topic is not tied to a specific form of codification. Although topic expressions usually have low pitch, as in (a) above, and often take the form of pronouns in sentence-initial position, neither of these features is a necessary requirement for topics (see also section .. for typical topic structures). A topic referent does, however, need to be pragmatically accessible in the discourse for a proposition to be construable as being about it (see Lambrecht : ; section .. below on construal).10 It follows from this that the more cognitively activated a discourse referent is (e.g., active, accessible in the taxonomy in Table . above), the easier it is to be interpreted as a topic. Unactivated (e.g., brand-new) discourse referents, on the other hand, cannot easily be construed as topics.
. Focus
.................................................................................................................................. Like topic, the concept of focus has received a number of different interpretations. Despite the differences in definition it is generally agreed that in English focus is typically signalled by prosodic prominence, such as a nuclear accent (also called tonic or pitch accent), and represents information that is new, or in some sense ‘relevant’ or ‘important’ (e.g., Quirk et al. : , Sgall et al. , Büring : , Winkler ). What exactly is understood by these terms of course varies widely. For Ward and Birner (: ), for instance, it is ‘that portion which augments or updates the hearer’s view of the common ground’. Similarly, for Jackendoff () it is the information that is not shared by speaker and addressee. Erteschik-Shir (: ), on the other hand, defines focal information more in terms of speaker intentions as the constituent of the sentence ‘which the speaker intends to direct the attention of his/her hearer(s) to’. This is reminiscent of Halliday’s (: f.) definition of information focus as ‘one kind of emphasis, that whereby the speaker marks out a part (which may be the whole) of a message block as that which he wishes to be interpreted as informative’.11 The concept of focus is usually seen as part of a complementing pair, such as presupposition–focus, topic–focus, ground–focus (see e.g., Vallduví , Vallduví and Engdahl for overviews). A different approach is taken by Lambrecht (), who defines focus in relational terms as adding new (unpredictable, non-recoverable) Compare also Gundel’s (e.g., ) Topic-Familiarity Condition, which stipulates that the referent of a felicitous topic must be already familiar in the sense that the addressee has an existing representation in memory. 11 A somewhat different take on focus is provided by Rooth () in the framework of Alternative Semantics, adopted for instance by Krifka (e.g., : ), who proposes that ‘[f]ocus indicates the presence of alternatives that are relevant for the interpretation of linguistic expressions’. 10
information to the pragmatic presupposition and thereby turning it into an assertion. The focus is thus ‘the element of information whereby the presupposition and the assertion differ from each other’ (Lambrecht : ). In an example such as (), what is presupposed in B’s reply is the proposition ‘Speaker went to some place’. The new information or assertion, on the other hand, is the abstract proposition: ‘The place the speaker went to for holiday was Mallorca’. The focus is therefore Mallorca, as this is what makes the assertion newsworthy: only by adding Mallorca to the proposition and construing a pragmatic relation to it does B’s reply become informative. () A: Where did you go for your holiday? B: I went to MALLORCA. Depending on which part of the sentence is in focus in a given context, Lambrecht (: f.) distinguishes three conventional types of focus-structure, as illustrated in () to (): (i) ‘predicate focus’ used for focusing on the entire verb phrase broke down, thus predicating a property of a given topic (i.e. the car), (ii) ‘argument focus’ for identifying an argument for a given proposition, and (iii) ‘sentence focus’, where the whole sentence is in focus, for reporting an event or introducing a new discourse referent.12 Note that sentence-focus structures do not have a topic in Lambrecht’s framework. The same is true for argument-focus, which is for Lambrecht (: ) a case of open proposition. ()
A: What happened to your car?
B: My car (it) broke DOWN. (Predicate-focus)
() A: I heard your bike broke down?
B: My CAR broke down. (Argument-focus)
() A: What happened?
B: My CAR broke down. (Sentence-focus)
As a pragmatic category which refers to different kinds of pragmatically structured propositions, focus (like topic) is not per se tied to any specific formal realization. In English the three focus types are mainly marked prosodically by accent placement. Lambrecht’s argument-focus can therefore be related to what is sometimes called ‘narrow focus’, where a word is singled out by accent placement, often for contrastive purposes, while his sentence-focus corresponds with ‘broad focus’, where the whole sentence is in focus (e.g., Ladd , Selkirk ). Note, however, that Lambrecht’s argument focus includes ‘contrastive focus’ (as suggested by the contrast of car with bike in example ), which in other approaches is identified as a separate type of focus (e.g., Chafe ). The special status of contrastive focus can be justified by its distinctive prosodic form, such as a steeper fall (e.g., Selkirk , Gussenhoven ). Accent placement may, however, not always mark the focus unambiguously. In a sentence such as Mary bought a car, for instance, the nuclear accent will be on the same
12 One of the advantages of Lambrecht’s approach is that it accounts for the fact that the three focusstructure types are consistently expressed in distinct formal categories across languages (although in some languages, including English, argument-focus and sentence-focus are (near-)homophonous).
word (car) for all three of Lambrecht’s focus types as well as for narrow and broad focus. Compare the following: ()
A: What did Mary do?
B: Mary bought a CAR. (Predicate-focus)
() A: What did Mary buy?
B: Mary bought a CAR. (Argument-focus)
() A: What happened?
B: Mary bought a CAR. (Sentence-focus)
There are two reasons for this overlap. First, accented constituents are not necessarily coextensive with focal elements. This problem has been discussed in the generative literature under the terms of ‘focus projection’ (e.g., Selkirk , Rochemont ) or ‘focus integration’ (e.g., Jacobs ) in response to examples such as the ones above where the nuclear accent is on the rightmost element in a verb phrase but where the focus may be on either the verb phrase as a whole (bought a car), the object noun phrase (a car), or the entire sentence. Second, in English the ‘default’ position for nuclear accent placement (broad focus) is on the last content word of the sentence (e.g., Chomsky and Halle ). To avoid any ambiguities, speakers may however resort to other means of focus marking: lexically, with the help of focus particles such as only or even (e.g., Jackendoff , König ), or structurally with the help of specific focus constructions such as clefts and ‘focus movement’ (see section .).13
. I
.................................................................................................................................. Having looked at the central concepts of information structuring in the preceding sections, let us now take a closer look at how they interact with the level of syntactic form, that is, how they conventionally manifest themselves in morphosyntactic structure. Before turning to some specific construction types in section ., this section gives a brief overview of some general principles: the end-focus principle (..), the end-weight principle (..), and lexico-grammatical construal (..).
.. End-focus As noted above, the concepts of topic and focus are not seen in most accounts as being tied to a specific formal realization. Nonetheless, for English there is a systematic 13 Note, however, that focus particles may add more to the meaning of a sentence than specifying the focus. Moreover, their association with different types of focus may lead to truth-conditional differences, as in John only showed Mary the PICTURES versus John only showed MARY the pictures (see Krifka : ). Similarly, focus constructions such as clefts can also be ambiguous as to their focus marking (e.g., Hedberg ).
correspondence between these pragmatic concepts and their grammatical and phonological realization: topic correlates with givenness, preverbal position, and lack of pitch prominence, while focus correlates with newness, postverbal position, and pitch prominence (e.g., Lambrecht : ). This ordering preference was originally discussed by the Prague School under the heading of ‘Functional Sentence Perspective’, then taken up by Halliday in his ‘Thematic structure’, and is probably best known by the name of ‘end-focus principle’ or ‘given-before-new principle’ (e.g., Leech , Quirk et al. , Gundel ). It is illustrated by the following example where the end-focus principle correctly predicts that (a) is a more appropriate response to the question than (b). ()
What happened to John? a. He was interrogated by the police. b. The police interrogated him.
This given-before-new arrangement of information can be accounted for in cognitive terms, as ‘given’ information anchors the sentence to already existing knowledge and in this way acts as an address in memory which indicates where the new information is to be integrated. It thus facilitates both the production and processing of new information. Clark and Haviland () refer to this conventionalized way of arranging information as the ‘given-new contract’ between speaker and listener: an implicit agreement between the interlocutors based on the co-operative principle (Grice ). Although givenness and topic (in the aboutness definition, as discussed above) represent distinct informational categories, they tend to coincide in initial position, typically occupied by the subject (e.g., Strawson , Prince a, ). Diachronically, the relation between subject and topic has been emphasized by Givón (e.g., a), who takes subjects to have derived from grammaticalized topics (although he defines topics in terms of discourse continuity rather than aboutness). As we will see in section ., end-focus is an important principle which explains the existence of special word-order variants, such as the preference for the passive sentence in (a) above. However, end-focus is not the only ordering principle. As Givón (e.g., ) has argued, there is also an alternative principle, which he calls ‘first-things-first’ (cf. also Gundel ). It stipulates ‘attend first to the most urgent task’ (Givón : ) and therefore may involve a ‘new-before-given’ order. What seems to be a contradiction to the end-focus principle may however be explained by a greater degree of spontaneity or lack of planning.
.. End-weight Another factor evoked in connection with the linear ordering of sentences is that of syntactic weight. It figures under various terms, such as ‘end-weight principle’ (Quirk et al. : ), ‘light subject constraint’ (Chafe ), ‘principle of increasing
complexity’ (Dik : ), and ‘principle of relative weight’ (Jespersen –, VII: ). They all stipulate that the relative weight of a constituent affects its position in a clause or sentence with lighter/shorter constituents typically preceding heavier/longer ones. For constructional variants as in () this means that the extraposed version in (a) will normally be preferred over the non-extraposed version in (b) owing to the length difference between matrix predicate and subordinate clause. () a. It’s strange that Mary went to Paris. b. That Mary went to Paris is strange. Although the concept of weight is in principle distinct from that of information structure, there is a close correspondence between the two, as information which is new often needs to be stated more fully and can therefore be expected to be longer. Given information, by contrast, can be referred to by short and simple expressions, often pronouns. The typical given-before-new distribution of information in the clause, as discussed in the previous section, is therefore in accordance with the principle of end-weight. The distribution of weight can be viewed from two different perspectives, that of the speaker and that of the hearer. Hawkins (e.g., ), for instance, adopts a hearer-oriented approach and claims that ‘words occur in the orders they do so that speakers can enable hearers to recognize syntactic groupings and their immediate constituents . . . as rapidly and efficiently as possible’ (Hawkins : ). This recognition is facilitated by a principle of Early Immediate Constituents, which ensures that the overall constituent structure of the sentence can be parsed as early as possible by the hearer. Wasow (), on the other hand, adopts a speaker-oriented approach. For him the main reason for postponing complex constituents is that it facilitates utterance planning and production. The listener thus prefers early commitment to make the remainder of the sentence more predictable, while the speaker prefers late commitment, which keeps options open and buys time for further planning (cf. Wasow : ). In most cases, however, the two perspectives can be expected to ‘conspire’ towards the same aim: placing light constituents before heavy ones.
.. Lexico-grammatical construal of information status As noted above, there are certain conventionalized ways of presenting information in linear grammatical form. These coding preferences include above all the tendency for given information to occur early in the clause as the topic of a sentence, typically in pronominal, unaccented form, while new information tends to be non-initial, non-pronominal, accented, and sentence focus. The firmly established nature of this convention affords the speaker a useful rhetorical strategy: construing a referent’s information status by grammatical means independent of its actually assumed degree of activation, provided the hearer is able and willing to accommodate it.
Consider the following example (adapted from Lambrecht : , ; capitals indicate stressed words). ()
Remember MARK, the guy we went HIKING with, who’s GAY? a. His LOVER has won the LOTTERY. b. I ran into his LOVER yesterday, and he told me he had won the LOTTERY.
The two versions differ in their pragmatic construal of his lover. Version (a) presents his lover in prototypical topic position (as subject), stressed but without nuclear accent, and as a potentially identifiable referent (Mark’s lover). As such, it construes the referent of his lover as in some form ‘given’ (accessible), irrespective of whether the hearer actually knows that Mark has a lover. Version (b), in contrast, construes his lover as ‘new’ (inactive, unused) information by virtue of its postverbal focus position and nuclear accent. Thus, while (b) informs the hearer that Mark has a lover, (a) ‘conveys a request to the hearer to act as if the referent of the NP were already pragmatically available’ (Lambrecht : ) and update the common ground accordingly, if necessary. Thus (a) requires ‘pragmatic accommodation’ on the part of the hearer (see Lambrecht : ). For such accommodation to be possible, however, the referent needs to be at least pragmatically accessible in a given discourse.
. Information packaging constructions
.................................................................................................................................. Although English has a fairly fixed SVO word order—historically conditioned by the loss of most case endings—it does allow for some limited word order variation. This section gives a brief overview of some of the non-canonical constructions used for structuring information, also called information packaging constructions. They have been grouped into structures involving fronting of canonical elements (..), postponement of canonical elements (..), argument reversal (..), and clefting (..). Since they all have canonical word-order counterparts which are semantically (but not pragmatically) equivalent, they are often considered as members of a pair of so-called ‘allosentences’ (Daneš , Lambrecht ), such as active–passive, clefted–canonical (Halliday e.g. uses the term ‘thematic system’ for these). Restricting their analysis to a simple comparison with their canonical counterpart is, however, problematic as it ignores potential functional overlaps between information packaging constructions. It should be noted that the terms fronting, postponement, reversal and clefting are used here simply to refer to deviation from canonical SVO word order. They do not imply a transformational movement account which derives word order variation from an underlying, more basic construction, as is the case in generative models of grammar (see Chapter ). In many cognitive-functional accounts, such as Construction
Grammar, on the other hand, each of these structures is taken as a construction in its own right (see Chapter ). There is some evidence, however, that speakers may store generalizations of a common meaning between formally different constructions, which would suggest a common category even in constructional terms (Perek ). It should also be noted that, despite the close association of these constructions with specific information-structure patterns, their felicitous use cannot be reduced to information structure alone. For a complete description of their functional potential other factors also have to be taken into account, such as stylistic and subtle semantic differences from their canonical counterparts, which cannot be discussed here for reasons of space.
.. Fronting: left-dislocation and topicalization Left-dislocation and topicalization both involve placing an argument, typically a noun phrase, in the left-periphery of the clause. The two constructions differ in that left-dislocation fills the original argument position with a personal pronoun (which is coreferential with the preclausal constituent), as illustrated in (), whereas topicalization, as in (), does not.
‘My sister got stabbed. She died. Two of my sisters were living together on th Street. They had gone to bed, and this man, their girlfriend’s husband, came in. He started fussing with my sister and she started to scream. The landlady_i, she_i went up, and he laid her out. So sister went to get a wash cloth to put on her . . . ’ (Prince : ) (left-dislocation)
() A: Do you watch football? B: Yeah. Baseball I like a lot better (Birner and Ward : ) (topicalization) Topicalization, as the name suggests, is generally thought to mark the preposed constituent as the sentence topic (e.g., Halliday , Gundel , Reinhart b), but this seems to be too restrictive a view (Birner and Ward : ), not least since preposing, unlike left-dislocation, is possible with elements other than noun phrases, such as the adjective phrase in (). ()
Stupid he is not.
In informational terms, two types of topicalization can be distinguished. In one use, such as () above, the preposed element is given information, more precisely accessible or discourse-old. In the other use, often called ‘focus movement’ (Prince b), the preposed element is the focus, with the rest of the utterance representing a salient open proposition, as illustrated in () below. In both cases, however, the preposed element stands in a set relationship to previously evoked information (section ..; Ward ).
()
I saw Shakespeare last night. MacBETH it was. (Birner and Ward : )
Left-dislocation is also topic establishing (e.g., Lambrecht : ff.), but differs from topicalization in that it does not involve the option of an open proposition and, in its most typical use, has a left-dislocated element that introduces new information (e.g., Geluykens ). This is illustrated by example () above, where left-dislocation is a useful strategy for avoiding a new referent in subject position (see section ..). There is also another use, however, identified by Prince () as marking the preclausal noun phrase as contrastive and inferrable. As such, it stands in a set relation to some previously evoked entity, such as ‘member of’ in example (). ()
‘My father loves crispy rice’, says Samboon, ‘so we must have it on the menu. And Mee Grob, too, he loves it, just as much. Mee Grob ($.) is a rice noodle’ (Prince : )
There is thus some functional overlap between topicalization and left-dislocation, as both may involve an inferrable pre-clausal noun phrase (Gregory and Michaelis ). At the same time, the two constructions have been shown to differ syntactically and prosodically, with topicalized elements being syntactically integrated in the following clause, while left-dislocated elements are syntactically independent ‘orphans’, which therefore have more in common with parenthetical expressions (Shaer and Frey ).
.. Postponement: existential there, indirect object shift, right-dislocation, it-extraposition Postponement (with the exception of right-dislocation) tends to mark the postponed constituent as new (Birner and Ward : ), thereby bringing a construction in line with a given-before-new distribution of information (section ..) and shifting a heavy constituent to the right (section ..). In an existential there construction the logical subject NP has been postponed by the use of expletive there in canonical subject position. Existential there thus functions as a ‘presentative signal’ (Breivik ) which allows for the introduction of a ‘new’ discourse referent in non-initial position, as in () and (). In Lambrecht’s (: ff.) terms, existential there is a topic-promotion construction which serves to introduce a brand-new or unused referent into the discourse. () a. There is an otter in the garden. b. There is plenty of milk. ()
Famous men came – engineers, scientists, industrialists; and eventually, in their turn, there came Jimmy the Screwsman and Napoleon Bonaparte . . . (Birner and Ward : )
We can distinguish two types of existential there constructions based on the type of verb they contain: (i) existential there-sentences proper, which contain be as their main verb, as in (), and (ii) presentational there-sentences, which contain some other verb, as in () (e.g., Rochemont and Culicover , Martínez Insúa ). These two types have been shown to differ also in their information structure (Birner and Ward : ): while both introduce a ‘new’ noun phrase referent, it is always ‘hearer-new’ in existential there-sentences whereas presentational there-sentences are less restrictive, admitting referents that are hearer-old as long as they are discoursenew. Moreover, existential there-sentences proper can be further divided into extended existentials, with an extension, typically in the form of a locative complement, following the postponed logical subject, as in (a), and bare existentials, lacking such an extension, as in (b) (e.g., Huddleston and Pullum : –). Note that only the extended existential has a canonical counterpart (An otter is in the garden) but not the bare existential (*Plenty of milk is). Indirect object shift, also known as dative movement, ‘shifts’ a ‘new’ indirect object (the recipient or beneficiary) out of its canonical postverbal position to the right of the clause, where it takes the form of a prepositional phrase, as in ().14 It thus represents a positional variant for a ditransitive (‘double-object’) construction, as in (). This pair of constructional variants is also known as dative alternation and includes the special case of benefactive alternation, which involves the preposition for: e.g., He baked a cake for Mary. () A: Who did the waiter give a drink to? B: The waiter gave it to Mary.
(indirect object shift)
() A: What did the waiter give to Mary? B: The waiter gave her a drink.
(ditransitive construction)
Various studies have identified a preferred information structure for the two alternants: the first object is typically discourse given, pronominal, animate, definite, and shorter than the second object. For the second object the reverse applies: it is discourse-new, nonpronominal, inanimate, indefinite, and relatively long and grammatically complex (e.g., Erteschik-Shir , Thompson , Collins , Arnold et al. , Bresnan et al. ). Compare, for instance, the corpus examples in the (a)-examples below and their constructed, less natural counterparts in (b): 15 () a. They gave me a lotion that wasn’t as good as the cream (ICE-GB:sa--) b. ?They gave a lotion that wasn’t as good as the cream to me.
14 The syntactic status of the to-PP phrase has been analysed in different ways: as a special realization of an indirect object (e.g., Biber et al. ), as a complement in a monotransitive clause (e.g., Huddleston and Pullum : ), or as part of a larger (Transfer)-Caused-Motion construction (Goldberg ). 15 ICE-GB stands for the British component of the International Corpus of English (Nelson et al. ). The contrast in () is discussed in Aarts (: ).
() a. And so Bob drafted this questionnaire and gave it to Dick (ICE-GB:sa--) b. ??And so Bob drafted this questionnaire and gave Dick it. The choice between the dative alternants, in other words, enables the speaker to bring the utterance in line with the cognitive principles of end-focus and end-weight. Information structuring is, however, not the only factor determining the choice. The two alternants have been shown, for instance, to differ in verb semantics, with ditransitives preferring verbs of direct face-to-face transfer and indirect object shifts preferring verbs of transfer over a distance (e.g., Thompson and Koide , Gries and Stefanowitsch a). Other researchers argue for distinct constructional meanings of the two variants (e.g., Goldberg ; see also Chapter , section .. in this volume). In contrast to existential there and indirect object shift, right dislocation has a clause-final noun phrase which is discourse-old. More precisely, it conveys ‘information that has been evoked, either explicitly or implicitly, in the prior discourse’ (Birner and Ward : ). This can be seen in example (), where the pipes have already been mentioned in the preceding text. ()
Below the waterfall . . . a whole mass of enormous glass pipes were dangling down into the river from somewhere high up in the ceiling! They really were ENORMOUS, those pipes. (Birner and Ward : )
As with left-dislocation, there is a coreferential pronoun in the canonical position of the dislocated element. The clause is thus grammatically complete, with the final noun phrase being added to give the referent of the pronoun in full. Not surprisingly, therefore, right-dislocations have often been characterized as repair mechanisms (e.g., Tomlin , Geluykens ). Other studies, however, distinguish between right-dislocation and afterthought (repairs), such as Ziv and Grosz (), who argue that afterthoughts are separated from the clause by a pause whereas right-dislocations are not. Similarly, Birner and Ward (: ) point out that there is not necessarily a need for referent clarification, as in example () above, where the pronominal referent is unambiguous. Rather, the final noun phrase represents discourse-old (accessible) information which may be topical. For Lambrecht (: –), the function of rightdislocation (in his terminology ‘antitopic’) is precisely that of promoting a topic: it allows the speaker to present a given referent as topic and not talk about it in the same clause, which is preferable for processing reasons (cf. Lambrecht’s Principle of the Separation of Reference and Role). In promoting an already given referent as topic, right-dislocation thus provides an important alternative to the topic promotion of brand-new and unused referents noted above for existential there. Like right-dislocation, it-extraposition also ‘shifts’ a constituent to the right of the clause, only this time a clausal subject,16 and replaces it with a pronoun, anticipatory it, 16
I am ignoring object extraposition here (e.g., She considered it essential [to talk to him]), which is usually obligatory, as well as extraposition from noun phrases (e.g., A man appeared [with a gun]).
in its canonical position. Examples () and () illustrate the canonical and ‘extraposed’ version respectively (cf. also example ()). () That John went to Paris is surprising. ()
It is surprising that John went to Paris.
The functional motivation for choosing it-extraposition over its canonical counterpart lies in avoiding a weighty and highly informative clause in subject position: the ‘extraposed’ subject typically (though not exclusively) conveys discourse-new information (irrecoverable from the preceding context) and is longer than its matrix predicate (Collins , Kaltenböck ). In spoken language the function of it-extraposition is somewhat different though, as it shows a substantial portion of given complements with a clarification of reference or repair function (Kaltenböck ), as illustrated in ().
And some people very much like to cut out the phrase altogether in order to have a simple condemnation against divorce But it’s all very well to cut out a phrase (ICE-GB:sa--)
It-extraposition allows for the construal of information status in an interesting way, as a result of its unusual topic–comment structure, which presents the topic in syntactically subordinate position following (rather than preceding) the comment, and its frequent use of factive matrix predicates (i.e. predicates such as be significant, odd, exciting; Kiparsky and Kiparsky ), which lexico-grammatically presuppose their complements. This enables the speaker to present new information in the complement clause as if it were known (Kaltenböck ), as in example () (compare also it-clefts below for a similar function).
He recorded his famous description of the view in his autobiography and it is extraordinary that it has changed so little in the past years (ICE-GB:wb-)
.. Argument reversal: passive, inversion Argument reversal involves both fronting and postponing of a constituent. The best-known information-structural pair of this kind is the active–passive alternation. Although the choice between them is constrained by a number of factors (such as the availability of a by-phrase agent and the willingness to express it, as well as the semantics of the verb; e.g., Gries and Stefanowitsch a), a major one is that of appropriately adjusting the information structure. Once again, the principles of end-focus and end-weight play an important role, but the major constraint is that the subject should not be newer (in discourse familiarity terms) than the by-NP
(Birner , Birner and Ward ). This allows for the following sequences of subject NP and by-NP: old–new, as in (), but also new–new, and old–old, as illustrated by the first and second instances in () respectively. The final by-NP can also be expected to be relatively longer than the subject NP (e.g., Biber et al. : ). ()
The mayor’s present term of office expires Jan. . He will be succeeded by Ivan Allen Jr . . . (Birner : )
()
Appointment of William S. Pfaff Jr., , as promotion manager of The Times-Picayune Publishing Company was announced Saturday by John F. Tims, president of the company. Pfaff succeeds Martin Burke, who resigned. The new promotion manager has been employed by the company since January, , as a commercial artist in the advertising department (Birner and Ward : )
If the information structure of the two NPs is reversed, as in the following example (adapted from () above), where the first NP is discourse-new while the by-NP is discourse-old, the result is an infelicitous use of the passive. ()
Ivan Allen Jr. will take office Jan. . #The mayor will be succeeded by him. (Birner : )
The same information structure can be found with cases of inversion, as in (), where the (logical) subject (a moat of burning gasoline) is in postverbal position while a canonically postverbal constituent, typically a locative expression (next to it), takes preverbal position. () The outermost circle of the Devil’s world seems to be a moat filled mainly with DDT. Next to it is a moat of burning gasoline (Chen : ) As with passive structures, the preverbal element (and the verb, if other than be) of inversion must be at least as familiar from the discourse as the postverbal subject (e.g. Birner ). This has been explained in cognitive terms by a ground-before-figure analysis of inversion (Chen ), according to which the preverbal element functions as ground, which serves to locate the postverbal figure, and as such needs to be anchored in the preceding discourse. Inversion thus first directs the hearer’s attention to the ground, which is anchored by a link to a previously established landmark—either in the preceding linguistic co-text or the discourse context. This landmark then functions as a signpost which navigates the hearer to the figure, in other words the focus of the hearer’s attention. In terms of discourse functions inversion therefore lends itself particularly well for topic management (topic shift and topic introduction) (Kreyer ; see Chapter , this volume, for the ‘immediate observer effect’) and has been noted to carry interpersonal meaning in the form of ‘emotivity’ and ‘subjectivity’ (Dorgeloh ).
.. Clefting: it-clefts and wh-clefts Clefting not only changes the canonical order of a clause but also its internal structure by splitting (cleaving) it into two clauses, as illustrated in (). It includes it-clefts (b), wh-clefts (or pseudo-clefts) (c), and reverse wh-clefts (or reverse pseudo-clefts) (d).17
() a. John offered Mary a cocktail.
b. It was a cocktail that John offered Mary.
c. What John offered Mary was a cocktail.
d. A cocktail was what John offered Mary.
In contrast to the previous allosentences we are dealing here with a larger set of word order alternants. It-clefts have therefore been set in relation to both the canonical, i.e. predicational, variant (a) and the specificational (identifying) copulative wh-cleft (c)18 (e.g., Delahunty , Declerck , Davidse , Patten ). In terms of information structure, it-clefts represent a focus–presupposition structure (section ..), where the constituent following it+ in the first clause, the ‘highlighted element’. (Collins ), represents the focus and the subordinate clause the open proposition. The actual distribution of information, however, does not have to match the focus–presupposition structure. In fact, different types of it-clefts have been identified. Prince () distinguishes between Stressed-focus it-clefts, as in (), and Informative-presupposition it-clefts, as in () and (). The former has a highlighted element which is new and heavily stressed, while the second clause is only weakly stressed and contains information that is ‘given’. The latter conveys ‘new’ information in the second clause. Declerck () further splits the class of Informative-presupposition it-clefts into Unaccented anaphoric focus clefts, where the focus element is ‘given’, as in (), and Discontinuous clefts, where it is ‘new’, as in (), which occurs at the start of an article. ()
[soccer commentary] Oh that’s surely a penalty Mostovoi is pushed over No question on that occasion I think it was little Garcia who was at fault (ICEGB: sa--)
()
However, it turns out that there is interesting independent evidence for this rule and it is to that evidence that we must now turn (Prince : )
()
It was just about years ago that Henry Ford gave us the weekend. (Prince : )
For a discussion of it-clefts and wh-clefts from a Construction Grammar perspective see Chapter , section .., this volume. 18 Specificational or identifying constructions, such as it-clefts and wh-clefts, express an identifying relationship between two entities, where one serves to specify the identity of the other (as opposed to attributive constructions, which denote a relationship between an entity and some attribute ascribed to it). 17
In the case of Informative-presupposition it-clefts there is an interesting mismatch between the ‘newness’ of the information in the subordinate clause and its presentation as the presupposition in lexico-grammatical terms. This allows the speaker to construe new information as if it were already known and to ‘mark a piece of information as fact’ (Prince : ). For wh-clefts, the typical information structure is that of an open proposition expressed by the wh-clause, conveying ‘given’ information, and the part immediately following the copula within the superordinate clause (e.g., a cocktail in (c)) providing the focus, i.e. ‘new’ information. They are thus focus–presupposition structures like it-clefts but also very much in keeping with the principle of end-focus (e.g., Prince , Collins ). Wh-clefts have been shown by Hedberg and Fadden () to differ from it-clefts and reverse wh-clefts in being much more restricted in their information structure. All three cleft types have in common that they split a proposition into two parts, identified by Hedberg and Fadden () as topic and comment (where ‘topic’ is defined in terms of aboutness and typically, though not necessarily, expresses ‘given’ information, while ‘comment’ is identified by primary stress and thus equivalent to focus as discussed in section .). With wh-clefts, however, the distribution of topic and comment is highly constrained: the initial wh-clause is exclusively the topic (conveying typically discourse-old information) and the subsequent constituent is the comment (conveying always discourse-new information), as in example (). ()
‘He is reported . . . not to be as desperate today as he was yesterday but to still be on the brink, or at least shaky. What’s made him shaky is that he’s seen McCord bouncing out there and probably walking out scot free.’ (Prince : )
Reverse wh-clefts, by comparison, as well as it-clefts show more flexibility in allowing for both a topic–comment structure, as in (), and a comment–topic structure, as in (); as well as comment–comment structure. Topic–comment structure is, however, the more frequent pattern for reverse wh-clefts (as observed also by Heycock and Kroch ), which is in accordance with the end-focus principle (see section ..). ()
Don’t you think resources are really being redirected, for example, into security and into counterterrorism? That’s where the resources are GOING. (Hedberg and Fadden : )
() THAT’S what we lack in Africa now. (Hedberg and Fadden : )
. C :
.................................................................................................................................. As the discussion in this chapter has shown, the study of information structure subsumes a number of pragmatic-cognitive concepts and principles, such as the
assumed givenness and newness of information, topic, and focus, all of which play an important role in communication. A central question in this respect is how these principles relate to the linguistic form of utterances. In other words, how exactly do the formal and communicative aspects of language interact? The answer to this question depends on one’s model of grammar. Most approaches to the study of grammar distinguish different components or levels along the lines of syntax, semantics, and pragmatics or similar subdivisions. Where the approaches differ is in the relationship between these components. While in formalist, generative models they are generally taken to be autonomous,19 cognitive-functionalist approaches see them as interdependent. Attempts at integrating the different levels can be found for instance in Functional Discourse Grammar (Hengeveld and Mackenzie and Chapter , this volume), Role and Reference Grammar (Van Valin and LaPolla ), and more recently Construction Grammar (e.g. Goldberg , Croft a, Hoffmann and Trousdale , and Chapter , this volume), where morphosyntax, semantics, and pragmatics are taken to be integrated aspects of grammatical constructions. Lambrecht (), who also adopts a constructionist view, argues for an intermediate position. While conceding some degree of structural independence, he emphasizes the pragmatic motivation of grammatical form, which he sees as arising diachronically under pressure from information-structural constraints (Lambrecht : ). Information structure is, however, not expressed by syntactic form alone: ‘What syntax does not code, prosody does, and what is not coded by prosody may be expressed by morphology or the lexicon’ (Lambrecht : ). The great challenge in studying information structure thus lies in its complex relationship with all grammatical levels of a language and their competing, interactive nature in relating an utterance to a particular communicative context.
Acknowledgements I would like to thank the editors, particularly Jill Bowie for her invaluable feedback, Evelien Keizer, Ekkehard König, and an anonymous reviewer for their helpful comments and corrections.
19 A notable exception in the generative tradition is the cartographic approach (e.g., Rizzi , Cinque and Rizzi ), which integrates functional categories such as topic and focus into formal structures (proposing TopP and FocP as part of an X-bar schema).
.............................................................................................................
GRAMMAR AND OTHER FIELDS OF ENQUIRY .............................................................................................................
......................................................................................................................
......................................................................................................................
. I
.................................................................................................................................. In this chapter, grammar (understood in the sense of morphology and syntax) and lexis will be discussed as core components of human language, focusing on their interrelation. The layman’s assumption is straightforward: grammar and lexis are two different things. The former is described in grammar books, the latter is contained in dictionaries. And for learning and/or using a language, one needs to know both. This understanding has repercussions in linguistic theorizing as well. Up until the early mid-twentieth century it was common practice for a linguist to be concerned with either lexical or grammatical matters, that is linguistic phenomena (such as sounds, words, meanings, inflectional endings, structures, etc.) were allocated to one or the other ‘domain’. Generally speaking, everything assumed to be stored/memorized went into the lexicon, while grammar contained everything regularly variable and everything needed for the combination of stored elements, that is a language’s ‘combination rules’. The regularities in the variation of words were considered a matter of morphology, those in the combination of words (and variations thereof) a matter of syntax. This ‘compartmentalization’ reverberates in the assumption that in speakers’ minds language (as an encapsulated module itself) consists of different (sub)components or modules, e.g., the mental lexicon and the mental grammar. Such a view, known as the modularity hypothesis (Fodor ), implies that lexicon and grammar represent distinct linguistic subcomponents interacting in one way or another, e.g., in that lexical entities are embedded in syntactic constructions in a slot-and-filler mode. Structure-building would be licensed by a language’s syntax, which, in turn, seems to be constrained by the speaker’s choice of (morphologically manipulated) lexical items that are best suited to communicate the intended message. Thus, syntax would license an English NP structure like Det (Adj) N, which will then be filled in with items listed in the lexicon as Det (the, this, that etc.), as N (woman, child etc.) and Adj
(clever, alive, etc.). If the speaker chose the adjective alive, this would put a constraint on syntax because it is used in copular clauses (X be alive) rather than in NPs (*an alive N). Linguistic models adhering to such a modular view, also known as ‘dual-system theories’ (see Snider and Arnon for a brief overview), need to specify two things: the definitional criteria for lexis and grammar, on the one hand, and the ways in which the two components interact on the other. More holistic accounts of language (seeing language as a non-autonomous phenomenon making (special) use of the human general cognitive apparatus) suggest that lexicon and grammar are gradient phenomena1 sitting on a cline. They seek to identify not only what makes lexis and grammar different, but also what they have in common. Firstly, lexis and grammar in such models are assumed to do the same type of work: they sanction expressions on the basis of categorization and composition/unification of smaller units into larger ones. Secondly, lexis and grammar comprise elements of the same nature, namely signs, that is forms with a meaning/function. They are, however, different with respect to such properties as schematicity (abstractness), generality of meaning, or complexity, all affecting their locations on the postulated cline. Lexical units are specific units, such as woman or white elephant, and grammatical units are schematic, such as Det N or NP V NP NP. The latter are understood to emerge as generalizations language users make when they experience similar specific expressions, associated with such more general meanings as referring to an instance/instances of a class of things (Det N), or an event of transfer (NP V NP NP). Models adopting these assumptions, aka single-system models (see Snider and Arnon ), have to specify how abstraction and generalization can ‘produce’ knowledge of rules (or (schematic) patterns), and they need to account for the interaction between the specific and the (more) abstract entities making up speakers’ knowledge of their language. In linguistic models that draw a sharp boundary between the lexicon and the grammar, linguistic elements are classified as belonging to one or the other on the basis of the concepts of regularity/compositionality and idiosyncracy. Elements that are idiosyncratic, i.e. whose meaning cannot be derived from the meaning of their parts, are stored in the lexicon. The grammar, on the other hand, comprises the rules by which everything that can be derived as a sum of its parts, i.e. everything that is compositional, is computed . . . ’ (cf. Taylor : – for a similar description). A proposal along these lines was already made by Bloomfield (: ), who claimed that ‘[t]he lexicon is really an appendix of the grammar, a list of basic irregularities’, and since then has been widely adopted. Di Sciullo and Williams (: ) speak of the lexicon as a prison containing the ‘lawless’, and Stowell and Wehrli (: ) view it as ‘the repository of all idiosyncratic and unpredictable properties of individual lexical items’. This dichotomous understanding of lexicon and grammar has become a basic tenet in generative theorizing, where the elements of the former are assumed to be used
1 For a thorough discussion of gradience in language (with a focus on word-classes) see Aarts a.
in the computation by the latter (Marantz : ). That is, grammar, employing its repertoire of rules, computes structure into which elements of the lexicon are inserted. Individual models differ with respect to the particulars of what is available in the lexicon: only simple words (sun, move), or simple and complex words (sunny, sunnier, sunshine, moved, movement),2 only fully lexical idioms (a blessing in disguise, break a leg) or schematic idioms as well (take N to task, the X-er, the Y-er). Such items have turned out to be crucial in the debate about a clear-cut boundary between lexicon and grammar. Another problem for drawing a clear line is posed by frequent regular (i.e. compositional) phrases, such as write a letter, read the newspaper, I’ve seen that before etc.). They are not idiosyncratic, but (in all probability) stored. The issue of complex words touches on the more fine-grained differentiation of grammar into syntax and morphology. Syntax is clearly ‘grammatical’ (schematic, regular), but morphology seems to be concerned with both lexical (specific, idiosyncratic) and grammatical aspects. The association with the lexicon is straightforwardly clear for a language’s free morphemes, whose form-meaning relations are specific and idiosyncratic, i.e. unpredictable, in principle. But bound morphemes, though specific (= lexical) too, may be required by structural, i.e. syntactic, needs (such as structural case or agreement markers) and they contain schematic information regarding the set of elements they combine with (such as -ed or -ing requiring V). This speaks for a close interaction between lexical and grammatical (schematic) elements. Moreover, the combinations of free and bound morphemes (i.e. elements of the lexicon), no matter whether semantically motivated and/or required by syntactic position, clearly exhibit combinatory patterns, or can be said to be produced ‘on the fly’ by ‘grammatical’ rules. That is why morphology is often compared with syntax, although the combinatorics is not generally assumed to be the same.3 From this perspective, some complex words are compositional (almost all inflection and some derivation) and some complex words (non-compositional derived words and word-forms) are idiosyncratic, ending up in the lexicon. In the next section, I turn to the lexicon-grammar issue in a number of specific linguistic theories. To make the task manageable, given the allotted space, as well as my own expertise, I have selected only a number of current formal and functional theories, aiming to be as representative as possible. The discussion is organized as follows. Section .. covers generative (or formal) models of language, with the focus being on some of the ‘mainstream’ frameworks of this type: the Government & Binding Model (G&B), the Minimalist Program (MP), Distributed Morphology (DM), HeadDriven Phrase Structure Grammar (HPSG), and Lexical-Functional Grammar (LFG).4
2 There is the issue of the place of productive word formation patterns, such as derivation or compounding for English. Whether they are part of the lexicon or part of grammar is a question answered differently, as will become apparent below. 3 Parallel mechanisms go well with morpheme-based approaches to morphology (for these and other approaches (word-based or paradigm-based) see Spencer, this volume: Chapter ). 4 For a recent overview on the syntax-lexicon interface in formal models, see also Alexiadou ().
Section .. is concerned with functional models, picking the models by Dik (, ) and Halliday (b, ). Section .. elaborates on the impact of corpus-linguistic studies on the issue, bridging the way to section .., which discusses usage-based theories. I opted for the usage-based model developed in great depth by Langacker (, ) and the construction-grammar approach by Goldberg (, a). Section . concludes with a summary of the investigation.
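To make the slot-and-filler picture invoked above more tangible, the following minimal sketch (in Python, with an invented toy lexicon and an ad hoc ‘attributive’ feature standing in for whatever mechanism a particular theory would use) shows a schematic Det (Adj) N pattern being filled by lexical items, with an item such as alive blocking the attributive slot:

# A minimal sketch of the slot-and-filler view: a schematic NP pattern
# Det (Adj) N is filled with items from a toy lexicon, and an item such
# as 'alive' blocks the attributive (prenominal) slot.
LEXICON = {
    "the":    {"cat": "Det"},
    "this":   {"cat": "Det"},
    "woman":  {"cat": "N"},
    "child":  {"cat": "N"},
    "clever": {"cat": "Adj", "attributive": True},
    "alive":  {"cat": "Adj", "attributive": False},  # only predicative: 'X is alive'
}

def license_np(det, noun, adj=None):
    """Return the NP if the fillers satisfy the Det (Adj) N schema, else None."""
    if LEXICON[det]["cat"] != "Det" or LEXICON[noun]["cat"] != "N":
        return None
    if adj is not None:
        entry = LEXICON[adj]
        if entry["cat"] != "Adj" or not entry["attributive"]:
            return None          # e.g. '*the alive child' is blocked
    return " ".join(w for w in (det, adj, noun) if w)

print(license_np("the", "child", "clever"))  # -> 'the clever child'
print(license_np("the", "child", "alive"))   # -> None (constraint imposed by the lexical item)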
. L
..................................................................................................................................
.. Generative models Here I will look at some details of what particular generative models assume with regard to the character of a language’s lexicon, syntax, and morphology, and their interrelations.5 Theories within this approach adhere to the modularity hypothesis, and since the modules are assumed to cooperate on an input-output basis rather than interactively, they must be clearly distinct from one another.
... Generative syntax: the Government & Binding Model (G&B) and the Minimalist Program (MP) The Chomskyan theory of syntax has been developed over more than fifty years resulting in a number of (successive) models. The discussion here will make reference to the G&B model, the first model of the Principles and Parameters (P&P) theory,6 and the MP, the latest version of the theory. In G&B, the rule system of grammar consists of four subcomponents, namely (i) lexicon, (ii) syntax (subsuming a categorial (iia) and a transformational component (iib)), (iii) a phonological component (PF), and (iv) a logical component (LF). The components (i) and (iia) constitute the base and interact to produce an underlying (D-) structure, in that lexical items from (i) make available their arguments and are combined according to the structure rules of (iia). The transformational component (iib), consisting of a movement rule, maps D-structure onto S-structure,7 a structure close to the surface form of the sentence. For example, passive S-structures are derived
5 For an earlier argument on this topic, see Schönefeld (: –). 6 The P&P theory is an approach to the study of human language aiming at the characterization of a speaker’s knowledge of language and how this knowledge comes about. One of its central questions is what of this knowledge is innate and what aspects are acquired by exposure to language experience (cf. Lasnik and Lohndal : ). 7 S-structure is then mapped onto PF, the interface with the articulatory-perceptual system giving articulatory information, and onto LF, the interface with the conceptual-intentional system giving all the details needed for the interpretation of the sentence.
from the D-structure by moving the verbal complement to the subject slot (cf. Chomsky : ). The lexicon is defined as ‘the repository of all (idiosyncratic) properties of particular lexical items’ (Chomsky : ). Among those are the item’s word category and its ‘(predicate-) argument structure’ (comprising its subcategorization frame and θ-grid).8 That is, a lexical item contains information about its syntactic class and the number and semantic roles of the arguments it takes, thus practically steering its own grammatical behaviour. Information about the semantic features of its arguments, aka selectional restrictions, can be derived from the semantic roles of the arguments and need not be stipulated in its entry (cf. Chomsky : –). Assumptions like these reflect theoretical insights of the time, as ascertained by Pinker (: ), for example, who argues that a lot of grammar has been found to follow from properties of the lexical items used. This is why ‘[r]ecent theories of grammar specify rich collections of information in lexical entries and relatively impoverished rules or principles in other components of grammar.’ The categorial component in the G&B model specifies—separately from the individual argument structure information given in the lexical entries—all the possible subcategorization frames a language provides for the insertion of lexical items (cf. Chomsky : ), i.e. it contains the general X-bar schema with the parameters set for the respective language.9 In previous variants of the model, this information was represented in phrase-structure rules (PSR). But since the categorial environment of a particular word derives from its argument structure, the information contained in the PSRs is actually redundant and they have been done away with.10 The X-bar schema, in close co-operation with Case Theory, generates the repertoire of phrase- and sentence structures into which lexical entries with the required argument structure can be inserted, like plugging fillers into schematic slots. The Projection Principle11 then guarantees the syntactic adequacy of the resulting structure.12 What seems to be left for the exclusive specification by the categorial component is the canonical serialization of phrases in larger segments, and the positions of relatively independent phrases, such as adjuncts. That means that for the production of a 8
On argument structure see also Asudeh, this volume. The X-bar schema is a generalization of the phrase-structure rules of (lexical) categories. It is thoroughly elaborated in Borsley (this volume). 10 The alternative, to keep the phrase-structure rules and to eliminate the predicate-argument structure, is ruled out because the latter is item-specific and does not derive from general principles (cf. Webelhuth : ). 11 This principle says that the representations at the syntactic levels (LF, and D- and S-structure) are projected from the lexicon in accordance with the subcategorization properties of lexical items. (cf. Chomsky : ). 12 The procedure is assumed to run in another sequence in language use. Concepts entertained by the speaker will activate lexical items associated with them, and these will choose their appropriate syntax in accordance with their predicate-argument structure and the principles of X-bar syntax (cf. also Grimshaw (: )). 9
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
Ö
grammatical utterance, a speaker must know the argument structure of the lexical entries involved and the parameterized X-bar schema inclusive of case theoretical aspects motivating the required movements of arguments to receive case. Different surface realizations of a verb’s argument structure (such as dative shift, or causativization) are captured by lexical operations in the lexicon. These change the θ-grid before the verb projects its structure (cf. Harley : f.), e.g., causativization adds an agent role to an otherwise intransitive event. The MP is the most recent (Chomskyan) model of generative linguistics (for detailed accounts see Chomsky , Marantz , Al-Mutairi , for example). We will briefly sketch what is new concerning the lexicon-grammar issue in this model. Basically, the assumptions about the lexicon have not changed dramatically. It contains phonological, morpho-syntactic, and semantic information in the form of feature bundles. Syntactic structure is now assumed to be generated in one component, the computational system (CHL). This is achieved by drawing on a token list of the lexical items selected from the lexicon for a particular utterance, the ‘numeration’. The latter sets the scope within which structure, the derivation, can be formed in CHL (cf. Richards : ) by such operations as Merge, Move, and Agree.13 For example, suppose the numeration contains the items , , . Grossly simplified, the sentence she hates cats would be computed by the operations of merge (e.g., hate + cat; she + hate cat), and move (movement of —to receive nominative case—to the subject position). Finally, the derivation can converge at the two interface levels PF and LF (cf. footnote ), if it contains legitimate PF and LF objects (cf. Bošković : ), i.e., if it is legible there (cf. Al-Mutairi : –). In later developments within the MP, it is argued that the syntactic computation does not use the full set of the numeration exhaustively before transferring its structure to the interfaces. One suggestion is that computation proceeds stepwise, drawing on lexical subarrays (rather than the complete set), and that each derivational step is immediately transferred to the interfaces for evaluation (cf. Richards : f.). This change does not seem to affect the role of lexical items in the model. However, a predicate’s argument structure is no longer required to be projected from the lexicon, because predicates, as functions, must combine with arguments anyway to be interpretable at LF. In other words, due to the ‘Full Interpretation requirement’ no further mechanisms are required to regulate the interaction of syntactic structure and lexical information (cf. Harley : ) thus obviating the syntactic trigger function of a word’s argument structure. A further shift towards syntactic centricity in this model becomes obvious when the place of G&B’s lexical operations (aka redundancy rules), which account for general argument-structure alternations, is considered. In the MP, such alternations Merge combines two syntactic objects α and β to form a complex syntactic object K (Al-Mutairi : ). Agree denotes a process by which a feature checking is initiated between a probe and its goal and an Agree relation is established if there is a match. Only then Move can take place (for more details, see Lohndal and Haegeman, this volume and Bošković : ). 13
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
‘can be . . . treated entirely within the syntactic component, via the same Merge and Move operations which construct any syntactic constituent’ (Harley : ). Finally, as far as the place of morphology is concerned, it has been a controversial issue in generative models where morphological processes are located, in the grammar (iib) or in the lexicon.14 Chomsky () had formulated what became known as the ‘lexicalist hypothesis’, according to which syntactic operations cannot apply to parts of words, so that morphological processes must be distinct from them. This idea marks the beginning of lexicalism, excluding from syntax derivational morphology (weak lexicalism) or both inflectional and derivational morphology (strong lexicalism) (cf. Scalise and Guevara : f., Harley : f.). G&B and also early work within the MP locate morphological processes in the lexicon (lexicalist approach), but later work assumes morphological structure-building to work in the same way as syntactic computation, so that it is naturally included in a language’s syntax. In the next sections, I will address such a non-lexicalist approach (Distributed Morphology) (see also Spencer (this volume: Chapter )) as well as models taking a lexicalist stance (such as LFG, and HPSG).
... Distributed Morphology (DM) Proponents of DM (first articulated by Halle and Marantz ) have developed a model of language without a lexicon in the traditional sense. Instead, the computational system is assumed to employ elements from lists of atomic entities (morphemes) to bring about all the combinations required: complex words, as well as syntactic phrases. That is, departing from the lexicalist hypothesis, complex words are also derived by the principles of minimalist syntax, namely the syntactic operations of Merge and Move (cf. Embick and Noyer : , Alexiadou : ), which makes DM in a way complementary to Minimalism (for details, see Spencer, this volume: Chapter ). The atomic (non-computational) elements come from three lists: i. a list of syntactic atoms, i.e. lexical roots (phonological sequences without meaning, such as [√/wɔ:k/]) and functional features (also called abstract morphemes, such as the word-category feature [noun] or [pl]), ii. a list of vocabulary items providing phonological representations to the abstract morphemes, such as [noun] /ø/ and [num][pl] /-s/, together with conditions on insertion, and iii. a list of the idiosyncratic meanings of roots or idioms. In the derivational process, the computational system takes elements from list (i) as ingredients to generate complex words and phrases as well as larger syntactic units. Vocabulary items from list (ii) provide the phonological forms, and specify their places of insertion. Hence, we would get [√/wɔ:k/] merging with the abstract (functional) 14
For a recent discussion of the ‘syntax-morphology interface’ in generative syntax, see Harley .
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
Ö
morpheme of the noun category [[√/wɔ:k/] n] and [[num][pl], /s/ added to the right. List (iii) would contribute the meaning of ‘pieces in particular contexts’, such as [[√] n] (an entity of slow self-propelled motion), or walk the line (behave properly) (for more details see Harley and Noyer , Scalise and Guevara , Embick and Noyer , Bobaljik , and Spencer, this volume: Chapter ). All of this feeds the processes of the morphemes’ morpho-phonological realization and semantic interpretation at the interfaces PF and LF respectively. The semantic interpretation (at LF) follows compositionally from the elements employed in the generation of structure in the computational system (cf. Marantz : , Harley : –). In terms of the issue pursued here, DM postulates a powerful syntactic component generating all structure (inclusive of complex words) by interacting at various stages of the grammatical computation with diverse individual linguistic elements that in other approaches are seen as unified in the lexicon. This characteristic is alluded to also by the model’s name, ‘distributed’ (cf. also Harley and Noyer : ), and it makes DM ‘a syntacticocentric, realisational, piece-based, non-lexicalist theory of word (and sentence) formation’ (Harley : ).
... HPSG In HPSG, a model developed in the s mainly by Carl Pollard and Ivan Sag, the relation between lexis and grammar is ‘skewed’ to the other component, making it a ‘more lexical’ approach. Comparable to G&B, the parsimony of the syntax is facilitated by ‘rich’ lexical entries containing the syntactic information required for the building of structure around them, such as information about the function, the semantic role(s), and the case of their complement(s) (including subjects) (cf. Pollard and Sag : ). Since HPSG adheres to lexicalism, word-internal structure is a product of the lexicon and opaque to syntax (cf. Nordlinger and Sadler ). To capture potential regular derivations of one lexical entry from another, such as inchoative and causative verb uses (The glass broke versus Somebody broke the glass), or nominalizations, HPSG employs lexical rules (analogously to G&B (section ...)) (cf. Müller : ). Other kinds of generalizations in the lexicon are captured by inheritance hierarchies (cf. Müller : ). For example, features common to all representatives of a particular word class are subsumed in ‘generic’ entries and as such ‘inherited’ by individual entries of this class. Grammatical functions (subject, object etc.) are also lexically defined, so only their configurational realization is left to be determined syntactically by ‘immediatedominance rules’ (cf. Pollard and Sag : ). Similarly to PSRs, these contain information about general dependencies (between heads, complements, and specifiers) and their serializations, inclusive of regulations for special arrangements, such as cleft or pseudo-cleft constructions (for English) and others. This implies that the syntactic rules produce the surface structure directly (cf. Jackendoff : ). Given this architecture, HPSG can be characterized as a (strongly) lexicalist approach. The model builds on a richly structured lexicon (inclusive of the mechanisms of morphological structure-building), which is the input to the syntactic component containing the mechanisms of syntactic (surface-)structure building.
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
... Lexical Functional Grammar (LFG) LFG, a model mainly developed by Joan Bresnan (, ),15 distinguishes lexicon and grammar in a similar way. Though the internal mechanisms of their interaction are spelled out differently (cf. Schönefeld : –, Dalrymple , for example), the lexicon is just as richly annotated as in HPSG and contains lexical rules, too. Syntax consists in phrase-structure configurations (c-structures, containing also functional information), which determine the serialization of constituents and account for constituents independent of argument-structure requirements. Morphology is assumed to operate in the lexicon, providing word forms compatible with the requirements of c-structure (cf. Neidle : ).
.. Functional models In this section, we look at the relationship between lexis and grammar in models taking an expressly functional approach to language.16 In such an approach, language is understood as a means of communicative verbal interaction, and its features are assumed to be basically determined by its use (cf. Dik : , : ). This assumption is in sharp contrast with the generative perspective since it requires that language in use (performance) be taken into account when modelling speakers’ grammars.
... Dik’s model The main ideas of Dik’s understanding of language are laid down in Dik (, : –). He assumes that a language, or grammar, comprises two components, the ‘fund’ and ‘expression rules’. The fund contains terms and predicates, i.e., expressions naming entities and expressions naming relations or properties. They are either stored in the mental lexicon or constructed by applying productive rules of predicate and term formation. Speakers produce a ‘predication’ on the basis of an augmented predicate frame containing information on the predicate’s argument structure, the terms chosen for ‘filling’ these arguments, potential satellites (for adjunct terms), predicate operators for tense and aspect, as well as syntactic and pragmatic functions of the terms and sentence type. Expression rules regulate the transfer of these annotated predications into expressions, determining the form of the ‘constituents’ and their distribution in the sentence. (For more details, see Dik : –, : –.) For illustration, imagine the frame ‘give’ with the selected arguments of ‘my mother’ (as subject), ‘the present’ (as object), ‘my brother’ (as complement), the adjunct ‘finally’, the operator for past and the information that it is a contrastive statement. The expression rules would determine the resulting sentence: My mother finally gave the present to my brother, with focal stress on the last PP. From the s on, Bresnan has contributed massively to the further development of syntactic theory: she combined basic assumptions of LFG with those of Optimality Theory (OT), creating ‘Optimal Syntax’ (cf. Bodomo : ). 16 For an earlier discussion of this see Schönefeld (: –). 15
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
Ö
In such a model, lexical and grammatical aspects seem to be closely intertwined. The expressions stored in the ‘fund’ are of a lexical nature. The rules for predicate and term formation, e.g., those ‘regulating’ word-formation, can be relegated to grammar, more specifically to (derivational) morphology. Syntax comes in when the constituents of a predication are linked to syntactic functions, like subject, object or adjunct, on the basis of their semantic features. This procedure follows a universal strategy, the Semantic Function Hierarchy of alignments of semantic and syntactic functions (cf. Dik : –), thus in a way mediating between the lexicon and the syntax. Pragmatic functions, such as a constituent’s informational value (as topic or focus, for example), follow from guidelines of contextual embedding rather than syntactic rules, but may affect functional alignment, too. The only truly syntactic procedure is the serialization of the total predication on the basis of expression rules. They regulate how the particular semantic and/or syntactic functions are marked by case, adposition, or other markers, thus also handling inflectional morphology. These assumptions make Dik’s model lexico-centred, and the idea that (constituent-) structure follows from the lexical items selected (cf. Dik : –) suggests a similarity with HPSG and LFG, at least when it comes to the parsimony of syntax.
... Halliday’s Systemic Functional Grammar (SFG) In Halliday’s view, a language is a system network of choices available to the user for the making of meaning (Halliday : xxvii). It specifies all possible combinations of choices, thus indicating the permitted paths through it (cf. Halliday a: ). The speaker’s choices (e.g., what to say, his/her intention and focus) will become explicit to the hearer by the use of particular verbal means. This holds for words and larger constructs alike. The former can be illustrated by one of Fawcett’s examples (: , , , –), the network of choices associated with the personal pronoun her. Using this pronoun, the speaker indicates the path through the network shown in Figure . (adapted from Fawcett : ). The more complex structures, e.g., particular phrase or sentence types, order of constituents, or transitivity structure signal more general choices, e.g., the absence of a PERSON First Second Third
NUMBER Singular
Plural
GENDER Masc Fem Neut
CASE Nom Acc her
. Network of choices associated with her
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
subject in an (English) utterance indicates the speaker’s intention of requesting something (cf. Halliday a: ). Such signals are conceived of as ‘realization’ statements of choices rather than rules. The fact that speakers’ choices for making meaning are always signalled in lexically specific forms may be why Halliday posits that grammar and lexicon mean basically the same phenomenon ‘seen from opposite perspectives’ (Halliday : ). The perspective in grammar starts from the general choices provided by a language and their structural realizations, such as clause type and mood, and proceeds to ever more specific choices. Halliday (: xxvii) shows that for the system network for English the clause, which represents the process type to be communicated and associated participant roles, is taken as the entry point into the network. At the most specific level are the choices between particular lexical items ‘which . . . represent the most delicate distinctions that the system embodies’ (Halliday b: ). In this sense the lexicon can indeed be seen as the ‘most delicate grammar’ (Hasan : ), and is thus incorporated into the category of (lexico-) grammar. Unlike the models discussed so far, Halliday’s conception of grammar and lexis as one system network does not differentiate between what is regular/compositional and what is idiosyncratic, or what speakers compute and what they retrieve from memory. Instead, the system seems to be organized along the dimension of generality/specificity (of choices). The more specific (or delicate) the level of choices is, the more lexical the realizing structures are. As for the place of inflectional morphology, it is part of the lexico-grammatical continuum (cf. Fawcett : ). It signals choices from closed systems and structures with general meanings, such as tense (the domain of grammar), rather than from the open sets of words and collocations with specific meanings (lexis) (see also Halliday : ).
.. Corpus Linguistics: new methods opening up new perspectives The choices at the ‘lexical end’ of the continuum were the starting point of corpus linguistic analyses revealing the lexical patterning of language. Since then, corpus data have been used as a source for investigating many other aspects of language, e.g., grammatical constructions, conceptual phenomena (such as metaphor), discourse organization, language change, etc.,17 producing new theoretical claims and insights.18
17
From a more applied angle, corpus analyses have resulted in a number of language descriptions (mainly of English), such as the Collins Cobuild English Grammar () with Sinclair as Editor-inChief, Pattern Grammar (Hunston and Francis ) and the Longman Grammar of Spoken and Written English (Biber et al. ). 18 An outline of ‘corpora in linguistic research’ is given by Schönefeld (: ff.); corpus-linguistic methods and grammar are discussed by Wallis (this volume).
In the early days of corpus-linguistic research, corpora were neither part-of-speech (POS) tagged, nor parsed, so that research concentrated on the usage of words. In the tradition of British Contextualism (as represented by Firth and Sinclair , for example), the analyses focused on a word’s contexts to elucidate meaning via the lexical and structural patterns in which it occurs. Such investigations were based on KWIC (key word in context) analyses of corpora. Over time, they became ‘more grammatical’, exploring the lexical potential of particular structural slots, such as a N of, and it became obvious that structural and lexical choices are mutually dependent (cf. Francis : ). This interdependence is reminiscent of Halliday’s concept of lexicogrammar, but seen from the lexical end: lexis is taken to drive syntax in that lexical items trigger structures into which they fit ‘comfortably and conveniently’ (Francis : ). In principle, this idea alludes to the projection principle of G&B (see footnote ) and the assumption of a richly specified lexicon in HPSG and LFG. However, the (corpus-linguistic) discovery of lexical patterns, like collocations (e.g., quick shower, fast train, or take a break), chunks with variable slots (e.g., become Adj, a(n) N of ) or idioms (e.g., give me the creeps), suggests that the grammar and the lexicon do not interact in the proposed ‘slot-and-filler’ procedure. Instead, two other principles appear to be at work in language use, namely the idiom principle and the open-choice principle, differentiated on the basis of storage versus computation (cf. Sinclair : –). The former denotes the fact that native speakers of a language have stored and use a large number of virtually prefabricated lexically specific syntactic segments. If these are experienced sufficiently frequently, they can be recalled as practically one choice rather than being constructed by filling syntactic structures with words (cf. Sinclair : –, Sinclair : ). That speakers are simultaneously aware of the grammatical structure of such segments is supported by the existence of fixed phrases and idioms allowing for some variability (e.g., take NP to task, get one’s head around NP).19 The open-choice principle refers to the ability of speakers to combine words online on the basis of the structural patterns (schemata or syntactic rules) they know. But, given the sheer number of lexical patterns, this principle must be assumed to be secondary (cf. Sinclair : ). These assumptions have not gone unchallenged and researchers have tried to collect independent empirical evidence for or against them. One such example is Snider and Arnon (), who hypothesize that in a lexical decision experiment, subjects should be faster with holistically stored units (complex words) than with expressions to be assembled (phrases, n-grams). The evidence is mixed, speaking for both storage and computation, and for both compositional and non-compositional phrases. This finding is difficult to accommodate in dual-system theories, which correlate storage (the lexicon) with non-compositionality/idiosyncrasy and computation (the grammar) with compositionality/regularity. Single-system theories, on the other hand, can account for the
19
The issue of storage and computation of entities larger than words is addressed in Jackendoff (), for example.
storage of frequent compositional units or patterns,20 and perhaps also for the computation of non-compositional expressions. Models of this type have been developed within usage-based research.
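The KWIC analyses mentioned above are straightforward to picture procedurally. The following minimal sketch (not part of the chapter; the toy corpus, the function name, and the window size are invented for illustration) prints each occurrence of a node word with its immediate left and right context, the format in which recurrent patterns such as ‘a(n) N of’ become visible.

```python
# A minimal key-word-in-context (KWIC) concordancer of the kind used in early
# corpus-linguistic work on lexical patterning. The toy corpus, the function
# name, and the window size are invented for illustration.

def kwic(tokens, keyword, window=3):
    """Return each occurrence of `keyword` with `window` tokens of context."""
    lines = []
    for i, token in enumerate(tokens):
        if token.lower() == keyword.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            lines.append(f"{left:>30} [{token}] {right}")
    return lines

corpus = ("a bunch of flowers stood on the table and "
          "she gave him a bunch of keys before she left").split()

for line in kwic(corpus, "bunch"):
    print(line)
# prints, e.g.:
#                              a [bunch] of flowers stood
#                     gave him a [bunch] of keys before
```

Real concordancers additionally sort the aligned context columns, which is what makes shared collocates and structural slots stand out to the analyst.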
.. Usage-based theories Usage-based theories give prominence to language use21 and aim at showing how a speaker’s knowledge of language can ‘emerge’ from usage, thus obviating the need for assuming innate linguistic knowledge for the explanation of language acquisition. Early arguments about the validity of such an idea, such as Hopper (, ), used the term ‘emergent grammar’, borrowing the term ‘emergent’ from cultural anthropology. Hopper (: ) uses this term to suggest that ‘structure, or regularity, comes out of discourse and is shaped by discourse as much as it shapes discourse in an on-going process’. This implies that grammar is dynamic and that there is no final or ‘steady state’ in a speaker’s grammatical competence, thus also accounting for such phenomena as linguistic variation and language change across the lifespan (as investigated by Sankoff and Blondeau , for example). MacWhinney () substantiates the concept of emergence by analysing some types of the (biological, psychological, and language) processes involved. He elaborates on how linguistic structure can plausibly be assumed to emerge as a flexible and dynamic system, adapting itself to usage (cf. Bybee and Hopper : f.). Some particulars of how language experience can shape a speaker’s grammar were addressed by Bybee (, , ), for example, who understands grammar ‘as the cognitive organization of one’s experience with language’ (Bybee : ). In her view, it is central to find out how linguistic structure can be derived ‘from the application of domain-general processes’, that is, processes known to be part of the human cognitive apparatus in general, such as categorization or chunking (Bybee : , ). For the cognitive representation of language experience, Bybee suggests an exemplar-based model, where each speech event is recorded in the mind in a huge network of exemplars that emerges on the basis of associations between them (for a detailed description, see Bybee ). In our discussion of the notions of lexicon and grammar it is crucial to consider what such assumptions mean for the issue of storage versus computation, as well as the (non-)compositionality issue reported in the previous section. As for storage, it is assumed that speakers store and access the units of usage (Bybee and Hopper : ), 20
The role played by frequency of experience for storing units is debated at length in Divjak’s () article on frequency and entrenchment. 21 As said above, the idea of basing linguistic inquiry on language use has been common practice in the functional-linguistic framework. A lot of this work is situated in the field of language typology, such as Greenberg (), Givón (b), Haspelmath (, ), for example, and cognitive linguistics, such as Langacker (, ) and Goldberg (, a).
such as words (rain; exceptional), collocations (acid/tropical/heavy rain; exceptional circumstances/cases/quality/talent), words with slots around them (X gonna V; V X into V-ing (Z)), and also schemata (such as Det + N; P + NP), with the complete network depending on speakers’ linguistic experience. As for computation, speakers are assumed to unify stored schematic or partially schematic units with stored specific (i.e., lexical) material, such that the lexical items ‘fill’ the unspecified elements in the stored (partially) schematic patterns (cf. also Bybee and Hopper : ). Another consequence of exemplar storage would be the availability of the instances of a syntactic construction, even when a generalization has been made (Bybee : ). This suggests that it does not matter for storage if some form is lexical (specific) or grammatical (schematic). What matters is frequency of experience, because repetitive experience is assumed to strengthen the stored representation.22 Given these assumptions, usage-based accounts do not have to compartmentalize speech units into lexicon and/or grammar. Instead, considering such usage phenomena as token and type frequencies, they differentiate conventional units with respect to their specificity, complexity, flexibility, or productivity. These units contain material of all ‘classical’ sorts: lexical, morphological, and syntactic. To my knowledge, the most comprehensive models of language working from these premises have been designed by cognitive-functional linguists, such as Ronald Langacker and Adele Goldberg.23 As usage-based models, they naturally draw on usage data for the elucidation of knowledge of language.24 Moreover, their assumption of ‘meaningful grammar’ (cf. Croft and Cruse : ) marks them as accounts of language which consider lexicon and grammar to contain the same types of elements, namely signs (cf. section .). In what follows, we will look at Langacker’s () and Goldberg’s (, a) models of language, focusing on what can be concluded about lexical and grammatical knowledge from their usage-based perspective and what this means for the concepts of lexicon and grammar.25
... Langacker’s Cognitive Grammar In Langacker’s model, ‘the actual use of the linguistic system and a speaker’s knowledge of this use’ (Langacker : ) have been prominent from early on. Being exposed to language, a speaker is assumed to acquire (and know) a hierarchically structured inventory of conventional linguistic units (cf. Langacker : , : f.), with 22
The relationship between frequency and entrenchment is currently a hotly debated issue in linguistics. The reader is referred to Bybee , Schmid , Blumenthal-Dramé , and Divjak , for example. 23 Their names are associated with the cognitive-linguistic approach, emerging in the s as an alternative proposal to the formalist generative framework. 24 Usage data are not just corpora (as products of language use), but also include data informative about language use as process, i.e. data collected during language processing, such as observations, experimentally elicited data, etc. 25 For an earlier argument on this topic, see Schönefeld (: –).
lexical entries at the most specific (bottom) level and schematic constructions at the top. These units are understood as cognitive routines, i.e. as stored or entrenched units whose use is a matter of retrieval rather than computation. This is uncontroversial for elements traditionally associated with the lexicon, but what about schematic and partially schematic constructions? The former compare to the structural patterns or rules discussed earlier in the text (such as SVO (or NP V NP) for transitive clauses; Det + N for NP), the latter to structural patterns associated with particular words (such as English ‘V one’s way to X’ (the way-construction), ‘V Y into V-ing (Z)’ (the intocausative),26 or ‘a bunch of N’). In Langacker’s model (and from a usage-based perspective), the (partially) schematic constructions are assumed to emerge from the experience of a number of similar utterances, or parts thereof, by generalization/abstraction from what is variable between them (see Langacker : f.; Bybee and Hopper : ). They are stored as (partial) schemata which are retrievable as unitary entities, and they unify with other, less, or non-schematic linguistic units to represent fully-fledged usage events. For example, the experience of such units as read a book/a letter/an article/a story etc., may trigger schematization of the variable slot N (giving the partial schema read a(n) N), and experiencing other determiners may trigger read NP (i.e. Det + N). Such assumptions have received strong experimental backing in psycholinguistic research into language acquisition, as carried out by Tomasello (e.g., ), Lieven (e.g., , ), and many others. All this amounts to understanding linguistic knowledge as a network of signs organized along a cline of specificity (degree of abstraction/schematization). Besides this, the signs available to speakers differ with respect to entrenchment (stored or constructed) and symbolic complexity (simple or complex). These parameters jointly determine the place of a unit in the network. The typical units at the lexical end, the traditional lexicon, are highly specific (less complex) and entrenched. The typical units at the grammar end (the fully productive schemata in the classical sense of syntactic rules) are characterized by complexity and schematicity and need to be unified with specific units in a usage event. Quite naturally, partially schematic constructions containing both lexical and schematic material find their place in the middle, as entrenched and rather complex, but not yet fully specific, units (see also Langacker : ).27 They are numerous and were shown to be the ones problematic to position in models stipulating a strict distinction between lexicon and grammar. Langacker’s model comprises them all, positing for cognitive grammar a gradation uniting lexicon, morphology, and syntax (cf. Langacker : ), and suggesting ‘that any specific line of demarcation would be arbitrary’ (Langacker : ). The explicit incorporation of morphology into the cline follows from the fact that classical morphology exhibits the same typology of constructions. They may be fully 26 The reader will recognize the allusion of such linguistic units to the notion of construction in the traditional sense, where constructions are usually treated as special structures rather than signs. 27 The (more) schematic constructions account for the productivity of language.
specific (move-ment, invest-ment) and hence lexical, or (partially) schematic (V-mentN; V-suffixN), and hence (more) grammatical. Finally, Langacker also argues that the emergence of schematic units by abstraction from (variable) instances suggests the co-existence in a speaker’s mind of both instances and schemata (e.g., movement, as well as V-mentN, V-suffixN; or give X a kiss, make a statement as well as V NP NP and V NP respectively). This explains why for a large number of speakers of English, such expressions are retrieved from storage, although they can just as well be assembled from words and the respective schemata. More specific sub-schemata, such as give NP NP or send NP NP instantiating the more general V NP NP, may also be stored, providing the user with a network of interrelated schemata at various levels of schematicity. This conception of linguistic knowledge makes the cognitive approach to language description both maximalist and nonreductive (cf. Langacker : ) in the sense that the abstraction of schemata does not erase the underlying more specific instantiations from memory.
... Goldbergian Construction Grammar28
Goldberg (, a) also makes a strong argument for the existence of syntactic structures (e.g., the ditransitive construction) as signs, claiming that ‘[p]articular semantic structures together with their associated formal expression must be recognized as constructions independent of the lexical items which instantiate them’ (Goldberg : ). She uses the term construction29 to include all linguistic signs (from morphemes to clause-level constructions), irrespective of their level of specificity/schematicity. With this, she takes a position comparable to Langacker’s in acknowledging the existence in speakers’ linguistic knowledge of both specific expressions (instances or instantiations) and schemata. She makes a point of motivating clause-level constructions by the experience of humanly relevant basic scenes or event types,30 and assumes them to co-exist with lexical items that are associated with the more richly specified concepts of particular events (such as give, hand or tell for the transfer-of-object scene). Her claims are tested on data from language acquisition (Goldberg : , f.), where she finds that the child can recognize (schematic) clause-level constructions, if a sufficiently large amount of lexical variation is encountered in expressions of the respective basic scenes. As a result, the child will know individual instances, partially schematic constructions, in addition to the (generalized) clause-level construction (Goldberg : ), providing for both the ability to construct a complex expression from more elementary units and its memorization if it is entrenched as a fully specified construction. The same procedure can account for acquiring knowledge of morphological constructions. 28
See also Hilpert, this volume, on constructional approaches to English grammar. 29 A survey of the different readings of the term in linguistics is given in Schönefeld (). 30 For example, the ditransitive is motivated by the scene of someone causing transfer of an object to someone else (cf. Goldberg : , Goldberg ).
A final aspect to be mentioned relates to the interaction of components in partially and fully schematic constructions, where schematic slots and lexical fillers need to unify. As first suggested by corpus-based investigations,31 this process is constrained in that schematic slots in constructions, e.g., the verb slot in the ditransitive, prefer particular lexical fillers (here: particular verbs) over others of the same word category. Conversely, particular words also have preferences for the constructional slots they fill. Such preferences have been operationalized by using association measures, such as collostruction strength, introduced by Gries and Stefanowitsch (Stefanowitsch and Gries , , , Gries and Stefanowitsch a, b). This measure has been found to be a good indicator of the attractions between words and schemata, accounting in part for the effects in language use that Sinclair described as the ‘idiom principle’ (see section .. above). Other association measures have been suggested, such as Delta P. Unlike collostruction strength, indicating mutual attraction, Delta P is informative of the direction of the attraction: from schema to word or vice versa.32
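As a rough illustration of what such a directional measure involves, the sketch below computes Delta P in both directions from a 2x2 co-occurrence table; the counts for give and the ditransitive are invented for illustration and are not taken from the studies cited.

```python
# Delta P for a word-construction pair, computed from a 2x2 co-occurrence
# table. The frequencies below are invented for illustration; in practice
# they would be counted from a parsed corpus.

def delta_p(a, b, c, d):
    """a: construction tokens containing the word, b: construction tokens
    with other words, c: the word outside the construction, d: everything else."""
    word_given_cx = a / (a + b) - c / (c + d)   # how strongly the construction cues the word
    cx_given_word = a / (a + c) - b / (b + d)   # how strongly the word cues the construction
    return word_given_cx, cx_given_word

# hypothetical counts for 'give' and the ditransitive [V NP NP]
cue_word, cue_construction = delta_p(a=461, b=1_039, c=3_200, d=995_300)
print(f"construction -> word: {cue_word:.3f}")
print(f"word -> construction: {cue_construction:.3f}")
```

A large value in one direction paired with a small value in the other would indicate, for instance, that the construction is a good cue for the word while the word is only a weak cue for the construction.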
. S
.................................................................................................................................. I will now summarize the main points that emerge from this survey of linguistic theories for understanding the lexicon-grammar issue. Dual-system theories, assuming a dichotomy of lexis and grammar, are widely found within the generative formalist framework. In such models, the allocation of a linguistic unit to lexis or grammar is an either-or decision, most commonly based on such criteria as an item’s regularity/compositionality and idiosyncrasy, although the individual models are quite different with respect to what they qualify as lexical or grammatical. Single-system models, assuming a cline between lexis and grammar, are more naturally related to functionally oriented approaches. They can provide for knowledge of linguistic units, such as the phrase answer the door, as both a lexical and grammatical pattern, covering the fact that it is known as a collocation (i.e., a lexical/specific phrase) and is simultaneously associated with a schematic template, namely [V NP]. Despite these different ‘solutions’ to the issue, some of the criteria used for deciding on a linguistic unit’s nature as lexical or grammatical are substantially similar, such as the criteria of storage, specificity, and irregularity (for lexis) and computation, schematicity, and regularity (for grammar). However, as the previous example suggests, these criteria do not always converge, i.e., schematic units may be stored as lexically specific instances, and specific instances, such as the word undoability, may be computed on the basis of a schema ([Vtrans –ability]) rather than retrieved from memory. Such units have features of both lexicon and grammar, which is inconsistent with a dichotomous view. If, however, the assumption is that lexicon and grammar are
31 For a broad discussion of how to generalize from corpus data, see Wallis, this volume. 32 For an informative discussion of such measures, see Gries .
positioned along a continuum and the criteria jointly constitute the prototypes of lexis and grammar, there is room for less typical units as well. At one end of the cline, we have the lexical units, which are specific, unpredictable formmeaning pairings (such as words and morphemes), retrieved from storage. At the other end sit grammatical units, which are schematic, regular (such as phrasal and clausal schemata (or syntactic structure)), and employed in computational processes. In between we find partially schematic units (of various complexity) containing both lexical and schematic information, such as the into-causative [V X into V-ing (Z)]. The mid-level is also occupied by morphological constructions, such as [V-mentN], drawing on material from the lexicon and arranging it according to structural templates of co-occurrence similar to those at the syntactic end in principle.33 All in all, this understanding is better compatible with what usage data reveal about the (patterned) nature of linguistic knowledge and how it is used in communication— also in light of recent findings in the neurosciences. In a paper discussing the neuronal plausibility of grammatical theories, in particular of usage-based construction grammar, Pulvermüller et al. () test neuroscientific evidence for some of its basic tenets, among them the assumed lexicon-syntax continuum. Experimental findings demonstrate that, for example, a particular brain response, called Mismatch Negativity, shows up differently depending on whether it is tested in the comprehension of words/pseudowords or grammatical/ungrammatical strings. The authors conclude (p. f.) that ‘word-level units (“lexical items”) . . . are very different things, in neuromechanistic terms [i.e. in terms of the brain mechanisms underlying them, DS], from above-wordlevel units’, and argue that they should be kept distinct. In their view, this conclusion does not rule out a continuum view, if prototypical lexical and grammatical units are understood to sit at its opposite ends. The space in between is occupied by the linguistic units (as cognitive routines) containing both lexical and grammatical material. The importance and the variety of such ‘mixed’ constructions has amply been shown by research in usage-based construction grammar, suggesting a populous area at the mid-level of schematicity in a network of constructions. They comprise polymorphemic words, phrases, and constructional idioms with schematic slots (such as the English way-construction). Studies in morphology, such as Booij (), also cannot but notice the mixed nature of a language’s linguistic units. Focussing on the differences and commonalities of ‘word level constructs’ and ‘phrase level constructs’ from a construction-grammar perspective, he finds that syntactic and morphological schemata are crucially employed in the making of lexical expressions, so that a strict boundary cannot be drawn (cf. Booij : , ). Therefore, it seems more appropriate to consider a speaker’s knowledge of language to comprise lexical and grammatical units (as signs with different properties) as well as 33 This similarity is corroborated by findings in grammaticalization research, where it has long been shown that a language’s morphology is related to earlier forms of syntactic co-occurrence (cf. Givón , for example).
many ‘mixed’ units containing lexical and grammatical elements (exhibiting properties of both). In this sense, the relation between lexis and grammar can be understood in terms of Aarts’ (a: ) ‘intersective gradience’ as ‘the occurrence of situations in which elements conform to an intersection of sets of properties, rather than an intersection of categories’. According to the usage-based theories surveyed here, the picture of lexis and grammar is more complex than traditionally assumed. A language does not only have a (classical) lexicon, housing the stored units with an unpredictable form-function pairing (from morpheme to idiom) and a (classical) grammar, containing the rules for regular combinations (from complex word to clause), but also a large number of constructs showing ‘mixed’ features, such as storage and computation, or specificity and schematicity. The importance of such mixed units in a speaker’s knowledge of language is not a new discovery. It became apparent in studies of language in use, for example, in Sinclair’s () postulation of the idiom-principle (cf. section ..). Along similar lines, Bybee (: f.) argued that speakers make use of a large number of formulaic expressions, rather than employing abstract rules, and that these are a mark of native competence beyond knowledge of a large vocabulary and the command of grammatical rules. With the development of usage-based theories, these findings could be incorporated into a theoretical framework of language as a whole, systematically accounting for mixed-type constructions of all types (complex words, collocations, constructional idioms etc.) as well. They add to the (stored) ‘idiosyncratic’ part of a language and leave a lot less room for the (computational) ‘free’, or unconstrained, use of the rules/schemata of grammar in language use.
Acknowledgements
The author would like to thank the anonymous reviewers and the editors of this book for extremely helpful comments and suggestions on earlier versions of the chapter, whose quality was greatly enhanced by their constructive criticism. Needless to say, all remaining errors and inconsistencies are the responsibility of the author.
......................................................................................................................
......................................................................................................................
. A
.................................................................................................................................. The title of this chapter is an oxymoron for many phonologists. If the hallmark of a grammar is that it involves linguistically-defined structures and rule-governed systems across those structures, then it is not hard to argue that phonology is as much a part of the grammar as the narrow ‘grammar’ of morphosyntax, since the patterning of sounds within and between varieties of English (and other languages) is systematic and amenable to structural analysis. Although we will focus in this chapter on phonology in the speech modality, the same generalization (that structures are linguistically-defined and operate within rule-governed systems) holds also of sign languages (e.g., Brentari ). This parallel across different modalities may be a clue to how phonology could be perceived as being on the periphery of the grammar ‘proper’: phonology is obliged to take seriously the question of how grounded a grammar is in the physical outworking of abstract cognitive or computational categories, and phonological argumentation is increasingly explored by means of phonetic evidence, in laboratory phonology approaches (Pierrehumbert et al. ). Many of the key debates in phonology revolve around the question of whether phonology and syntax are or should be either ‘autonomous from, or grounded on, extra-linguistic reality’, with phonologists holding a range of positions (Bermúdez-Otero and Honeybone : ), from the view that syntax and phonology are both autonomous (van der Hulst ), or that they are both grounded (Anderson , Bermúdez-Otero and Börjars ), or that syntax is autonomous and phonology is grounded (Burton-Roberts and Poole ). The title of this chapter places phonology outside of the grammar, and thus expresses tacit adoption of the last of these three positions (autonomous syntax versus grounded phonology). This title is thus an incomplete reflection of the range of views held by phonologists, though it may reflect a view commonly (if not exclusively) held by those working on morphosyntax.
This tension is exemplified on the theoretical level in the disjoint between the claim that the goal of linguistic analysis is to account for linguistic competence rather than performance (Chomsky ) and the very large body of work on formal structural analysis of phonological patterns within the Generative Linguistics tradition (e.g., Kenstowicz ). On a more mundane level, it may be that phonological patterns are less accessible due to the simple fact of not being written down, resulting in primacy of the written form (text) over spoken language (speech). Although this seems implausible, we cite the example of English Language A-level specifications in the UK, which claim to engage advanced high school learners in analysis of spoken language, in the form of transcriptions of conversations, but in which it is rarely the sound patterns in those texts that are the focus of attention. Our response to the challenge of exploring the interaction of grammar and phonology in a single chapter is to focus on phenomena which, in our view, present the strongest challenge to the notion that English grammar can be investigated without reference to phonology. In other words, we seek to show when and why those who work on grammar, as narrowly defined (morphosyntax), could or should pay close attention to the phonological facts. In section . and section . we provide an overview of phenomena at the interfaces of phonology with morphosyntax, at the word and sentence level, respectively, across varieties of English, along with a snapshot of the theoretical debates which they have inspired. We then present a case study in section . of the role of phonological properties in word class categorization in English, which argues for inclusion of a phonological component within the grammar as narrowly defined, if an adequately rich description of speakers’ linguistic knowledge and behaviour is to be achieved. Section . provides an overview of how phonology is handled in two distinct grammatical frameworks, while section . brings the chapter to a close with some final reflections.
. W
.................................................................................................................................. The precise definition of what constitutes a word is much debated, but there is good evidence for a basic distinction between a phonological word (identified by phonological features) and a grammatical word (identified by grammatical features); the mapping between the two varies across languages, but can always be clearly defined (Dixon and Aikhenvald ). From the listener’s point of view, the main difficulty in identifying phonological words is that speech is presented to us as a continuous stream of sounds, as visualized in (). ()
thespeechstreamiscontinuousakeytaskinparsingspeechistodeterminewherethewordsbeginandend
Arguably the most important question that phonologists must answer is thus the following: how do listeners determine where words begin and end? One answer
comes from phonotactics, that is, the regularities in the combinations of sounds occurring in different positions in a syllable (and thus in a word). Phonotactic regularities provide cues for use in the listener’s task of word segmentation (Kaye ), which is the process of determining at which points in the continuous speech stream to attempt lexical retrieval. In English, phonotactic cues are a useful indicator of what is a possible word, precisely because there are phonological constraints on what is a possible word. English speakers, even without linguistic training, know that ‘blint’ is a possible word in English (even though it is not a real word), but that ‘bnint’ is not a possible word. It follows, then, that all and any instances of the sequence [bn] heard in a stream of English speech must contain a word boundary between the [b] and [n], since [bn] is not a possible word-initial or word-final consonant sequence in English. The impossibility of word-initial [bn] in English is an example of phonotactic constraints at the segmental level (involving individual vowels and consonants and their combinations). A speaker’s judgement of the grammaticality of ‘blint’, and the ungrammaticality of ‘bnint’, is likely to be just as clear as their acceptance of a word containing only phonemes found in English and their rejection of a word containing one or more non-English phonemes. Phonological knowledge thus encodes the ‘grammar’ not only of which units are permitted in the language (phonemes) but also of how they may combine (phonotactics). There are equally powerful prosodic or suprasegmental constraints on what is a possible word. All English words are obligatorily comprised of at least one stress foot, comprising either one heavy syllable or two light syllables; syllables which count as heavy in English include those which are closed by a coda consonant (CVC) or which contain a long vowel or diphthong (CVV). Thus, [li] (with a short vowel CV, not a long vowel CVV) is not a possible word in English, because a viable word in English must be heavy enough to be stressed. The word ‘lit’ [li] ‘bed’ is fine in French, of course, and this may be because French is generally agreed to be a language without word-level lexical stress (Delais-Roussarie et al. ). French thus places fewer restrictions on the prosodic shape of possible words, to the extent that words may be comprised of a light syllable, and thus can be as small as a single segment, e.g., ‘eau’ [o]. The position of lexical word stress is governed by a number of interacting factors (Guion et al. ), but stressed syllables prove nevertheless to be a useful cue in word segmentation. In a seminal study, Cutler and Norris () found that English listeners’ attempts at lexical retrieval in a word-spotting task showed sensitivity to stressed syllables in target words, rather than to all syllables (the latter is the pattern shown by French listeners). However, Mattys et al. () show in a series of lexical decision tasks with British English listeners that higher order ‘top-down’ (non-phonological) information is also used in word segmentation; they found that listeners used both lexical and contextual semantic information to predict what type of word might be coming next in the stream of speech, in deciding the points at which to attempt lexical retrieval. 
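The two kinds of ‘possible word’ checks described earlier in this section, the ban on onsets such as [bn] and the requirement of at least one heavy syllable, can be pictured as a small decision procedure. The sketch below is an invented illustration under simplifying assumptions (a hand-picked set of illegal onsets and a toy syllable representation), not a serious phonotactic model of English.

```python
# Two toy 'possible word' checks: an invented, non-exhaustive set of illegal
# word-initial onsets, and the requirement that a word contain at least one
# heavy syllable (closed, or with a long vowel or diphthong). Transcriptions
# are simplified strings, not real IPA parsing.

ILLEGAL_ONSETS = {"bn", "dl", "tl", "fs"}          # illustrative only

def legal_onset(word):
    return not any(word.startswith(onset) for onset in ILLEGAL_ONSETS)

def has_heavy_syllable(syllables):
    """Each syllable is an (onset, nucleus, coda) triple; heavy means a coda
    is present (CVC) or the nucleus is long or a diphthong (CVV)."""
    return any(coda or len(nucleus) > 1 for _, nucleus, coda in syllables)

print(legal_onset("blint"), legal_onset("bnint"))      # True False
print(has_heavy_syllable([("l", "i", "")]))            # False: [li] is too light
print(has_heavy_syllable([("l", "i", "t")]))           # True: closed syllable
print(has_heavy_syllable([("l", "i:", "")]))           # True: long vowel
```

On this kind of reasoning, any [bn] sequence encountered in the speech stream must straddle a word boundary, which is exactly the sort of cue a listener can exploit in segmentation.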
Mattys et al.’s study further shows that in ordinary listening conditions this top-down information is relied on more than ‘bottom-up’ cues, such as phonotactics or stress, with these latter relied on only in adverse listening conditions. As we discuss in section . below,
phonological regularities distinguish word classes more generally in English, suggesting that phonology may contribute to the so-called top-down processing also. Once the phonological word has been identified, the mapping from this to the grammatical word (or morphosyntactic word) is not always one-to-one. Selkirk () argues for a different structure for weak and strong pronouns in English, based on their ability to bear stress. Unstressed weak function words must be cliticized to the phonological word (or ‘Prosodic Word’, PWd), within a higher level of phrasing (here, the Major Phonological Phrase, MaP) as shown in (), in a recursive prosodic structure which mirrors syntactic structure. Phrase-final function words, on the other hand, are mapped to a PWd and can thus bear stress (Selkirk : ).
() a. English weak pronouns ([Fnc] = ‘function word’, [Lex] = ‘lexical word’): ( fnc (lex)PWd )MaP, as in “We sold the car”.
b. English phrase-final strong pronouns ([Fnc] = PWd): ( (lex)PWd (fnc)PWd )MaP, as in “We need him.”
[In the original, (a) and (b) pair these prosodic bracketings with the corresponding syntactic trees (DP, NP, VP, Det, N, V).]
The status of affixes in English is subject to debate, and there are different theoretical positions on how much of the patterning of affixes requires phonological explanation. For example, according to the Sound Pattern of English (Chomsky and Halle ), a pair of words like ‘electric’ and ‘electricity’ are related forms, and a rule of ‘velar softening’ is proposed to account for the [k]~[s] alternation between the two words. Other theories argue that there is no role for phonology in alternations of the type between ‘electric’ and ‘electricity’, which should instead be analysed as separate lexical items (Kaye ). A large body of work under the broad heading of Lexical Phonology has sought to account for the range of phonological effects observed when affixes are added to a stem or word in English. For example, some lexical affixes in English induce a stress shift (such as -al), but others do not (such as -ing): ()
parent [ˈpeәrәnt]    parental [pәˈrɛntәl]    parenting [ˈpeәrәntɪŋ]
The stress-shift is analysed in Lexical Phonology as due to cyclic application of rules, and the theory further differentiates lexical and postlexical affixes. These different levels of analysis are conceived as cycles in derivational models (Kiparsky b), and as strata in their more recent instantiation within Stratal Optimality Theory (Bermúdez-Otero ). In an early example of the ‘autonomy versus groundedness’ debate, postlexical rules were reanalysed as Phonetic Implementation Rules which fall outside the grammar (Liberman and Pierrehumbert ), but this was rebutted as an over-simplification by Kiparsky (: ff.) who argued that ‘even gradient application might not suffice to ban a process from the phonology’, since some processes are phonetically gradient but appear nevertheless to be perceived as categorical (see e.g., Jansen on voicing in English).
. S
.................................................................................................................................. The example in () above over-simplifies matters, of course, since we are more likely to perceive the utterance as laid out in () below, in which the right edges of potential prosodic phrases are marked with an upwards arrow:
() thespeechstreamiscontinuous ↑ akeytaskinparsingspeech (↑) istodeterminewherethewordsbeginandend ↑
The string of speech is broken up into prosodic ‘chunks’, known variously as phonological phrases, intonation units, or breath groups (Wells ), and classified as ‘tonality’ in the British School of intonation (Halliday b). In the examples in (), there are strong, and probably obligatory, boundaries at the right edge of full clauses (after ‘continuous’ and ‘end’); there is a weaker, probably optional, boundary (after ‘speech’) separating the subject constituent of the second clause from the verb phrase. Regular constellations of phonological and phonetic (i.e. acoustic) cues are observed at the edges of prosodic phrases. These cues include gradient phonetic effects such as lengthening of the final word in a phrase, as well as more salient categorical cues such as insertion of a pause or use of a particular intonational cue such as a rising or falling pitch contour. Other more subtle cues are also regularly found, such as increased tempo and amplitude (loudness) at the beginning of phrases, but slowing down and reduced amplitude at the end (Cruttenden ). The ‘Prosodic Bootstrapping hypothesis’ (Pinker ) claimed that children could make use of regularities in the mapping between syntactic and prosodic constituency to tap into the syntactic structure of the ambient language around them. Although it is indeed the case that infants are sensitive to some aspects of prosody from the earliest stages of language acquisition (e.g., Jusczyk et al. ), the hypothesis is argued to be overly simplistic, due to high cross-linguistic variability in the syntax-phonology
mapping in real language use (Gerken et al. , cf. Venditti et al. ). Nevertheless, there are sufficient trends in the alignment of syntactic and prosodic constituency— such as those sketched above in example ()—to tempt the proposal of ‘rules’ to predict the observed mappings in English (Chomsky and Halle , Halle and Vergnaud ), many of which posit a representational layer of phonological constituents known as the Prosodic Hierarchy (Nespor and Vogel , Selkirk , , ). Crucially, there are mappings which might be expected if phonological phrasing were determined entirely by syntax but which are not observed, and it is these lacunae which call for a ‘grammar’ of the syntax-phonology interface. For example, a well-known case of a phonological effect on syntactic phrasing is the case of Heavy Noun Phrase Shift, whereby a subject or object constituent is more likely to be moved to a peripheral position in the utterance if it is syntactically complex and/or prosodically heavy (Zec and Inkelas ).1 We can see an echo of this effect in the example in (), in which the subject of the second sentence can (optionally) be phrased into its own prosodic phrase (denoted by placing the upwards arrow within brackets). In contrast, in most contexts it would be odd to phrase the subject separately if it is short (that is, syntactically less complex and prosodically lighter) as in (): ()
thetask (?↑)
istodeterminewherethewordsbeginandend ↑
Although a break after the subject is possible in (), it is probably unusual (unless the subject is singled out as a contrastive topic, or in disfluent contexts such as hesitation), suggesting a preference for phrases of a minimal size, other things being equal. The existence of phonological constraints on the size of prosodic constituents is acknowledged by even the fiercest critics of the Prosodic Hierarchy (Scheer ), though there are many proposals in which prosodic constituency is derived entirely from the syntax (Cinque , Zubizaretta , Wagner ). Selkirk () models cases in English, analogous to (), in which prosodic constraints override a direct mapping from syntactic structure, by means of well-formedness constraints on the minimum size of prosodic phrases, within a wider model of variation in phrasing and in the distribution of sentence accents. The impact of top-down syntactic structure has also been shown to affect the finegrained phonetic realization of individual words. In a study of lexical items in a large longitudinal corpus of New Zealand English, Sóskuthy and Hay () show that those words which happen to occur most often at prosodic phrase edges (and thus undergo phonetic lengthening) are over time lengthened in all positions (including phrase medially). They thus argue for an integrated model of language which incorporates a production-perception loop. The existence of such a production-perception
1
See also Kaltenböck, this volume.
loop—whereby we hear what we say, and take account of what we hear—provides a mechanism by which syntax and phonology may routinely interact. With regard to the distribution of prosodic prominences or sentence accents, or ‘tonicity’ in Hallidayan terms, Bolinger () famously asserted that ‘accent is predictable (if you are a mind reader)’. Bolinger was making an important (and often overlooked) point, as there is rarely only one possible (and thus ‘grammatical’) way to produce an utterance. However, there are sufficient tendencies in how speakers do produce utterances, subject always to wider discourse and interactional goals (Wichmann , Walker ), to have inspired attempts to derive the position of prosodic prominences from the semantic and/or syntactic structure (Gussenhoven , Féry and Samek-Lodovici , Kratzer and Selkirk ).2 Ladd () argues, however, for the role of phonological representation in modelling patterns of sentence accent distribution, and the argument again comes from the existence of patterns that you will generally not (i.e. rarely) hear, in a particular context, or for speakers of a particular dialect. An example is the case of de-accenting; a classic example in English is shown in (), in which capital letters denote the word which bears the main prosodic prominence (or nuclear accent) in each clause: () a. I didn’t buy a red CAR, I bought a BLUE car. b. ?I didn’t buy a red CAR, I bought a blue CAR. In (a) the second instance of the noun ‘car’ is typically realized without a prosodic prominence, as shown. Katz and Selkirk () show that deaccenting of the second instance of ‘car’ in an example like this is due primarily to the fact that it is given rather than new information (Schwarzschild ), rather than by virtue of its position following a contrastive focus. A confound between these two potential triggers of reduced accentual prominence is present in many studies of English focus prosody (e.g., Xu and Xu ). The pronunciation in (b), without de-accenting of the repeated noun ‘car’, is likely to sound odd, or even unacceptable, to English listeners who are speakers of an ‘inner circle’ variety of English (Kachru ). Cruttenden () suggests that de-accenting of given information is not universal, since some languages resist de-accenting in the same context. Ladd () points out a typological divide between Germanic languages, which generally favour de-accenting of given material, and Romance languages, which generally resist de-accenting in the same context. De-accenting of given material is not the norm in all varieties of English, however: de-accenting of given noun phrases is resisted in Black South African English (Swerts and Zerbian ) and in other ‘outer circle’ varieties of English such as Indian English (Wiltshire and Harnsberger ), as illustrated in the example in () below (Gumperz , cited in Ladd : ):
2
If you don’t give me a I will have to buy a .
See also Kaltenböck (this volume, section .).
Finally, turning to accounts of Hallidayan ‘tone’, which treats the shape of pitch contours, there are some attempts at a ‘compositional semantics’ of English intonation (Pierrehumbert and Hirschberg , Bartels , Truckenbrodt , cf. Steedman ). However, it is not the case that the nature and distribution of possible intonational contours is common to all languages; for example, despite claims to the contrary (Bolinger a), questions do not ‘go up’ in all languages (Rialland ), nor do declaratives ‘fall’ in all varieties of English (e.g., Grabe et al. , who demonstrate the near ubiquity of rising intonation on declaratives in Belfast English). Since there are combinations of intonational tune (form) and meaning (function) which are unacceptable for some or all English speakers, it follows that there is a grammar of intonational tunes to be described for English. The ‘autonomy versus groundedness’ debate is an acute issue for intonational phonology, due to the interweaving of linguistic and extra-linguistic functions of intonation (Ladd ). As noted for postlexical rules above, in sentence phonology we also find cases of ‘meaningful gradient variation’ (Ladd ), in which a phonetically gradient feature is nevertheless perceived as categorical; this is precisely what Ladd and Morton () found in an experimental study on interpretation of variable pitch excursion (scaling) of sentence accents by English listeners as bearing focus (or not).
. C :
.................................................................................................................................. In this section, we present a more detailed case study of a single phenomenon, namely the treatment of word classes, to consider the role that the segmental content of words plays in their definition and classification (see also Hollmann, this volume). Some of the intersections between word classes and phonology are well known, and in some cases, a word class may be quite regularly associated with a distinctive phonological pattern. For example, stress generalizations for polysyllabic noun-verb pairs (e.g., insight [ˈɪnsaɪt] versus to incite [ɪnˈsaɪt]) show that nouns tend to have stress on the initial syllable, whereas verbs tend to have stress on the final syllable (see Kreidler , for an extended discussion). However, there are many other associations between grammatical classes and phonological structures, which we explore below. The purpose of this case study is to look at how formal-generative and functional-cognitive grammars treat phonological detail, showing that in word class categorizations these theories of grammar have focused almost exclusively on syntax (formal-generative grammars) and semantics (functional-cognitive grammars). These grammars are often taken to be somewhat opposed to each other, and so provide an interesting contrast in how they deal with phonology. Formalist grammars typically define word classes in terms of abstract syntactic features (e.g. Chomsky , Adger ) and distributional criteria (e.g., Aarts b). Functional and cognitively-orientated grammars are typically based on semantic properties (e.g. Halliday , Langacker , , a). Before looking at the
OUP CORRECTED PROOF – FINAL, 22/10/2019, SPi
more theoretical issues concerning the extent to which phonology is part of these grammars, and how grammars handle phonology, we discuss a body of empirical work on phonology and word classes. In line with the focus of previous research, we consider the open/closed-class distinction, noun-verb distinction, and adjectives. We consider evidence for the role of phonology in word class categorization from psycholinguistics and speech production and the extent to which this information has been incorporated into different grammatical frameworks. Typically, theoretical grammars keep syntax and phonology separate, treating them as being of a ‘very different nature’ (Bromberger and Halle : ). The same is true for language acquisition and how children attain implicit knowledge of grammatical categories, with research emphasizing semantic and syntactic information, at the expense of phonology. However, word classes (in particular, noun and verb categories) prove to differ from each other in an unexpected number of phonological ways. In an effort to ‘bridge the gap’ between the formal/functional paradigms, Kelly’s () work argues for lexical categorization in terms of distributional, semantic, and phonological properties. Distributional and syntactic cues play a role in lexical categorization, as shown in classic studies in which children demonstrate the ability to interpret morphophonologically relevant distributional cues in novel words (e.g., Brown , Berko ). However, Kelly argues that theoretical linguists have failed to properly acknowledge phonological evidence in theory-building, and, specifically, that they failed to incorporate phonological cues into considerations of word class categorization. There is now a substantial body of experimental evidence pointing to the role that phonological cues play (Kelly and Bock , Sereno and Jongman , Cassidy and Kelly , Kelly , , Shi et al. , Cassidy and Kelly , Durieux and Gillis , Monaghan et al. , Farmer et al. , Don and Erkelens ). Hollmann has also argued for the acknowledgement of phonology in this area of research (see Hollmann , and this volume); he identifies three research questions posed by those seeking to explore the grammar-phonology interface in lexical categorization—all of which can be answered in the affirmative (Hollmann : ): ()
a. Are phonological cues to grammatical class determination available to hearers, and can generalizations be made? In other words, do certain word classes display phonological generalisations? (e.g. Kelly and Bock ; Berg ; Monaghan et al. )
b. Can the cues be used in speech production/perception? (e.g. Shi et al. ; Durieux and Gillis ; Monaghan et al. )
c. Are they used? In other words, do listeners access phonological information in contributing to determining word classes? (e.g. Sereno and Jongman ; Cassidy and Kelly , ; Farmer et al. ; Don and Erkelens ).
Most findings in response to such questions are based on the distinction between open word classes (those that readily admit new members, such as nouns, verbs, and
adjectives) and closed word classes (those that do not admit new members, such as determiners and prepositions). For example, in English, one phonological feature of the open/closed-class distinction is manifested in prosodic structure, as already noted. Most open-class words contain at least one strong syllable (containing a full vowel), whereas closed-class words tend to consist of a single, weak syllable (containing a reduced vowel, typically a schwa). In terms of question (a) then, Monaghan et al. (: –) reviewed the literature to generate the findings presented in Table ., below (see these pages for the original Table . Word class-phonology generalizations (adapted from Monaghan et al. : –) #
Generalization
a
Word length in phonemes: open-class words are generally longer than closed-class words. Nouns are generally longer than verbs.
b
Word length in syllables: closed-class words have a minimal number of syllables. Nouns have more syllables than verbs.
c
Presence of stress: words with zero-stress are more likely to be closed-class than open-class.
d
Position of stress: disyllabic words with iambic stress (where the stress falls on the second syllable) are more likely to be verbs, as in [ɪnˈsaɪt]. Words with trochaic stress (where stress falls on the first syllable) are more likely to be nouns, as in [ˈɪnsaɪt].
e
Onset complexity: open-class words are more likely to have consonant clusters in the onset than closed-class words.
f
Word complexity: open-class words are more likely to have consonant clusters in the onset and codas of all syllables than closed-class words.
g
Proportion of vowel reduction: closed-class words are more likely to feature vowel reduction than open-class words.
h
Reduced first syllable: closed-class words are more likely to feature vowel reduction on the first syllable.
i
-ed inflection: adjectives are more likely than other word classes to end with a syllabified -ed. For example, learned is pronounced [lɛ:ˈnәd] as an adjective but [lɛ:nd] as a verb.
j
Coronals: closed-class words are more likely to contain coronal consonants than open-class words. Coronal consonants are sounds produced with the blade of the tongue raised, as in: /t, d, θ, ð, s, z, ʤ, ʧ, n, l, ɹ/.
k
Word initial /ð/: closed-class words are more likely to begin with /ð/ than open-class words.
l
Final voicing: if a word finishes in a consonant sound, then this is more likely to be voiced (articulated with the presence of vocal fold vibration) if the word is a noun rather than a verb.
m
Stressed vowel position: vowels occurring in stressed syllables tend to be back vowels in nouns and front vowels in verbs.
n
Vowel position: vowels in nouns tend to be back vowels, and vowels in verbs tend to be front vowels.
o
Vowel height: vowels in nouns tend to be low vowels, and vowels in verbs tend to be high vowels.
studies which are cited). In these studies, findings are generated from a corpus-based approach, where items in a corpus are analysed in terms of their phonological properties to generate statistical generalizations. Given the wide variability in human speech production across individuals, large amounts of data are needed to form such generalizations. Corpus linguistic approaches offer a clear methodological affordance here, given the large amount of data they can handle. We provide the whole list here to show the scope of the work and the types of phonological cues that have been investigated. Such information, of course, must be approached critically: the information presents tendencies in language rather than absolutes, and further empirical work is required to validate the findings both within English (on which the findings are based) and across different languages. Nevertheless, the findings point to the fact that phonological information shows distinctive patterns at word class level. The list shows findings related to three distinctions at word class level: open-closed, lexical-grammatical, and specific word class (nouns, verbs, and adjectives). In the list above, (a) to (d) are word level generalizations and (e) to (o) are syllable level generalizations. The generalizations in Table . suggest that there is a wide variety of evidence linking phonological cues to lexical categorization, certainly in English. Some generalizations are undoubtedly more reliable and stable than others—position of stress in nouns and verbs, for example, is a well-studied phenomenon (e.g., Liberman and Prince , Sherman ). It must be noted that although the literature has generally used the term phonological cues, this is taken to be closely aligned with acoustic phonetic cues. The next questions, (b) and (c) concern whether hearers can and do use these cues in speech production/perception and in language acquisition. Experimental evidence suggests that the answer to both questions is indeed ‘yes’, which points to a correlation between lexical categorization, phonological properties, and acoustic phonetic realizations. However, caution must be exercised: most of the studies above use isolated, nonce words as experimental stimuli, which limits the ecological validity of the findings. Further research is needed to establish the role of these cues, and whether acoustic information is exploited as a cue in naturally occurring discourse. In speech production, various studies have demonstrated that correspondences between phonological properties and lexical categorization are systematically observed. For example, when given disyllabic nonce words such as ‘ponveen’ (in a slot filled by either a noun or a verb), English speakers tend to adhere to the stress pattern typical of each word class (Baker and Smith , Kelly and Bock ). Hollmann (, ) reports on a series of experiments where participants produced nonce English nouns and verbs and used them in a sentence. Words were then analysed in terms of word length in syllables, mean syllable length in phonemes, final obstruent voicing, frequency of nasal consonants, frontness of the stressed vowel, height of the stressed vowel and presence versus absence of a final obstruent. Apart from mean syllable length and vowel height, results confirmed that speakers assume different phonological patterns for words they have categorized in different word classes.
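The corpus-based logic behind such generalizations can be pictured as a simple tally over a part-of-speech-tagged word list. The sketch below uses an invented miniature dataset to illustrate generalization (d) above (trochaic stress in disyllabic nouns, iambic stress in disyllabic verbs); the studies cited of course work over far larger lexicons and report statistical tests rather than raw proportions.

```python
# Tallying trochaic versus iambic stress in disyllabic nouns and verbs, the
# kind of count behind generalization (d). The word list and stress codes are
# invented for illustration; '10' = stress on the first syllable (trochaic),
# '01' = stress on the second syllable (iambic).
from collections import Counter

items = [
    ("insight", "noun", "10"), ("incite", "verb", "01"),
    ("record",  "noun", "10"), ("record", "verb", "01"),
    ("permit",  "noun", "10"), ("permit", "verb", "01"),
    ("window",  "noun", "10"), ("debate", "verb", "01"),
]

counts = Counter((pos, "trochaic" if stress == "10" else "iambic")
                 for _, pos, stress in items)

for pos in ("noun", "verb"):
    total = sum(n for (p, _), n in counts.items() if p == pos)
    print(f"{pos}: {counts[(pos, 'trochaic')]}/{total} trochaic")
```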
In speech perception, these correspondences are typically investigated by presenting human participants with nouns and verbs and testing their comprehension. Various studies reveal that hearers can classify tokens into their grammatical category quicker and more accurately when the words have the prototypical phonological properties of the particular word class (Sereno , Davis and Kelly ). Evidence also suggests that children learn phonological cues to lexical categorization by around – years of age (MacWhinney , Levy , Cassidy and Kelly ). Furthermore, Shi et al. () showed that new-born babies can discriminate function words from lexical words, using phonological and acoustic information. Whilst we have focused on noun-verb and open-closed class distinctions here, research has also demonstrated phonological correlates of adjective categorization. For example, in a study of the position of adjectives on the noun-verb continuum, Berg () reviewed the syntactic, pragmatic, and psycholinguistic evidence, pointing once again to the lack of phonological evidence in previous research. To challenge this, he compared six phonological criteria of uninflected nouns, verbs, and (attributive) adjectives in the Centre for Lexical Information (CELEX) corpus, which comprises nearly eighteen million words. The following criteria were examined: word length in number of syllables; trochaic versus iambic stress pattern in disyllabic words; stressedvowel fronting; final obstruent voicing and phonological realizations of -ate and -ed endings.3 Across all parameters, apart from cases of tri-syllabic words, adjectives are distributed ‘closer’ to nouns than verbs on a continuum. This, he argues, provides evidence against the ‘equidistance hypothesis’ (Berg : ) of lexical categorization, which claims that adjectives lie in the middle of the noun-verb continuum (as is typically assumed). Instead, Berg offers a theoretical ‘cross-level harmony constraint’ (: ), where word class categorization cuts ‘across’, and includes information at phonological, morphological, syntactic, and semantic levels. This constraint, Berg argues, is the ideal design of a multilevel system in language. Hollman () develops Berg’s work by adding the following criteria for consideration: mean syllable length in phonemes; nasal consonants; height of the stressed vowel, and presence versus absence of a final obstruent, all of which have been identified as additional parameters (see Hollmann : , and Table .). Hollman uses the British National Corpus (BNC), taking advantage of its larger sample size (one hundred million words) and its inclusion of spoken and written language. The results support Berg’s data and add further criticism of the equidistance hypothesis, in that five parameters (word length in number of syllables; number of syllables; front/ height of stressed vowel; trochaic versus iambic stress pattern in disyllabic words, and presence versus absence of a final obstruent) reveal that adjectives and nouns are not only phonologically closer (i.e. more similar) than adjectives and verbs, but are statistically indistinguishable. For the parameters of final obstruent voicing and nasal 3 Hollmann (: –) notes that the -ate ending and the -ed ending adjective-verb allomorphy is so rare that they may reveal nothing about word classes in general. He dismisses it as a parameter in his paper.
consonants, adjectives were closer to verbs. Speech perception research is yet to investigate how listeners make use of these patterns (as has been shown for the noun-verb distinction) and this would be a useful avenue of future research. The evidence from psycholinguistic, speech perception and production studies suggests that phonological cues are either potentially useful for, or shown to be used in, the task of lexical class categorization, and that phonological generalizations are consistently observed. Taken together then, evidence from speech perception and psycholinguistic experiments provides a strong case for the contribution of phonological factors to word class categorization, at least for open-closed, noun-verb, nounadjective, and adjective-verb distinctions. We now turn our attention to how theories of language incorporate phonology into the grammar.
. I
.................................................................................................................................. In section . we highlighted the distinction made in the literature between autonomous and grounded phonology, and we revisit this here in exploring to what extent phonology is or can be modelled as part of a grammar. This is a somewhat ambitious task for a short chapter, but we aim to at least sketch the most important ideas and issues, by examining two grammatical paradigms that are typically held in ‘opposition’ to each other: formal-generative and functional-cognitive.
.. Formal-generative grammars The objective of phonology in the generative tradition is to describe and account for language specific and language universal phenomena such as: phonological units (phonemes, features, segments, feet, syllables), the way that allophones of a given phoneme are deemed to be similar, and variation in realizations of morphemes in different syntactic and phonemic contexts. Early work in generative grammar (e.g., Chomsky , ) established the standard generative architecture, positing the autonomy of language as a distinct module in the mind, and language-internal modularity, with phonology and semantics kept separate, including elements of word class categorization.4 As previously alluded to in section ., Bermúdez-Otero and Honeybone (: –) suggest that the ‘autonomy of language’ argument is the chief factor which motivates linguists to keep phonology and grammar/syntax as separate domains, and determines where the dividing line is drawn.
4 See also Lohndal and Haegeman, this volume.
Figure . The inverted T-model: morphosyntax branching into phonology (form) and semantics (meaning)
The standard architecture of the grammar in generative phonology is the ‘inverted T-model’, shown in Figure .. Chomsky and Halle’s account of generative phonology makes extensive use of a metaphor, describing the modular architecture in terms of ‘computations’, ‘algorithms’, ‘mechanics’, and ‘robotics’ (Chomsky and Halle ). The inverted T-model and the metaphor present a grammar with morphosyntax as the central system, which in language production first concatenates words and clauses before sending the output to two separate modules: phonology (which computes form) and semantics (which computes meaning). A morphosyntactic string undergoes a process of cyclic derivation (or derivation by phase), with grammatical information processed by an inventory of phonological rules. The reverse procedure happens for language perception. As Scheer (: ) explains, the function of autonomous, generative phonology is the ‘translation back and forth between a physical signal and morphosyntactic structure’. Translation is key here, with different linguistic modules each operating as self-sealed systems which pass and translate information to each other. Modularity and derivation are typical hallmarks of strictly generative frameworks and come with various motivating factors. These include the idea that modules possess their own representational vocabulary, undergo linear computations which are inputoutput systems (i.e. A affects B, but B does not affect A) and, typically, do not permit interfaces between modules (see Scheer : – for an extended discussion of the advantages of modularity, in terms of language-mind modularity and languageinternal modularity). Though not without its detractors (McMahon , Scheer ), the dominant paradigm in formal phonological theory since the s has been Optimality Theory (OT, Prince and Smolensky ). OT is output-oriented rather than derivational: constraints on the output of the grammar take the place of derivational rules, and the relative ranking of constraints replaces the ordering of rules. Nevertheless, phonological analysis in OT is typically still generative in its aims and scope. The key difference between OT and rule-based paradigms is the status of the input to the grammar. In classical generative phonology, the underlying form of a sound or word must be deduced; the analysis demonstrates the ordering of the rules involved in deriving the surface form from the underlying form (usually using a worked example, showing the application of each rule in turn to the underlying form). In OT, the input is theoretically indeterminate, due to ‘Richness of the Base’ (discussed further in the examples that follow): an analysis consists of showing which ranking of relevant constraints
successfully excludes potential surface forms which are not in fact observed; the analysis is presented in a table, showing on each row how a candidate surface form is evaluated by a set of relevant constraints. An illustrative example for each approach is shown in (–) below. (a) shows a simple derivational analysis of categorical ‘glottal replacement’ in UK English (Smith and Holmes-Elliott ), in which ‘butter’ /bʌtә/ is realized as [bʌʔә] when /t/ occurs foot-internally between two vowels (or sonorants); (b) shows the distributionally analogous process of ‘t-flapping’ in US English, which is argued by Iverson and Ahn () to result from the cumulative effects of a phonological rule of Coronal Lenition and a phonetic shortening effect (which they also call ‘flapping’). The simplified tableau in () illustrates the interaction of a putative markedness constraint (which mitigates against fortis realizations of coronal stops in foot-internal position) with faithfulness to the putative input form (such as I). The choice of input in () is used to demonstrate which constraint ranking is needed to correctly model the observed surface output, but if the ranking is correct, positing a different input (such as surface-true [bʌʔә]) should still yield the correct result (vacuously). ()
UK English glottal replacement and US English flapping in a rule-based analysis

    a. Glottal Replacement: /t/ → [ʔ] / [ . . . [V] __ [V] . . . ]Foot

       Underlying representation        /bʌtә/ 'butter' (UK English)
       Glottal replacement              [bʌʔә]
       Surface representation           [bʌʔә]

    b. Coronal Lenition: /t/ → [d̥] / [ . . . [vocalic] __ [sonorant] . . . ]Foot (after Iverson and Ahn )

       Underlying representation            /bʌtә˞/ 'butter' (US English)
       Coronal lenition                     [bʌd̥ә˞]
       Flapping (phonetic shortening)       [bʌɾә˞]
       Surface representation               [bʌɾә˞]

()  UK English glottal replacement in Optimality Theory (simplified)

       /bʌtә/ 'butter'      M      F
    ☞  a) [bʌʔә]                   *
       b) [bʌtә]            *
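The evaluation logic of the tableau can be made explicit with a minimal sketch. This is not the implementation used in the literature (automated tools are normally employed); the constraint definitions below are crude stand-ins for M and F, and the candidates are the two from the tableau.

```python
# Minimal sketch of OT-style evaluation: constraints map a candidate to a number
# of violations, and candidates are compared constraint by constraint in ranking
# order (here M ranked above F). Constraint definitions are illustrative only.

def markedness_no_fortis_t(candidate: str) -> int:
    """M: penalize a foot-internal fortis [t] (very crude string check)."""
    return candidate.count("t")

def faithfulness_to_input(candidate: str, underlying: str = "bʌtә") -> int:
    """F: one violation per segment changed relative to the posited input."""
    return sum(1 for a, b in zip(candidate, underlying) if a != b)

def evaluate(candidates, ranked_constraints):
    """Return the candidate with the best (lexicographically smallest) violation profile."""
    profiles = {c: tuple(con(c) for con in ranked_constraints) for c in candidates}
    return min(profiles, key=profiles.get), profiles

winner, profiles = evaluate(["bʌʔә", "bʌtә"],
                            [markedness_no_fortis_t, faithfulness_to_input])
print(profiles)   # {'bʌʔә': (0, 1), 'bʌtә': (1, 0)}
print(winner)     # bʌʔә -- with M ranked above F, glottal replacement wins
```

Replacing the fixed constraint order with a sampled or numerically weighted one would give the variable-ranking and weighted-constraint variants mentioned at the end of this section.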
OT analysts typically model the relationship between only plausible inputs and relevant candidates (i.e. which would violate a constraint that the analyst wishes to determine the ranking of). The strength of an analysis is thus dependent on the choice of candidates, though this can be partially remedied by automated computation (Boersma and Weenink –, Hayes et al. ). Classical generative phonology relied on Morpheme Structure Conditions (MSC) to account for the ungrammaticality of ‘bnint’ in English (Halle , Chomsky and Halle ), but the resulting analysis suffers from redundancy, if word-initial [bn] might be ruled out by some more general phonological rule, and also duplication, if such a rule can be shown to be independently needed (Booij ). In OT the principle of Richness of the Base excludes the existence of constraints on input forms: all surface patterns are claimed to result from the interaction of violable constraints, including English listeners’ knowledge that ‘bnint’ is not a possible word in English; differences between languages or dialects in what is a possible word are due to differences in ranking of the same set of putatively universal constraints. OT is typically associated with the assumption that the constraint set is universal, and thus innate and part of Universal Grammar; this is not necessarily entailed by the underpinning logic of OT, however, and proposals exist for mechanisms by which at least some constraints may emerge from use (Bermúdez-Otero and Börjars ). This re-framing of the origin of constraints is possible because OT is primarily a theory of computation, not of representations (Moreton ). In OT, the details of representation reside in the precise definition of constraints, and its computational power is as readily weakened by poor practice as any other theory: imprecise definition of constraints and unregulated proposal of highly context-specific constraints have attracted criticism. Finally, because OT is a theory of computation, rather than strictly a theory of phonology, it is applicable to any module of the grammar, and is widely implemented also for syntax. As a result, OT is well-suited for modelling the interaction between different modules of the grammar, since constraints of different types can be treated within the same analysis. Approaches of this type attract fierce criticism from analysts committed to modularity; a specific example is rejection of interface constraints on the mapping between syntactic and prosodic structure (Selkirk , ) which violate modularity (Scheer ). The computational architecture of OT has been amended by some analysts to allow variable ranking of constraints to model variation (Anttila and Cho ), or weighting of constraints to model probabilistic phonological generalizations (Hayes and Wilson , Pater ).
.. Cognitive-functional grammars Cognitive linguists do not categorize and separate language in terms of modular ‘layers’ of phonology, morphology, and syntax. Instead, a cognitive approach argues that aspects of language exist on a continuum. In cognitive linguistics, this is referred to as the generalization commitment (Evans and Green : ), which posits that although it can be useful to treat different levels of language separately, it is equally if
not more beneficial to investigate how linguistic knowledge emerges from a common set of human cognitive abilities, rather than assuming they exist in autonomous modules of the mind. An obvious practical implication of this is that different areas of linguistic study can be more easily integrated into the same theoretical description. For cognitive linguists working on phonology (e.g. Bybee , Välimaa-Blum , Pierrehumbert , Nathan , Mompean ), phonemes, syllables (and other higher order units) are abstract schemas, or generalizations, accrued over several instances of sounds which emerge through repeated production and perception. The accrual of schemas in this way reflects the usage-based view of language which cognitive linguists commit to, in which language is construed as a vast and complex network of phonological, lexical, and semantic relations. Usage-based phonology, which shares some of the commitments of cognitive linguistics, has made advances in clarifying how phonology may emerge from use (Bybee , Pierrehumbert , Bermúdez-Otero and Börjars ). In cognitive phonology, an allophone is a sub-schema of the phonemic super-schema, of which there will be prototypical and peripheral members. For example, in the case of the English phoneme /t/, the voiceless alveolar stop [t] is considered as the prototype, with other allophones such as taps, aspirated [tʰ], and the glottal stop [ʔ] all peripheral members (Mompean : ). In phonological instantiations of Exemplar Theory, a phonemic representation such as /t/ arises due to clustering of clouds of exemplars stored in episodic memory; these exemplars are phoneticallyrich, encoding extra-linguistic indexical or contextual information alongside linguistic or structural information (Pierrehumbert ). These types of model are wellequipped (by design) to handle encoding of the type of ‘meaningful gradient variation’ which Ladd () raises as an issue for phonological theory. The most comprehensive theory of grammar in the cognitive paradigm is Ronald Langacker’s Cognitive Grammar (CG) (Langacker , , a, see also Taylor, this volume). In cognitive linguistics, it remains the grammar which has made the most (although still rather limited) attempts to incorporate phonology into its architecture. CG claims that grammar is symbolic in nature, with a ‘symbol’ defined as the pairing between a semantic unit and a phonological unit (Langacker a: ). The phonological unit has been neglected in CG, especially by Langacker, and so Lakoff’s claim that ‘cognitive phonology is part of CG’ (: ) is not yet reflected in existing CG research. In Langackerian CG, the phonological unit includes some schematic phonological information, which is defined and constrained by articulatory movements (e.g., voiced stop) or phonotactic patterns (e.g., CVC), which both emerge through usage. Taylor’s () treatment of word-class phenomena within CG is a detailed attempt to incorporate phonology by identifying four correlations between nouns/verbs and phonological features, all gleaned from the psycholinguistic literature as outlined in section .: ()
a) Word length in syllables (nouns generally longer than verbs) b) Stress location (disyllabic verbs likely to feature iambic stress; disyllabic nouns likely to feature trochaic stress)
c) Stressed vowels (stressed syllables in verbs tend to have front vowels; stressed syllables in nouns tend to have non-front vowels) d) Final obstruent voicing (verbs more likely to have final obstruent voicing, rather than unvoiced) (see Taylor : –) Taylor proposes that combining these four generalizations allows us to build a phonological prototype for and , based on ‘tendencies not absolute properties’ (: ), which can be incorporated into CG architecture as ‘sub-schemas’. Sub-schemas indicate preferred word length, stress patterns and final obstruent voicing, and slot in between the maximally schematic and the fully specified phonology of individual verbs. Figure . shows a schema network, which includes the sub-schemas gleaned from phonology (in the lower boxes), such as those shown in b–d above:
‘…’ […]
[VERB]
‘…’
‘…’
‘…’
[σ 'σ]
[…'Vfront…]
[…Cobs, voiced]
. A partial schema network for a verb (adapted from Taylor : )
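Taylor's idea of a phonological prototype can be illustrated with a toy scoring function. The cue names, thresholds, and simple additive scoring below are my own invented stand-ins for the sub-schemas, intended only to show that the cues are gradient tendencies rather than defining properties.

```python
# Minimal sketch (invented scoring, not Taylor's formalization): treating the
# four tendencies in (a)-(d) as a gradient noun/verb prototype that a nonce
# word form can match to a greater or lesser degree.

def verb_likeness(word):
    """word: dict with 'syllables', 'stress', 'front_stressed_vowel', 'voiced_final_obstruent'."""
    score = 0
    score += 1 if word["syllables"] <= 2 else 0            # (a) verbs tend to be shorter than nouns
    score += 1 if word["stress"] == "iambic" else 0        # (b) disyllabic verbs favour iambic stress
    score += 1 if word["front_stressed_vowel"] else 0      # (c) verbs favour front stressed vowels
    score += 1 if word["voiced_final_obstruent"] else 0    # (d) verbs favour voiced final obstruents
    return score

def noun_likeness(word):
    return 4 - verb_likeness(word)   # mirror-image preferences, purely for illustration

nonce = {"syllables": 2, "stress": "iambic",
         "front_stressed_vowel": True, "voiced_final_obstruent": True}
print(verb_likeness(nonce), noun_likeness(nonce))   # 4 0: a maximally verb-like form
```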
. Conclusion
.................................................................................................................................. In this chapter, we have argued for the importance of phonology as a component of any grammar. We have provided a summary of the evidence which demonstrates the role of phonological constraints in word-formation and in the surface realization of morphosyntactic structures, at the interfaces with morphology and syntax, respectively. We have gone further also in highlighting the claim that phonology has a role in lexical categorization, and explored some of the ways in which this information has been modelled into grammatical frameworks. Some grammarians may feel this latter move is a step too far, since incorporating phonology fully into a grammar entails dealing with stochastic generalizations and continuums, rather than tightly defined rules and categories. Many of the grammatical frameworks we have discussed in this chapter were developed for the most part through the analysis of English, and a useful avenue of future research will be the application of the different frameworks to a wider variety of languages. Nevertheless, the focus on English has also greatly benefitted the
development of linguistic theory (see Carr and Honeybone , for an overview of the specific contribution of work on English to phonological theory). Advances in laboratory phonology, speech technology, and acoustic phonetics are having increasing influence on phonological theory and can be expected to further illuminate the issues at the grammar-phonology interface we have highlighted here.
......................................................................................................................
......................................................................................................................
. Introduction
..................................................................................................................................
The study of linguistic meaning is normally divided into three fields: lexical semantics, compositional semantics,1 and pragmatics. Lexical semantics studies the meanings of words and relations between these meanings, as well as how these meanings are realized grammatically (i.e., morphosyntactically). Compositional semantics studies how the meanings of larger grammatical units—phrases, sentences, discourses—are built out of the meanings of their parts. Pragmatics studies how speakers enrich meanings contextually and the relation between context and meaning. Each of these aspects of meaning is related to grammatical aspects of linguistic structure (morphology, syntax, prosody), although pragmatics generally displays weaker strictly grammatical effects than the other two aspects of meaning, for which the relationship with grammar is intrinsic. In this paper, I understand grammar to be a productive system that characterizes the well-formed (potentially) meaningful units of a language. I do not take a crucial stance on what the minimal meaningful units are or what the precise relationship is between morphology and syntax, but I will tacitly assume a form of strong lexicalism (Lapointe ), such that the minimal units of syntax are words. The terms 'grammar' and 'meaning' are themselves open to a variety of interpretations, but a thoroughgoing consideration is not possible in this paper. I assume here that 'grammar' refers to syntax and that 'meaning' refers, on the one hand, to a truth-conditional conception of compositional and lexical semantics and, on the other hand, to how meanings produced by the linguistic system (syntax and semantics) are enriched contextually with knowledge from outside the linguistic system by speakers and hearers (pragmatics). The distinction between these two different aspects of meaning should be obvious in what follows.
1 Compositional semantics is often merely called 'semantics', but I use the term 'compositional' to contrast this aspect of semantics with lexical semantics.
These decisions reflect my own background and competence, but also the currently predominant viewpoint in theoretical linguistics (see, e.g., Partee for a lucid introductory presentation of this view of grammar and meaning from first principles). As such, this seems like a natural starting point for students and researchers wishing to find an overview and an entry point into the literature. Nevertheless, this means that various other perspectives are not represented here. For example, I do not review morphological contributions to lexical and compositional semantics (see, e.g., Hewson , Bolinger , Leech ). Nor do I review theories of lexical semantics and its interaction with compositional semantics which emphasize the cognitive/conceptual underpinnings of meaning, most prominently Cognitive Grammar (Langacker , , , a, Taylor, this volume). I similarly focus on a particular view of the relationship between lexicon and grammar that has received wide currency in various linguistic theories, but set aside other views (see, e.g., Duffley , Jaszczolt ).
. Lexical semantics
.................................................................................................................................. In his recent overview, Geeraerts () lists quite a few kinds of theories of lexical semantics, but I do not seek to endorse any particular one of them here.2 In section .., I review two key truth-conditional relations between lexical items. In section .., I briefly review some central aspects of the relationship between lexical meaning and grammar.3
.. Lexical relations In this section, I review two fundamental lexical relations: synonymy and antonymy.
... Synonymy A standard definition of synonymy is that two lexical items are synonymous if and only if substitution of one for the other in any sentence necessarily preserves truth 2 Cruse () is a classic general overview of lexical semantics. Geeraerts () is an ecumenical review of major theories and methodologies in lexical semantics. Wechsler () is an authoritative recent overview of the relationship between lexical semantics and syntax. There are chapters on lexical semantics and argument structure in various handbooks; see footnote below for some suggestions for handbooks on semantics. Two key computational resources for lexical semantics are WordNet (http:// wordnet.princeton.edu; Miller , Fellbaum ) and FrameNet (http://framenet.icsi.berkeley.edu; Fillmore , Ruppenhofer et al. ). WordNet is a lexical database, for English, that organizes lexical items into synsets (sets of synonyms) and represents lexical relations between synsets. FrameNet is a lexical database, for English, of argument structure frames. Versions of both WordNet and FrameNet have also been developed for various other languages. 3 Another rich and interesting aspect of lexical semantics concerns the challenge of indeterminacy in lexical meanings, such as problems of ambiguity and polysemy (Cruse , Pustejovsky ), vagueness (Williamson , Kennedy , ), and whether lexical meanings should be represented as prototypes (Rosch , Rosch and Mervis ). These aspects of lexical meanings do have grammatical effects, but they are often subtle and therefore difficult to address within the space constraints of this paper. For a recent overview and further discussion, as well as key references, see Wechsler (: Chapter ).
(Cruse : ). In other words, for any sentence in which the first lexical item occurs, substitution with the second lexical item necessarily results in a sentence with the same truth conditions. This is a truth-conditional definition of synonymy, which Cruse () calls propositional synonymy, since it depends on truth-conditional equivalence between propositions modulo substitution of corresponding lexical items. Propositional synonymy can also standardly be understood as mutual entailment: two lexical items α and β are synonymous if and only if every sentence φ which contains α entails and is mutually entailed by the sentence φ′ with β substituted for α (where φ′ is identical to φ except for the substitution of β for α). For example, violin and fiddle are synonyms, according to Cruse (: ), because 'these are incapable of yielding sentences with different truth values'. Unfortunately, this isn't quite true, for a number of reasons. First, the substitution cannot occur in any of the standard contexts where substitution breaks down (so-called intensional or opaque contexts; Frege , Carnap , Quine , , Montague ), such as clauses embedded under propositional attitude verbs. For example, it is possible for the sentence Kim believes Sandy owns a violin to be true while the sentence Kim believes Sandy owns a fiddle is false, if Kim doesn't realize that fiddles are violins. Second, substitution can fail in collocations. For example, Kim plays first fiddle in the orchestra sounds decidedly odd, because the position is called first violin. Third, otherwise synonymous words may have different connotations (Geeraerts : ) or may differ in sentiment or expressivity (Potts b). For example, prostitute and whore (on one usage) are arguably synonyms, but the latter expresses more negativity than the former. It may therefore be preferable to adopt a slightly different definition of synonymy that builds on notions of model-theoretic interpretation (see section .. below) but eschews substitution in sentences. For example, violin and fiddle are synonymous in our world because the set of violins is the same as the set of fiddles (nothing is in one set but not the other). This doesn't capture the complexities of collocations and connotation, but it provides a minimal notion of synonymy which can then be accommodated in theories of those phenomena.
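The purely extensional notion of synonymy just sketched is easy to state in model-theoretic terms: two items are synonymous in a model if they denote the same set. A minimal sketch, with an invented toy model standing in for 'our world':

```python
# Minimal sketch of extensional synonymy: two items are synonymous in a model
# if and only if they denote the same set of individuals. The model is a toy
# stand-in, not a claim about the real world.

model = {
    "violin": {"v1", "v2", "v3"},
    "fiddle": {"v1", "v2", "v3"},   # same individuals: synonymous in this model
    "cello":  {"c1", "c2"},
}

def synonymous(word1: str, word2: str, m: dict) -> bool:
    return m[word1] == m[word2]

print(synonymous("violin", "fiddle", model))  # True
print(synonymous("violin", "cello", model))   # False
```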
... Antonymy Speakers often have strong and consistent intuitions about oppositeness and even quite young children grasp lexical opposition. In ordinary language, the term antonym is often used very generally for all lexical opposition. But opposition in lexical semantics is normally considered a more general concept than antonymy; three entire chapters of Cruse () are dedicated to ‘opposites’ but only a fraction of this is specifically about antonymy. One possible definition of antonymy could treat it as the opposite of synonymy (building on the ordinary intuition that antonym and synonym are antonyms) and we could similarly define the relation in terms of entailment, based on the notion of contradiction, where two propositions are contradictory if and only if whenever the first is true the second is false and vice versa. This very strict definition of antonymy could then be given as follows: Two lexical items α and β are antonymous if and only if every sentence φ which contains α contradicts the sentence φ′ with β substituted for α,
where φ′ is identical to φ except for the substitution of β for α. But this is far too strict. Only opposites like dead and alive or true and false would thereby count as antonyms, but surely we want many other contrary but not strictly contradictory words to count as antonyms, such as good/bad, long/short, fast/slow and hot/cold. Cruse (: –) therefore reserves the term complementaries for strictly contradictory pairs and follows Lyons (: –) in defining antonymy in terms of the following properties (Cruse : ):
1. Antonyms are gradable. For example, something can be very long or not very long.
2. The members of an antonymic pair denote degrees of a scalar property, e.g., in the examples above: goodness,4 length, speed, temperature.
3. Intensification of the members of an antonymic pair moves them further apart on the scale. For example, very hot and very cold are further apart on the scale of temperature than the unmodified adjectives.
4. The members of an antonymic pair do not bisect their domain. For example, it is not a contradiction to say Kim is neither tall nor short.
Property 4 goes hand in glove with the following additional property (also see footnote above):
5. There is no definite transition point on the scale such that the opposing member then strictly applies. In other words, antonyms are vague. For example, there is no specific point on the scale of human height that we would say separates tall people from short people (except by arbitrary decree).
Thus, gradable, scalar, vague adjectives are paradigmatic antonyms. These semantic properties interact with other aspects of grammar, such as the structure of comparatives and superlatives and adverbial modification of adjectives. There has been considerable work on this since the pioneering modern work of Kennedy (, ), which itself builds on prior work by Klein (, , ) and others. As an illustration of the interaction with adverbial modification, consider the following typology of scales and accompanying examples from Kennedy and McNally (: –); the judgements are theirs: ()
A typology of scale structure

    Scale structure      Examples
    (Totally) open       tall/short, deep/shallow, eager/uneager
    Lower closed         bent/straight, loud/quiet, famous/unknown
    Upper closed         certain/uncertain, pure/impure, safe/dangerous
    (Totally) closed     full/empty, open/closed, visible/invisible

4 This illustrates that the property in question does not have to be easily definable independently or even be strictly measurable.
() Open scale pattern Her brother is completely ??tall/??short. ()
Lower closed scale pattern The pipe is perfectly ??bent/✓straight.
()
Upper closed scale pattern This product is per cent ✓pure/??impure.
()
Closed scale pattern The room was completely ✓full/✓empty.
In all of these, the middle of the scale is indeterminate, though, and context is important in reducing or eliminating the indeterminacy. Thus, scalar adjectives also interact with pragmatics; for further discussion, see Barker (), Kennedy (, ), and Kennedy and McNally ().
.. Argument structure: the lexicon–grammar interface There are both intra-language and inter-language (typological) generalizations about how lexical meanings correspond to elements in the grammar. For example, if a predicate has two arguments, such that one of them is the entity that is the actor in the action that the verb denotes and the other is the entity acted upon (cf. the macroroles of actor and undergoer in Role and Reference Grammar; Foley and Van Valin , Van Valin ), then in an active sentence we see the following correspondence between these arguments of the predicate and syntactic arguments:5 ()
    predicate ⟨ actor    undergoer ⟩
                  |           |
               subject      object
A theory of argument structure thus minimally requires three parts:
1. Some representation of the semantic roles of the arguments of a predicate (actor and undergoer above);
2. Some representation of syntactic grammatical functions (subject and object above);
3. Some representation of the mapping between elements of 1 and elements of 2 (the lines above).
5 This representation bears a passing resemblance to representations in the Mapping Theory of Lexical-Functional Grammar (for discussion and further references, see Bresnan et al. : Chapter ), but it is intended entirely pre-theoretically.
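A minimal sketch of these three ingredients, with deliberately pre-theoretical names (no particular framework's machinery is intended):

```python
# Minimal sketch of argument structure: (1) semantic roles listed per predicate,
# (2) grammatical functions, and (3) a mapping between them. The lexicon and
# the default active-voice mapping are invented for illustration.

LEXICON = {
    "devour": {"roles": ("actor", "undergoer")},
    "smile":  {"roles": ("actor",)},
}

DEFAULT_MAPPING = {"actor": "subject", "undergoer": "object"}   # active voice

def link(verb: str) -> dict:
    """Map each semantic role of the verb to a grammatical function."""
    return {role: DEFAULT_MAPPING[role] for role in LEXICON[verb]["roles"]}

print(link("devour"))  # {'actor': 'subject', 'undergoer': 'object'}
print(link("smile"))   # {'actor': 'subject'}
```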
Different theories make different choices about these parts. For example, in many theories the elements of are thematic roles (Gruber , Fillmore , Jackendoff ) like agent and patient. However, Dowty () has argued that thematic roles are actually cluster concepts (proto-roles), not unitary elements. Turning to the elements of , theories make different choices here, too. For example, Lexical-Functional Grammar (LFG) directly represents grammatical functions like and in the syntactic level of functional structure (Bresnan et al. ), Head-Driven Phrase Structure Grammar (HPSG) represents grammatical functions positionally in one or more lists that represent syntactic subcategorization, and so-called Mainstream Generative Grammar (Culicover and Jackendoff ), i.e. Principles & Parameters Theory and its direct ancestors and descendants (e.g., Chomsky , , ), represents grammatical functions positionally in a syntactic tree. Lastly, the mapping itself can be represented in different ways. For example, in LFG the mapping is a correspondence between separate elements of a predicate–argument structure and a functional structure, whereas in HPSG the mapping is represented as structuresharing of elements of an ()-() list and elements of () lists (where this was a single list in earlier versions of the theory and two separate lists in later versions). There are far too many interesting aspects of the lexicon–grammar interface to adequately consider here; see Schönefeld in this volume and Wechsler () for recent authoritative overviews. However, a central question about the relationship between grammar and meaning that arises in this context is the following: ()
Does lexical meaning determine grammatical structure or does grammatical structure determine (apparent) lexical meaning?
In other words, is it theoretically correct ) to predict structure from lexically stored meanings or ) to construct the meanings of potentially radically underspecified lexical elements (often called roots in recent versions of the relevant kind of theory) from the structure in which they are instantiated? On the former sort of view, the lexicon is a rich generative component in its own right (among many others, Bresnan , Dowty , Kaplan and Bresnan , Pollard and Sag , Jackendoff , Pustejovsky ); these are often called lexicalist theories. On the latter sort of view, the lexicon, to the extent there is one, consists of syntactically and semantically underspecified elements, which are inserted into highly articulated syntactic structures (e.g., Hale and Keyser , , Borer , a,b, , Ramchand , Lohndal ) or constructions (syntactic templates; e.g., Fillmore et al. , Kay and Fillmore , Goldberg , a, Goldberg and Jackendoff , Jackendoff , ); these are often called phrasal or constructional theories and are, to some extent, updates of certain key ideas of Generative Semantics (Lakoff , McCawley a,b). See Spencer (this volume) for an overview of morphological approaches. See Rauh () for a recent discussion of, and further references to, the relationship between meaning and syntactic subcategorization and see Müller and Wechsler (, ) and
Bruening () (as well as the accompanying replies and author’s replies) for further discussion of and further references to lexicalist versus constructional/phrasal theories; also see Hilpert (this volume) and Lohndal and Haegeman (this volume). A hybrid sort of theory is offered by Asudeh et al. (), using the formal notion of templates from recent work in Lexical-Functional Grammar (Dalrymple et al. , Asudeh ).6 A template is just a named bundle of grammatical information and wherever the template is invoked, the grammatical information is substituted for the template. For example, we could define an English agreement template sg which states that the subject’s person is rd and the subject’s number is singular. The lexical entries for, e.g., smiles and smile would both invoke this template, but respectively as @sg and ¬ @sg, which captures the fact that the former is a third-person singular form, whereas the latter is incompatible with precisely this cell of the relevant agreement paradigm. Templates thus do not increase the power of the grammar but allow cross-lexical generalizations to be captured, somewhat similarly to types in HPSG (although see Asudeh et al. on important distinctions between templates and types). Crucially, as argued by Asudeh et al. () and Asudeh and Toivonen (), the same templatic information could in principle be associated with a word or a phrase, thus allowing some of the insights of constructional approaches to be captured, but without admitting constructions per se into the theory. For example, building on work by Dalrymple (), Asudeh and Toivonen () show how a single template can be defined for English to capture the modificational semantics of relativization and that this template can be associated both with relative pronouns in non-reduced relative clauses (e.g., who in the person who she met) and with the reduced relative clause structure itself (e.g., the person she met), thus allowing the commonalities between the two realizations to be captured without positing either a null relative pronoun or a specific construction for reduced relatives. The templatic approach has been explicitly incorporated into work on argument structure in Asudeh and Giorgolo (), Asudeh et al. (), and Findlay (, ).
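The idea of a template as a named, reusable bundle of grammatical information can be sketched very simply. The encoding below (dictionaries and predicates) is my own illustration, not LFG's formalism; only the @3SG versus ¬@3SG contrast from the smiles/smile example is modelled.

```python
# Minimal sketch of templates as named bundles of grammatical information that
# lexical entries can invoke (@3SG) or negate (not @3SG).

TEMPLATES = {
    "3SG": {("SUBJ", "PERSON"): 3, ("SUBJ", "NUMBER"): "sg"},
}

def invoke(name):
    """@NAME: all the information bundled in the template must hold."""
    return lambda fstructure: all(fstructure.get(k) == v for k, v in TEMPLATES[name].items())

def negate(constraint):
    """not @NAME: the bundled information must not all hold."""
    return lambda fstructure: not constraint(fstructure)

smiles = invoke("3SG")            # 'smiles' requires a third-person singular subject
smile = negate(invoke("3SG"))     # 'smile' is incompatible with exactly that cell

f = {("SUBJ", "PERSON"): 3, ("SUBJ", "NUMBER"): "sg"}
print(smiles(f), smile(f))        # True False
```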
. Compositional semantics
.................................................................................................................................. The semantic counterpart to syntactic productivity/creativity is compositionality,7 which is a general semantic procedure for assigning interpretations to novel linguistic 6
Müller and Wechsler () lump the Asudeh et al. () approach with phrasal/constructional approaches, but this is a mistake (Asudeh and Toivonen ). Unfortunately, we have perhaps not been clear enough in explaining our hybrid theory, because some misunderstandings persist in Müller (b) and Müller (), although the latter provides an expanded and improved discussion of our work and provides much food for thought. 7 There are many textbooks and overviews of compositional semantics. For excellent introductory overviews, see Kearns () and Elbourne (). Heim and Kratzer () is very influential, but more advanced. Dowty et al. () is the classic advanced introduction to Montague Semantics. Gamut (a) is also an influential advanced introduction to Montague Semantics and its formal prerequisites,
OUP CORRECTED PROOF – FINAL, 15/10/2019, SPi
expressions that have no upper bound on their length.8 That is, given a syntactic mechanism for modelling the natural language property of productivity, there must be a semantic correlate for assigning interpretations. This is typically enshrined in the principle of compositionality, often attributed to Frege (perhaps incorrectly; see Janssen ): ()
Principle of Compositionality The meaning of a linguistic expression is determined by the meaning of its parts and their arrangement (syntax).
The principle, stated as such, does not place substantive constraints on exactly how it should be realized. In practice, there are two principal ways in which it is deployed. Either syntax feeds semantics, i.e. semantics interprets the output of syntax in a systematic fashion, or syntax and semantics are built up together. The former kind of theory is often called ‘Logical Form Semantics’ and is typically the default kind of theory assumed by semanticists who are influenced by Chomsky’s theory of generative grammar, narrowly construed (e.g., Government and Binding Theory, Principles and Parameters Theory, the Minimalist Program; Chomsky , b, ). I will call this kind of theory Interpretive Composition. The other kind of theory is more typically associated with semanticists who have sought to further develop the ideas in Montague’s seminal works on formal semantics (Montague ) and/or have been influenced by the logical work of Ajdukiewicz (), Bar-Hillel (), and Lambek (). There are various modern manifestations of this thread, but a particularly prominent one is typically referred to broadly as Categorial Grammar, of which there are two main varieties: Combinatory Categorial Grammar (e.g., Ades and Steedman , Steedman , , , Jacobson ) and Type-Logical Grammar (Morrill , , Carpenter ).9 However, we also see reflexes of the key idea that syntax and semantics are built in tandem in other
as well as to subsequent dynamic semantic approaches, which are discussed briefly in section ... For readers particularly interested in dynamic semantics, Kamp and Reyle () remains a key text. Partee et al. () is a comprehensive introduction to the formal tools of compositional semantics. Gamut (b) is also an excellent introduction that focuses especially on the logical tools used in formal semantics. Allwood et al. () is a considerably gentler but less comprehensive introduction to logic. Other more advanced introductions to formal semantics include Carpenter (), Chierchia and McConnell-Ginet (), and Jacobson (). Handbooks on semantics include Maienborn et al. (a,b, ), Hinzen et al. (), Lappin and Fox (), and Aloni and Dekker (). Other handbooks contain considerable relevant material, including van Benthem and ter Meulen (, ), Ramchand and Reiss (), and Heine and Narrog (). Portner and Partee () and Davis and Gillon () collect many classic papers in formal semantics and philosophy of language. 8
A common theoretical approach to capturing productivity is recursion, but really it is the empirical phenomenon of grammatical productivity that is strictly relevant here, not how it is formally captured. See Thomas (this volume) and Taylor (this volume) for some further discussion of recursion. 9 Wood () is a good introductory overview of Categorial Grammar.
approaches to the syntax–semantics interface, such as that of Head-Driven Phrase Structure Grammar (Pollard and Sag , Copestake et al. ) and Glue Semantics (Dalrymple ). This kind of theory is often referred to as ‘rule-to-rule composition’, in reference to the ‘rule-to-rule hypothesis’ (Bach ), because in Montague’s original treatment each syntactic rule of structure building was paired with a semantic rule of interpretation. However, this term is now inaccurate, due to generalizations in the field that removed these pairings of individual rules, particularly the type-driven translation of Klein and Sag (). I will therefore call this kind of theory Parallel Composition. Jacobson () is a (somewhat higher level) introduction to semantics that explicitly compares interpretive and parallel composition. Interpretive composition and parallel composition are introduced briefly in sections .. and .., but first I will introduce some very basic model theory (section ..) and type theory (section ..), which are the two principal formal foundations of compositional semantics. I will use some of the concepts introduced in sections .. and .. in what follows those sections.
.. Model theory Much of modern compositional semantics is model-theoretic. To say that a semantic theory is model-theoretic is to say that statements in the object language (the language being interpreted) are verified against mathematical models that are meant to be representations of the entities and relations that expressions in the object language denote. In other words, if we are interested in truth or falsity of sentences of some natural language, we interpret these sentences, and the expressions which make them up, relative to a model. In his seminal work on model-theoretic semantics, Montague experimented with two kinds of model-theoretic interpretation: direct interpretation (Montague ) and indirect interpretation (Montague ), which can be schematized as follows: () ()
    Direct interpretation:    Language  ⇒interpretation  Model
    Indirect interpretation:  Language  ⇒translation  Meaning Representation  ⇒interpretation  Model
There have occasionally been attempts in the literature to argue for one mode of interpretation over the other, typically direct interpretation over indirect interpretation,10 essentially based on the observation that the translation mapping itself may be poorly
10 For example, this seems to be part of the argument in Jacobson ().
understood. But the truth of the matter is that indirect interpretation is prevalent, because it is normally more expedient to reason about the intervening meaning representation, which is typically formalized in some appropriate logical language11 than it is to reason about a direct relation between language and a model. Also, for those semanticists who are primarily concerned with the foundational issue of compositionality, the intermediate formal meaning representation typically offers a better testing ground for theories. Partee et al. (: ff.) offer a detailed and formally precise overview of model theory for formal semantics, but here I will present only the very basics. Minimally, a model M for a language is an ordered pair of some domain, D, and a valuation function, V, that assigns semantic values to the basic expressions of the language:12 ()
M = ⟨D, V⟩
The fundamental domain D is the domain of individuals/entities. The values of complex expressions all boil down to functions from the domain of entities to the domain of truth values, {1, 0}—where 1 is True and 0 is False—and functions on such functions, as well as functions from entities to entities and truth values to truth values.13 Once we move into model-theoretic interpretation, the notion of truth of a sentence S, or more generally interpretation of an expression α, is always with respect to a model M. This can be represented in various ways. The standard notation from Montague Semantics is: () For any expression α, ⟦α⟧^M is the semantic value of α in model M; i.e. ⟦α⟧^M = V(α). The brackets, ⟦ ⟧, represent the interpretation function on expressions in the object language (the language being interpreted). V is the valuation function for basic expressions from () above. So for basic expressions, the interpretation function simply returns whatever V does. The kind of model-theoretic interpretation discussed so far is very impoverished, because it has no way of interpreting variables. Therefore, it is typical to see interpretation relative to an assignment function: ⟦α⟧^(M,g) is the denotation of an expression α relative to a model M and an assignment function g that assigns values to variables. The model's valuation function, V, provides the
Montague himself used intensional logic (Fitting ). Different presentations might name the parts of the model differently, but the basic idea is the same. For example, Dowty et al. (: ) use A for what I have called D and F for what I have called V. 13 Note that this is a minimal, extensional model theory. In order to capture intensional phenomena, such as modality and propositional attitudes, a common move is to add a set of possible worlds, W, to the model. 11 12
interpretation of constants, whereas the assignment function, g, provides the interpretation of variables.
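A minimal sketch of interpretation relative to a model and an assignment function, with an invented toy domain and valuation:

```python
# Minimal sketch of a model M = <D, V> and interpretation relative to M and an
# assignment g: constants are looked up in V, variables in g. The toy domain
# and valuation are invented for illustration.

D = {"kim", "robin", "sandy"}
V = {
    "Kim": "kim",
    "smokes": {"kim", "robin"},   # the set of smokers in this model
}

def interpret(expression: str, g: dict):
    """[[expression]] with respect to M = (D, V) and assignment g."""
    if expression in V:        # constant
        return V[expression]
    if expression in g:        # variable
        return g[expression]
    raise ValueError(f"{expression} is not interpretable in this model")

g = {"x": "sandy"}
print(interpret("Kim", g))     # 'kim'
print(interpret("x", g))       # 'sandy'
```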
.. Type theory Type theory essentially offers a way to capture semantic well-formedness according to some syntactic calculus over a vocabulary of types. In other words, types are semantic categories and the type system is meant to guarantee that well-defined combinations of types are always semantically well-defined. Since the foundational work of Montague (), type theory has often been adopted for natural language semantics due to the fact that it is more expressive than other natural options, such as first-order predicate logic. In standard (first-order) predicate logic, we can only talk about individuals (e.g., Robin), properties of individuals (e.g., sleeps), and relations between individuals (e.g., loves). But natural language abounds in expressions that are more complex than this. For example to say that Robin is similar to Sandy minimally requires that they have some property in common, but first-order predicate logic does not allow us to quantify over properties (‘some property’), only over individuals/entities (Gamut a: –). The minimal type theory that is typically favoured for simple extensional natural language semantics is as follows (Gamut a: ; Partee et al. : –):14,15 ()
Syntax of the type theory
The set of types T is the smallest set such that:
a. e is a type in T
b. t is a type in T
c. If a and b are types in T, then ⟨a, b⟩ is a type in T

() Semantics of the type theory
Let D be a given domain of individuals/entities. Then
a. Domain_e = D
b. Domain_t = {1, 0}, the set of truth values
c. Domain_⟨a,b⟩ = the set of functions from Domain_a to Domain_b

We thus see that the types and their interpretations are related to elements of the models introduced in section ... In order to pump intuitions about what types are, it may be useful to think of complex types as 'input–output devices'. A complex type ⟨a, b⟩ basically says, 'If you give me a thing of type a, I'll give you a thing of
14
The type e is the type for entities and the type t is the type for truth values. In order to handle intensions, it is common to include a type w for possible worlds (see footnote ). In Montague’s foundational treatment, the type s is used to create intensional types; e.g., hs, he, tii is the intensional type for properties. However, the type s is only introduced by Montague in a complex type definition similar to clause (c); s is not a base type in the system of Montague (). 15
type b’.16 This intuition forms the basis of thinking about types both as functions and as the basis for computation. (A computer is also an input–output device!).
.. Interpretive composition Interpretive composition theories assume that semantics interprets the output of syntax. A central exemplar of such a theory is the one developed by Heim and Kratzer () in their influential textbook. In the remainder of this section, I present an example of interpretive semantic composition, adapted from Heim and Kratzer (: –). First we need an inventory of possible denotations. This will be as defined in sections .. and ..: the domain of individuals/entities, the domain of truth values, and functions on these domains. We also need denotations for all our lexical items, the basic elements of composition:17 ()
Lexicon
a. ⟦Kim⟧ = Kim
   etc. for other proper names
b. ⟦smokes⟧ = The function f of type ⟨e, t⟩ from entities to truth values such that for all x in Domain_e, f(x) = 1 if and only if x smokes.
   etc. for other intransitive verbs
Lastly, we need rules for interpreting syntactic structures, which we assume are generated by an independent syntactic component: ()
Rules for syntactic structures
S1. If α has the form [X β γ] (a branching node X with daughters β and γ), then ⟦α⟧ = ⟦γ⟧(⟦β⟧) or ⟦α⟧ = ⟦β⟧(⟦γ⟧), as appropriate.18
S2. If α has the form [X β] (a non-branching node X with sole daughter β), then ⟦α⟧ = ⟦β⟧.
The compositional semantics works as follows for a simple example (based on Heim and Kratzer : –):
16 Consequently, complex types are often alternatively represented as (a → b), with outer parentheses typically left out. This makes the computational (input/output) nature of types more evident, but is somewhat more cumbersome for larger types.
17 Heim and Kratzer () adopt the convention that expressions of the object language are written in bold.
18 In other words, as determined by the semantic types of β and γ; this is the notion of type-driven translation referred to above (Klein and Sag ).
()  Kim smokes.

    Syntactic structure: [S [NP [N Kim]] [VP [V smokes]]]

    1. ⟦[S [NP [N Kim]] [VP [V smokes]]]⟧
    2. = ⟦[S [NP [N Kim]] [VP smokes]]⟧                          by S2
    3. = ⟦[VP smokes]⟧(⟦[NP [N Kim]]⟧) = ⟦smokes⟧(⟦Kim⟧)          by S1 and by three further applications of S2
    4.–5. = [the function f of type ⟨e, t⟩ from entities to truth values
             such that for all x in Domain_e, f(x) = 1 if and only if
             x smokes](Kim)                                       from Lexicon
    6. = 1 iff Kim smokes                                         by Functional Application
Functional application is the fundamental rule by which the function that is the denotation of smokes is applied to (i.e., combines with) its argument, the individual that is the denotation of Kim, which gives the right truth conditions for the sentence Kim smokes.
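The two rules and the lexicon above can be recoded as a small recursive interpreter over labelled bracketings. This is my own toy restatement, not Heim and Kratzer's system: non-branching nodes pass their daughter's denotation up (S2), and branching nodes apply one daughter's denotation to the other's (S1), with callability standing in for type-driven selection of function and argument.

```python
# Minimal sketch of type-driven interpretation over a labelled bracketing.
# Trees are tuples (label, child, ...); leaves are words looked up in a toy lexicon.

LEXICON = {
    "Kim": "kim",
    "smokes": lambda x: x in {"kim", "robin"},   # a function of type <e, t>
}

def interpret(node):
    if isinstance(node, str):                    # a word: look it up
        return LEXICON[node]
    children = [interpret(child) for child in node[1:]]
    if len(children) == 1:                       # S2: non-branching node
        return children[0]
    left, right = children                       # S1: apply function to argument
    return right(left) if callable(right) else left(right)

tree = ("S", ("NP", ("N", "Kim")), ("VP", ("V", "smokes")))
print(interpret(tree))   # True iff Kim smokes in this toy model
```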
In order to know the actual truth value of Kim smokes, we have to know the denotation of smokes in our model. This is a function that can be represented in any number of standard ways (Partee et al. ), such as: ()
a. ⟦smokes⟧ = {〈Kim, 1〉, 〈Robin, 1〉, 〈Sandy, 0〉}
b. ⟦smokes⟧ = the mapping drawn graphically as arrows from entities to truth values:
              Kim → 1, Robin → 1, Sandy → 0
c. ⟦smokes⟧ = the same mapping laid out as a tabular function:
              Kim    → 1
              Robin  → 1
              Sandy  → 0
In (a), the function is represented as a set of pairs. In (b), it is represented in a standard graphical format. In (c), we have a tabular function, which is a good compromise between the set notation and the graphical notation.
.. Parallel composition Categorial Grammar and other parallel composition theories assume that ) each lexical item pairs its syntactic category with a semantic interpretation and ) each syntactic operation is paired with a semantic operation. Therefore, once the syntax is built, the semantics has also been determined, rather than semantics interpreting the output of syntax. The presentation of Categorial Grammar is often quite different from the presentation of Logical Form Semantics in Heim and Kratzer (). However, I will try to maintain as much similarity as possible between the presentation in this section and the presentation in the previous section, in order to maximize comparability. We therefore continue to adopt the same notational choices as in () We also continue to assume the same denotations for Kim and smokes. The lexicon is presented slightly differently in Categorial Grammar, because the whole point of the parallel composition enterprise is to associate syntactic categories directly with their semantic types and interpretations. We follow the presentation of Steedman (), except for the notational choices mentioned above: ()
Lexicon
a. Kim := NP : ⟦Kim⟧
b. smokes := S\NP : ⟦smokes⟧
The slash means that smokes is a complex category: the backward slash indicates that smokes takes an argument to its left, an NP, and the result is an S. On the semantic side, the function [[smokes]] applies to the argument [[Kim]].
This application is captured in the following combinatory rule:

()  Functional application
    a. X/Y : f    Y : a    ⇒    X : f(a)
    b. Y : a    X\Y : f    ⇒    X : f(a)

The rules in () have two elements on the left-hand side of the arrow and one on the right-hand side. The forward slash rule in (a) states that a complex category X/Y can be combined with a category Y to its right (hence forward slash) to yield the result X. Similarly, the backward slash rule in (b) states that a complex category X\Y can be combined with a category Y to its left (hence backward slash) to yield the result X. In each case, the complex category on the left-hand side of the rule is twinned with an interpretation f, a function, and the result of combining the complex category with the category Y is to apply this function to the interpretation of Y; i.e. f is applied to a, yielding f(a). For example, in English, the combination of a transitive verb with its object would involve the forward slash rule (a) (because transitive verbs precede their objects), whereas the combination of a subject with its predicate would involve the backward slash rule (b) (because predicates follow their subjects). We can then construct the following derivation: ()
Kim                       smokes
NP : ⟦Kim⟧                S\NP : ⟦smokes⟧
S : ⟦smokes⟧(⟦Kim⟧)
The result is interpreted as before: It is true if and only if the function that is the denotation of smokes maps the individual that is the denotation of Kim to 1. In other words, if Kim indeed smokes, ⟦smokes⟧(⟦Kim⟧) yields 1 (True); otherwise it yields 0 (False). As mentioned above, the key idea in parallel composition is that syntax and semantics are built up together, rather than semantics interpreting the output of syntax. This does not mean that different choices cannot be made about the syntax, though. We have seen a simple example from Categorial Grammar in (), but another kind of parallel composition theory—Glue Semantics (Dalrymple et al. , Dalrymple , , Asudeh , a,b, , , Lev , Kokkonidis , Andrews , Bary and Haug , Lowe a,b, Findlay )—makes a different assumption about the connection between syntax and semantics.19 The crucial difference is that Categorial Grammar assumes that syntax/grammar is directly responsible for semantic composition, which is a traditional strong assumption derived from Montague Grammar (Jacobson , , Barker and Jacobson ). In other words, Categorial Grammar assumes that the syntax of the compositional semantic derivation (proof) is 19
See Asudeh () for a discussion of Glue Semantics versus Logical Form Semantics and Categorial Semantics in light of questions of compositionality.
isomorphic to the syntax that generates all and only the well-formed strings of the language (i.e., the syntax of the object language, in the standard sense intended by circumlocutions like ‘the syntax of English’). In contrast, Glue Semantics assumes that terms in the compositional semantics are instantiated by the syntax, but that semantic composition is handled by a separate ‘glue logic’ (a fragment of linear logic; Girard ) which ‘sticks’ meanings together (i.e., composes them). This allows semantic composition in Glue Semantics to be commutative (i.e., order-insensitive), whereas composition in Categorial Grammar is normally non-commutative (since syntax is by definition order-sensitive). Commutative composition arguably more accurately and purely captures the nature of functional application, the key operation of semantic composition (for further discussion, see Asudeh : Chapter ).
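Before leaving parallel composition, it may help to see the backward functional application rule in executable form. The following is a minimal Python sketch over a lexicon like the one above; the string encoding of categories, the helper name, and the toy model of smokers are illustrative assumptions, not Steedman's own notation.

# Toy encoding: a lexical item is a (category, meaning) pair; "S\NP" is a complex
# category that looks for an NP argument to its left and yields an S.
SMOKES = {"Kim": 1, "Robin": 1, "Sandy": 0}      # assumed model, as above

lexicon = {
    "Kim":    ("NP",    "Kim"),
    "smokes": ("S\\NP", lambda x: SMOKES[x]),
}

def backward_application(left, right):
    """Y : a   X\\Y : f   =>   X : f(a)   (the forward slash rule is symmetric)."""
    (arg_cat, a), (fn_cat, f) = left, right
    result_cat, wanted_cat = fn_cat.split("\\", 1)
    assert wanted_cat == arg_cat, "category mismatch"
    return (result_cat, f(a))

cat, meaning = backward_application(lexicon["Kim"], lexicon["smokes"])
print(cat, meaning)   # S 1 -- the category S paired with the truth value of "Kim smokes"

Running the sketch combines NP : ⟦Kim⟧ with S\NP : ⟦smokes⟧ and returns the result category S together with the sentence's truth value in the toy model.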
. Pragmatics
.................................................................................................................................. Pragmatics20 concerns the information conveyed by utterances—that is, particular tokens or uses of a given abstract sentence type (Korta and Perry ). For example, consider the following sentence: ()
Kim is allergic to peanuts.
This sentence potentially conveys very different information before and after Kim has been served and partaken of pad thai (a dish that typically contains peanuts). If the sentence is uttered beforehand, it may convey the information that Kim should be served something else, or function as a warning to the host, or constitute a conspiratorial suggestion that this may be an effective way to dispatch Kim. If the sentence is uttered afterwards, it may function as an admonishment to the host or convey the information that an ambulance should be called immediately. Yet, the compositional meaning of the sentence surely has not changed and remains constant whether the sentence is uttered before or after Kim eats the pad thai. This highlights another way of looking at pragmatics: It is the study of the effects of context on meaning. The twin pillars of ‘classical pragmatics’ (Chapman , Korta and Perry ) in this sense are Austin’s theory of speech acts (Austin ) and Grice’s theory of implicature (Grice ).21 These will be discussed in the following two sections. Certain more recent developments in pragmatic theory will be discussed in section . below. 20
The classic overview is Levinson (). A somewhat gentler but correspondingly less comprehensive introduction is Chapman (). Handbooks on pragmatics include Horn and Ward (), Huang (), Allan and Jaszczolt (), and Östman and Verschueren (). Davis () is an excellent collection of original papers in pragmatics. Kasher () is a mammoth multi-volume collection of papers and offers a very catholic selection. The classical works of Austin () and Grice () are deep, but not inaccessible. 21 The dates on these publications belie the much earlier genesis (in the s and s) and circulation (most notably in prominent lectures in the s and s) of these ideas.
.. Speech acts The title of Austin’s book, How to do things with words, gets across very well the central idea behind speech act theory. In communicating propositions, we also perform actions; what these actions are depends in part on lexical choices in the utterance, but also on the speaker’s intentions and on social and cultural factors. We have already seen some examples of speech acts in the introduction to section ., when I wrote that () could function as a warning or admonishment, given contextual knowledge. Austin originally made a binary distinction between performatives and constatives. The latter convey propositions (i.e., a declarative statement that is capable of being true or false), whereas the former constitute actions, such as promising, warning, declaring, commanding, apologizing, baptising, sentencing, etc. The last two examples illustrate that certain performatives depend on cultural or legal authority under an institutional aegis. Only an ordained priest can perform a baptism, and even then only in certain circumstances. And only a judge may pass a sentence and only in a court of law. Thus, Austinian pragmatics places a special emphasis on language in its sociocultural setting. Whereas a constative can be false, Austin held that a failed performative was instead infelicitous and proposed felicity conditions for the successful execution of performatives. He divided infelicities into two major classes (Austin : Lecture II): abuses (‘act professed but hollow’) and misfires (‘act purported but void’). For example, if a speaker utters I promise to come to the party without any intention of upholding their promise, then this would count as an abuse for Austin. An example of a misfire would be a wager made but not taken up: If I say I bet you £ that the United Kingdom does not leave the European Union but you make no acknowledgement of accepting the wager, then a wager has not successfully been made. In the latter part of How to do things with words, Austin abandons the performative/constative dichotomy and instead develops a more general theory of speech acts in terms of three kinds of communicative forces that all utterances have. This is essentially motivated by the fact that an apparently constative sentence can have a performative function, as we have already seen with example (). On the revised view, an utterance consists of a locutionary act, an illocutionary act, and a perlocutionary act. The locutionary act is the act of uttering a sentence as a form–meaning mapping with some structure mediating the mapping; in other words, the locutionary act consists of a tripartite activity of uttering a sentence of some language, with its proper phonetics–phonology (the part of the locutionary act termed the phonetic act), syntax over some lexicon (the phatic act), and semantics (the rhetic act). The illocutionary act is the act of doing something through the utterance of the sentence with the force of a statement, warning, promise, etc; thus, the illocutionary act is the closest correspondent of the prior performative category and is at the heart of the part of speech act theory that concerns a speaker’s communicative intentions. This act is complemented by the perlocutionary act, which is the uptake of the communicative intentions by the addressee(s).
For example, consider an utterance of the following example: ()
I have a gun.
The locutionary act is the uttering of the sentence in (). The illocutionary force depends on the speaker’s intentions: is it a threat, a warning, or a mere statement of fact? Suppose the speaker means the latter. Depending on the circumstances, the perlocutionary force may nevertheless be a threat, if the addressee feels threatened by the utterance. Speech act theory was subsequently further developed by Searle, who proposed a set of rules or schemas for capturing speech acts (Searle ) and a resulting putatively exhaustive taxonomy of speech acts (Searle ), which revised Austin’s own proposed taxonomy. Searle’s rules are however not predictive or generative, merely classificatory, and the taxonomy is based only on English and its attendant sociocultural norms, rather than a broader typological survey of languages. Both of these factors could be considered shortcomings from the perspective of linguistic theory. Nevertheless, Austin and Searle’s work on speech acts excited some interest among grammarians, particularly among the Generative Semanticists (Newmeyer , Lakoff , Harris ), who argued that speech acts were actually syntactically encoded (see, e.g., Lakoff b, Ross , Sadock ), in what Levinson (: –) has called the performative hypothesis. At the heart of the performative hypothesis is the claim that every sentence has as part of its underlying structure a highest performative clause, yielding a frame like the following (Levinson : ):
() I hereby V_performative (to) you (that) . . .
In an explicit performative the performative verb and possibly other aspects of the structure are overtly realized, but even an utterance of, e.g., () is an implicit performative, more or less equivalent to: ()
I hereby tell you that I have a gun.
Proponents of this syntactic account of performatives adduced evidence for their position from anaphora, adverbial modification, and other phenomena that, at least at the time, were seen to favour an underlying structure like (). In sum, the performative hypothesis suggests a very tight relationship between grammar and the aspect of broader, pragmatic meaning captured by speech acts. However, it was pointed out early on that this does not seem to give a particularly satisfying truth-conditional semantics for declarative sentences,22 since it would seem to predict that () and () are equivalent, contrary to fact (Lewis ): 22 In fact, Austin (: Lecture V) himself seemed to express wariness about any attempt to codify performatives in a grammatical form and consequently does not formulate anything like the performative hypothesis explicitly.
()
The world is flat.
()
I state to you that the world is flat.
My merely uttering () renders it true, but not so with (). Lakoff () responded to these criticisms, but Levinson (: –) argues that the overall resulting conception of meaning remains unsatisfactory. The performative hypothesis has largely been abandoned as a claim about syntax. However, a related, subtler claim has been offered more recently, starting in work by Krifka (, ), that there are (typically nonexplicit) speech act operators in the compositional semantics.
.. Implicature In the introduction to section ., I suggested that an utterance of () (Kim is allergic to peanuts) could convey the information that Kim should be served something other than pad thai or could convey the information that an ambulance should be called. However, it is clear that () does not have as its semantic interpretation either of these propositions. Rather, these propositions are (potentially) implied by the speaker, under certain conditions, and normally inferred by the hearer, given the same conditions. These extra propositions are what Grice (, ) called implicatures.23 Grice divided implicatures into two main classes: conventional implicature and conversational implicature. The latter was further subdivided into particularized and generalized conversational implicature. Given the focus in this section on the role of pragmatics in the relationship between grammar and meaning, it is the conventional implicatures and generalized conversational implicatures that are of key interest. But in order to see why, it is useful to first consider conversational implicatures more broadly, which also serves as a broad introduction to Grice’s theory. Grice was interested in rational action and construed conversation as a particular, linguistic instance of such activity. He therefore proposed the following ‘rough general principle’ (Grice : ) that participants in conversation observe and expect each other to observe. ()
The Cooperative Principle (CP)
Make your conversational contribution such as is required, at the stage at which it occurs, by the accepted purpose or direction of the talk exchange in which you are engaged.
It is important to realize that this is offered as a characterization of rational conversation, not as a prescription for rational conversation (Chapman : ). 23 Implicatures are distinct from entailments, because the latter necessarily follow from the truth of a proposition, whereas implicatures are merely implied through the use of a proposition by a speaker. For example, implicatures can be cancelled felicitously, but cancelling an entailment leads to contradiction.
Grice proposed that there are four conversational maxims. These maxims, together with the CP, yield implicatures.
1. Quality24
   (a) Do not say what you believe to be false.
   (b) Do not say that for which you lack adequate evidence.
2. Quantity
   (a) Make your contribution as informative as is required (for the current purposes of the exchange).
   (b) Do not make your contribution more informative than is required.
3. Relation25
   Be relevant.
4. Manner
   Be perspicuous.
   (a) Avoid obscurity of expression.
   (b) Avoid ambiguity.
   (c) Be brief.
   (d) Be orderly.
Again, despite the imperative mood, these are offered as characteristics, not as prescriptive instructions. Let us look at one of Grice’s examples:
() A: I am out of petrol.
   B: There is a garage round the corner.
If B is following the CP and the maxims, then they must assume that their reply is relevant. Their reply is only relevant if they believe the garage is possibly open and willing to sell A petrol. So, given the first submaxim of Quality, B’s utterance in () implicates or conveys the proposition φ that the garage round the corner is (possibly) open and (possibly) sells petrol. However, this is not what B literally said and if φ turns out to be false, A could not accuse B of having lied (only if in fact there is no garage round the corner). This implicature is an example of a particularized conversational implicature (PCI), because φ arises only in the context of what A said. If A had said, for example, I have a puncture, then B’s utterance would not implicate φ, but rather φ′: The garage round the corner is (possibly) open and (possibly) fixes punctures. This property of contextual
24 Grice (: ) also proposes a Quality ‘supermaxim’: ‘Try to make your contribution one that is true’.
25 This maxim is also often called Relevance. Although it is the inspiration for Relevance Theory (Sperber and Wilson , Wilson and Sperber ), the notion of relevance in the two theories should not be conflated.
variance means that particularized conversational implicatures cannot easily be claimed to be grammatically encoded. That would be tantamount to claiming that B’s utterance is generally ambiguous between having the implicatures φ and φ′, but it is clearly not: in one context it implicates φ and not φ′ and in another context it implicates φ′ and not φ. This is further underscored by the property of nondetachability of conversational implicatures: any other way of expressing the information expressed by B in () implicates the same proposition, given the same context (Grice , Levinson ). For example, if B’s reply to A’s utterance in () had been See that corner? If you turn there, you’ll find a garage, B would still implicate φ. But if this is so, particularized conversational implicature cannot have much to do with grammatical encoding, as not even the most fanciful syntactician would attempt to derive See that corner? If you turn there, you’ll find a garage from There is a garage round the corner, or vice versa. In contrast to PCIs, generalized conversational implicatures (GCIs) arise unless there is specific contextual knowledge to the contrary (Levinson : ) and can thus be thought of as default, defeasible propositions (Levinson : ff.). Moreover, GCIs are often associated with specific lexical items and specific logical meanings. For example, the following example leads naturally to a reading of temporal order between the conjuncts, such that the event described by the first preceded that described by the second: ()
The cowboy leapt onto his horse and rode off into the sunset.
This seems to follow by the fourth Manner submaxim (‘Be orderly’) (see Grice : Chapter and Levinson : Chapter ). The GCI is so strong that if we reverse the conjuncts, we nevertheless infer that the cowboy rode off into the sunset (on something other than his horse) and then leapt onto his horse: ()
The cowboy rode off into the sunset and leapt onto his horse.
But the GCI is also partly predictable from grammatically encoded aspects of the semantics, in particular that we have a conjunction of two phrases headed by nonstative verbs. If we conjoin two stative phrases, the two orderings are not distinct— there is no GCI of temporal order: () Ottawa is the capital of Canada and Canberra is the capital of Australia. ()
Canberra is the capital of Australia and Ottawa is the capital of Canada.
Thus GCIs seem to be at least in part grammatically encoded; e.g., they are predictable from the lexical meaning of the conjunction and and the grammatical features of the conjuncts. It is conventional implicature (CI) that most strongly demonstrates a connection between grammar and implicature, because grammatical encoding is a definitional
aspect of CIs.26 Let us first consider two of Grice’s examples (without endorsing their implicatures): ()
She is poor but honest.
()
He is English. He is therefore brave.
An utterance of () conveys the information φ that the speaker believes there to be a contrast between poverty and honesty (or, that the latter is unexpected given the former). An utterance of () conveys the information ψ that the speaker believes that bravery is a consequence of Englishness. Grice (, ) contended that these are implicatures that arise from the conventional meanings of but and therefore—i.e., what in modern terms we might call some aspect of their lexical meaning. However, he contended that these aspects of the meaning of these words, although always associated with the words, are not part of what is said, i.e. part of their semantics. Rather they are implicatures, but conventional ones that are always associated with these words rather than conversational ones subject to contextual variation. That this distinction is at least broadly correct can be demonstrated by considering the interaction of () with negation: () It’s not the case that she is poor but honest. A speaker cannot utter () to deny that there is a contrast between poverty and honesty, but rather only to deny the truth of one or both conjuncts. Thus, the contribution to compositional semantics of but seems to be just that of and. However, it also conventionally implicates contrast. Conventional implicature and generalized conversational implicature can thus be seen as part of Grice’s larger programme of preserving the underlying logicality of the natural language correlates of the logical connectives. The GCI account of temporal or causal readings of and sentences mitigates the need, at least from these sorts of examples, for positing an ambiguity in the meaning of and such that there is a nontruth-functional lexical meaning of and in addition to the logical, truth-functional meaning (Carston ). Similarly, the conventional implicature treatment of but apparently accounts for its lack of interaction with the logical operation of negation. This feature of conventional implicature would also be explained if CIs were presuppositions (see section ..) and there have been prominent attempts to unify the two phenomena (e.g., Karttunen and Peters ). However, Potts (, a,b) has argued persuasively, in a pioneering formalization of CIs, that presupposition and conventional implicature are distinct. The essential reasoning is not hard to grasp: while presuppositions are taken for granted by the speaker—i.e., are a kind of 26 I follow Potts () and much subsequent literature in using ‘CI(s)’ as the abbreviation for ‘conventional implicature(s)’, despite the unfortunate fact that it would equally well abbreviate ‘conversational implicature(s)’.
backgrounded information—conventional implicatures are not taken for granted but are rather some kind of assertion, albeit a side assertion—i.e., CIs are not a kind of backgrounded information (Potts ). I return to presupposition in section ... Potts himself avoids some of the controversy surrounding Grice’s own examples of CIs by focusing instead on cases like appositives () and expressive adjectives () ()
a. Donald Trump, who is a narcissist, was elected President in .
b. Donald Trump, a narcissist, was elected President in .
()
That damn knife is very sharp.
In uttering either of () the speaker makes the side comment that Trump is a narcissist. In uttering () the speaker expresses an attitude about the knife or its sharpness. For Potts, () and () are thus paradigmatic examples of CIs. Crucially, once again we see that conventional implicatures are at least partially grammatically encoded. Appositives require a specific kind of intonational off-setting (represented orthographically with a comma, hence Potts’s term ‘comma intonation’). And expressive adjectives are positionally restricted to attributive, nominal-internal position and cannot function predicatively: ()
*That very sharp knife is damn.
This contrast is demonstrated nicely in French, which normally has postnominal attributive adjectives (a), but requires expressive adjectives to be prenominal (b): ()
a. Cette maison maudite est effrayante.
   This  house  cursed  is  scary
   ‘This cursed (lit.) house is scary.’
b. Cette maudite maison est effrayante.
   This  damn    house  is  scary
   ‘This damn house is scary.’
The adjective maudite when used postnominally only has its literal meaning of ‘cursed’, but when used prenominally has an expressive meaning similar to English expressive damn. In sum, conventional implicature is another example of pragmatic meaning that is grammatically encoded.
. B
.................................................................................................................................. The relationship between grammar and meaning is essential and undeniable, but this does not mean that the boundary between grammatically encoded/conventionalized
meaning and non-encoded/unconventionalized meaning is sharp (Szabó ). In this section, I review three phenomena whose boundaries have been contested. In doing so, I hope to illustrate the sorts of issues that render the relationship between grammar and meaning a rich and fruitful topic of continuing investigation. The section concludes with a brief consideration of dynamic semantics, which yields a somewhat different perspective on some of these issues.
.. Presupposition In making an utterance, a speaker invariably takes some information for granted, either because the information has already been established between the speaker and their addressee(s) (it is part of the common ground) or the speaker believes that it can be accommodated by their addressee(s) (added to the common ground). This assumed information can be characterized as a set of propositions which are the presuppositions of the speaker’s utterance. Presupposition is intimately tied to the context of the speaker’s utterance, but it is also conventionalized in that there are a large number of presupposition triggers that invariably give rise to predictable presuppositions as part of their conventional meaning. Presupposition thus seems to straddle the boundary between pragmatics (pragmatic presuppositions) and semantics (semantic presuppositions).27 Consider an utterance of the following example without any prior relevant context: ()
Kim stopped breastfeeding Saint.
Among the presuppositions of this utterance are:
1. The addressee understands English.
2. The addressee knows who the relevant Kim and Saint are.
3. Kim used to breastfeed Saint.
Presuppositions 1 and 2 are pragmatic presuppositions:28 they are general assumptions the speaker is making about the conversation.29 In contrast, presupposition 3 is a semantic presupposition specifically triggered by the aspectual verb stop. It is part of
27 Presupposition is a huge topic in its own right. Overview articles on the topic include Beaver (), and Beaver and Geurts (), Atlas (), Simons (), and Potts ().
28 A reviewer points out that presupposition 2 could equally be considered a semantic presupposition triggered by the proper names, but knowing which Kim and Saint are relevant strikes me as more than is strictly warranted by the normal existential presupposition triggered by proper names, which in this case would merely be that Kim and Saint exist. Therefore, as presented, it seems fair to characterize 2 as a pragmatic presupposition. But this discussion, if nothing else, is further evidence that presupposition is a challenging boundary phenomenon between semantics and pragmatics!
29 Indeed, it could be argued that Grice’s Cooperative Principle is itself a pragmatic presupposition, to the extent that a speaker assumes that the addressee would normally assume that the speaker is following the CP.
the conventional meaning of stop. If we change the example by replacing stopped with started, presupposition 3 goes away (and is replaced by a presupposition that Kim did not use to breastfeed Saint, since start is also an aspectual verb and presupposition trigger). However, the pragmatic presuppositions 1 and 2 persist. How do we know that presupposition 3 is indeed a presupposition and not just a logical entailment of ()? First, if the presupposition is false—i.e., it is known that Kim did not use to breastfeed Saint—example () is not true, but nor does it seem to be strictly speaking false (Strawson , Heim and Kratzer : Chapter ). Rather, it seems meaningless. In contrast, if an entailment of () is false, () itself is necessarily false. For example, if it’s the case that Kim still breastfeeds Saint, then () is not true, because () entails that Kim does not breastfeed Saint. Second, the presupposition survives certain embeddings that logical entailments do not, e.g., negation: ()
It’s not the case that Kim stopped breastfeeding Saint.
The entailment that Kim does not breastfeed Saint no longer holds, but the presupposition that Kim used to breastfeed Saint still does. This is called presupposition projection or inheritance and raises the following issue in the study of the interaction of presupposition with grammar: Is it possible to fully characterize and explain the environments from which presuppositions do and do not project? The so-called projection problem was recognized early in the history of generative grammar (Morgan , Langendoen and Savin , Karttunen ), but has continued to be of central interest (for an influential recent investigation and further references, see Tonhauser et al. , ).
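One common way to make the contrast between presupposition and entailment precise is a partial (three-valued) semantics in which presupposition failure yields no truth value and negation passes that failure up. The sketch below illustrates this idea for stop; the function names and the use of Python's None for 'undefined' are assumptions of the illustration, not a claim about any particular formal treatment.

# "Kim stopped breastfeeding Saint" relative to a toy state of affairs:
# defined only if Kim used to breastfeed Saint (presupposition 3); if defined,
# true iff she no longer does (the asserted content).
def stopped(used_to, still_does):
    if not used_to:
        return None            # presupposition failure: neither true nor false
    return not still_does      # asserted content

def neg(v):
    """Negation flips a truth value but projects (preserves) presupposition failure."""
    return None if v is None else (not v)

print(stopped(used_to=True,  still_does=False))        # True
print(stopped(used_to=True,  still_does=True))         # False: the entailment fails
print(stopped(used_to=False, still_does=False))        # None: presupposition failure
print(neg(stopped(used_to=False, still_does=False)))   # None: the failure projects through negation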
.. Free enrichment and implicit variables Another debate on whether aspects of meaning are grammatically encoded has centered around examples like the following (see, e.g., Stanley and Szabó , Carston ): ()
Every bottle is empty.
() It’s raining. If a speaker utters () it is probable that they wish to communicate, and succeed in communicating, that every bottle from some contextually restricted class of bottles is empty, not that every bottle in the universe is empty. In other words, in order to know whether () is true in the intended sense, the hearer needs to know which bottles are contextually relevant. Stanley and Szabó () call this the problem of quantifier domain restriction (von Fintel ), and point out that it is an instance of the broader problem of context dependence, which is how context together with grammar determines the proposition conveyed by an utterance.
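As a toy illustration of quantifier domain restriction, the quantifier in the bottle example can be evaluated either against every bottle in the model or against a contextually supplied restriction set; the model and the variable name C below are assumptions of the sketch, not part of Stanley and Szabó's proposal.

# Toy universe: four bottles, two of which are empty; C is the contextually
# relevant set of bottles (e.g. the ones at this party).
bottles = {"b1", "b2", "b3", "b4"}
empty   = {"b1", "b2"}
C       = {"b1", "b2"}

def every(restrictor, scope):
    """'Every restrictor is in scope': a subset test."""
    return restrictor <= scope

print(every(bottles, empty))        # False: the unrestricted reading
print(every(bottles & C, empty))    # True: the domain restricted to the relevant bottles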
Example () demonstrates another instance of the problem. In order to know whether () is true (in the relevant sense), the hearer seems to minimally need to know when and where it was uttered. But things are not quite this straightforward (Carston : –). Suppose Matt and Elinor live within an hour’s drive of each other and have decided to play tennis near Elinor’s in the afternoon, but only if it hasn’t rained recently. Right before Matt is to leave, they skype30 to make final arrangements. Elinor is facing away from the window behind her and has not yet realized that it’s started to pour, but this is visible to Matt in the video. It’s not raining where Matt is. Elinor asks, ‘Are we set to play?’ and Matt replies, ‘It’s raining.’ Clearly, Matt, the speaker, does not mean that it’s raining where he is, the location of the utterance—this is neither true nor relevant—but rather that it’s raining where Elinor is. The question, then, is how context and grammar together determine what proposition has been communicated. This problem was already evident in the semantics of indexicals (like I, you, here, now) and deictics (like this and that), as in influential work by Kaplan () and Stalnaker ().31 However, Stanley (: ff.) has argued that the relevant reading of, e.g., () cannot simply be due to a hidden indexical, because the implicit location in examples like the following seems to be bound by the quantificational operator, which Stanley () claims indexicals resist (Stanley : ): ()
Every time John lights a cigarette, it rains.
The relevant reading that Stanley () argues cannot arise from an indexical is one in which ‘the location of the rain co-varies with the location of the cigarette-lighting’ (Hall : ), i.e. Every time John lights a cigarette, it rains at the location and time where he lights the cigarette.32 Stanley and Szabó () and Stanley () argue that resolving the problem of context requires a semantic solution such that there are implicit variables (and binders) in the semantic representation33 rather than eschewing grammatical encoding in favour of a purely pragmatic resolution. A purely pragmatic account proposes that, for example, the propositional interpretation of () requires what are sometimes called
30 Other video chat tools are available.
31 The publication dates are misleading: Kaplan’s work was more or less contemporaneous with Stalnaker’s.
32 It could be pointed out that the following sentence with an overt location indexical seems to allow a reading either identical to or very similar to the bound reading:
(i) Every time John lights a cigarette, it rains there.
This seems to cast some doubt on Stanley’s argument, but mainly highlights the need for independent linguistic evidence/motivation for the presence of a phonologically null operator that binds a location variable.
33 Stanley and Szabó (: ff.) also consider, but reject, syntactic solutions, in which there is more than just a variable in the linguistic structure, but rather elided lexical material, as in e.g. Every bottle [that I just bought] is empty.
unarticulated constituents34 (Perry ), meaning material that is not present in the grammatical encoding of the sentence (its syntax or semantics) but is rather supplied purely pragmatically. It would be as if every bottle in sentences like () is always understood as every relevant bottle (but without the word relevant being present in the structure). It is unsurprising, then, that early focus on the pragmatic solution to the problem came from Relevance Theory (Sperber and Wilson ) and its notion of the propositional form of the utterance, which is now more commonly discussed under the rubrics of free enrichment and explicature (Carston , , ), although the term free enrichment is sometimes confusingly also used as a general descriptive term for solutions to the problem of context dependence, whether semantic or pragmatic (e.g., Stanley : , fn.). Free enrichment/explicature is quite similar to the notion of impliciture (Bach ), but see Bach () on distinctions between the two. The basic idea of free enrichment/ explicature is that sentences need to be fleshed out pragmatically (with contextually provided unarticulated constituents) before they can have propositional force. It thus constitutes one sort of pragmatic solution to the problem of context dependence. Stanley () develops the binding argument discussed with respect to () above in his argument against pragmatic free enrichment. The basic idea is that if a variable can be operator-bound it is necessarily present in the semantics, i.e. grammatically encoded. This conclusion has been denied by proponents of free enrichment—see, e.g., Bach (), Carston (, ), Recanati (),35 and Hall (), among others. Hall (: ) even contends that, ‘A serious weakness in the binding argument, though, is that it relies on the stipulation that bound variables cannot be supplied pragmatically, but must be present in logical form.’ However, it must be said that from the perspective of formal semantics it is not clear what it would mean for a bound variable to ‘be supplied pragmatically’, since operator-binding is fundamentally considered a part of compositional semantics (Lewis , Montague ), indeed as Stanley contends. That is, if we grant Hall her premise, we are fundamentally obliterating the semantics–pragmatics boundary, so nobody interested in maintaining this traditional boundary, as Grice () did, would grant this. Hall’s remark highlights how this entire debate has been undermined by a lack of independent linguistic evidence and formal rigour. The proponents of free enrichment 34
This is a truly unfortunate choice of terminology, since it could easily be mistaken for a constituent that is present in the syntax but unpronounced (i.e. what is often called syntactic ellipsis, routinely posited in certain theories), which is the very thing that the term seeks to avoid; see footnote . 35 Martí () offers a counter-argument to Recanati (), claiming that a different formal treatment of variables renders free enrichment unnecessary. However, this argument seems to be a kind of parsimony argument that rests on understanding free enrichment as an additional process. It is unlikely that proponents of free enrichment would grant that premise, since they would contend that their overall approach is more parsimonious in positing less syntactic and semantic machinery. Even Stanley and Szabó, who seek to reject the pragmatic solution to the problem of quantifier domain restriction, i.e. free enrichment, grant that ‘The obvious advantage [of the pragmatic approach] is that one can propose a syntax and semantics for sentences containing quantifiers that is extremely simple and does not involve covert expressions or covert semantic values’ (Stanley and Szabó ).
have little to gain from proposing an adequate formalization, since they are after all arguing that compositional semantics is not responsible for the phenomenon. The proponents of the semantic account, by contrast, do owe a full formalization but offer only a sketch (Stanley , Stanley and Szabó ), which undermines the proposal and leaves room for formal doubts (Neale ). These doubts are allayed by Elbourne (), who is concerned with incomplete definite descriptions, like the table (Strawson ), which raise the same issues as quantifier domain restriction (Neale ), since the uniqueness of the incomplete definite description is satisfied contextually (i.e., it need not be the case that there is a unique table in the universe for the incomplete description to be used successfully). Elbourne () provides an explicit formal syntax and semantics and an analysis of a fragment of English. Moreover, he provides independent linguistic evidence from ellipsis that he argues shows that the implicit variable is a variable over situations, rather than the kind of variable proposed by Stanley and Szabó ().
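For readers who want to see what is at stake in the binding argument, here is a toy sketch in which each cigarette-lighting event carries its own location and time; the data and variable names are illustrative assumptions and do not reproduce either side's formal proposal.

# Each lighting event comes with a (location, time) pair; rains_at lists the
# (location, time) pairs at which it rains.  All data are toy assumptions.
lightings = [("Ottawa", 1), ("Oxford", 2)]
rains_at  = {("Ottawa", 1), ("Oxford", 2)}
HERE      = "London"     # a fixed utterance location, for the purely indexical reading

bound_reading     = all((loc, t) in rains_at for (loc, t) in lightings)
indexical_reading = all((HERE, t) in rains_at for (loc, t) in lightings)

print(bound_reading)      # True: the rain location co-varies with each lighting event
print(indexical_reading)  # False: a fixed 'here' gives the wrong truth conditions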
.. Scalar implicature Another recent boundary conflict has arisen around the phenomenon of scalar implicature, with some claiming that it is grammatically encoded rather than a result of purely pragmatic reasoning. A scalar implicature results when a speaker fails to make an alternative, intuitively stronger assertion, and thereby implicates that the stronger alternative is false. For example, an utterance of () results in the scalar implicature in (): ()
Some students protested.
() Not all students protested.
The speaker could have uttered () with all substituted for some, but chose not to. They thereby implicate (). Scalar implicatures are so-called because they seem to crucially involve a scale, ordered by informativeness/entailment; e.g., ⟨all, most, some⟩ (Horn , Levinson , Hirschberg ). On Gricean reasoning, this is a kind of Quantity implicature (see Potts for a carefully worked-through example) and some see scalar implicatures as paradigmatic cases of generalized conversational implicatures (Levinson ). The alternative, grammatical view of scalar implicature holds that they are in fact part of the semantic meaning of relevant lexical items and are compositionally derived, typically through interaction with a covert exhaustification operator (essentially a covert correlate of only). Consider the following example from Chierchia (: ):36
36 It is useful for what follows to bear in mind the distinction between inclusive and exclusive or. ϕ or ψ, where or is interpreted inclusively, is true if and only if ϕ is true or ψ is true, including the case where they are both true. ϕ or ψ, where or is interpreted exclusively, is true if and only if ϕ is true and ψ is false or else ϕ is false and ψ is true, thus excluding the case where they are both true.
()
If you assign a problem set or an experiment, students will love it. But if you assign a problem set and an experiment, they will hate it.
If the or in the first sentence is interpreted inclusively (i.e., such that ϕ or ψ is true if both ϕ and ψ are true), then the second sentence would result in a contradiction. The fact that it doesn’t is taken by proponents of the grammatical view of scalar implicature to show that there is an embedded, or local, scalar implicature in the first sentence, arising from a covert exhaustification operator, which we mark as O following Chierchia (); this renders the disjunction exclusive:
() If you assign O[a problem set or an experiment], students will love it.
The meaning of (), given the exhaustivity operator, is thus equivalent to that of (), which is not contradictory. ()
If you assign a problem set or an experiment but not both, students will love it. But if you assign a problem set and an experiment, they will hate it.
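A minimal sketch of an exhaustification operator of this kind, with propositions modelled as sets of possible worlds and every strictly stronger alternative negated, is given below; the toy worlds and the simplified definition are assumptions that gloss over refinements (such as innocent exclusion) in the literature.

# Propositions as sets of worlds.  O(p, alts) asserts p and negates every
# alternative that is strictly stronger than p (a proper subset of p's worlds).
ps  = {"ps_only", "both"}       # "you assign a problem set"
exp = {"exp_only", "both"}      # "you assign an experiment"

def O(p, alternatives):
    result = set(p)
    for q in alternatives:
        if q < p:               # q is strictly stronger than p
            result -= q         # ...so O negates it
    return result

print(O(ps | exp, [ps & exp]))  # {'ps_only', 'exp_only'}: the exclusive reading of "or"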
Proponents of the grammatical view have argued that these effects are, in essence, too systematic and predictable, given other features of the grammar, to be the result of standard Gricean reasoning (see, e.g., Chierchia , , Chierchia et al. ). But the theory remains controversial and others have argued that the effects can be explained through more standard pragmatic reasoning (see, e.g., Russell , Geurts , Geurts and van Tiel , Franke ). The debate has also resulted in an interesting series of psycholinguistic studies, and commentary on these studies, that seek to empirically bolster either the grammatical explanation (among others, Chemla , Chemla and Spector , Crnič et al. , Singh et al. ) or the standard pragmatic explanation (among others, Geurts and Pouscoulous , Sauerland , van Tiel et al. ).
.. Dynamic semantics Another perspective on these boundary issues comes from dynamic semantics (Kamp , Heim , Groenendijk and Stokhof , Kamp and Reyle ), an umbrella term for semantic theories that take the fundamental role of meaning to be information growth (Nouwen et al. ). In the dynamic approach, the meaning of a proposition is its capacity to update a context, which is in turn a representation of the entities under discussion and the information that has thus far been accumulated about them. This contrasts with the kinds of theories sketched above, which are static: they concern truth relative to some given situation and do not model information growth. However, truth-conditional semantics can be recovered from dynamic semantics, so the two approaches are not fundamentally oppositional, although their understanding of
compositionality is distinct (Nouwen et al. ). Dynamic semantics grew out of the recognition that there are linguistic conditions on the introduction of and subsequent reference to entities in a discourse (Karttunen , ). In particular, indefinite and definite noun phrases behave broadly differently: a key role of indefinites is to introduce entities to the discourse (discourse referents), whereas a key role of definites is to refer to already established discourse referents (Kamp , Heim ). In this respect, definites behave similarly to pronouns.37 This also suggests a new way to think about presuppositions associated with definite descriptions as conditions on the discourse referents that the definites bind to. This in turn suggests that presuppositions may also somehow have to bind to discourse referents and are at least partly anaphoric in nature (van der Sandt ). This means that, from the dynamic perspective, there may be distinct connections between the three aspects of meaning that we have focused on (lexical, compositional, pragmatic) and therefore distinct boundaries and boundary issues. In particular, ) the boundary issues of presupposition and free enrichment discussed above are potentially closely connected in dynamic semantics and ) the implicit variable-binding on one view of the free enrichment issue may in fact be most profitably understood as a kind of dynamic variable binding (Heim , Groenendijk and Stokhof ).38 However, notice that adopting the dynamic perspective does not dispel the issue: if there is some discourse variable that permits dynamic binding, then there is some structural element (implicitly) present, which is still a version of the implicit variable approach and still goes against the free enrichment view. More generally, it is fruitful to think of dynamic semantics as a kind of augmentation of static semantics rather than as a radical alternative (Nouwen et al. ). But this cuts both ways: it is unlikely that dynamic semantics could give solutions that are fundamentally distinct from static semantics if the issue is at heart one of allegedly otiose representations. Nevertheless, if seen 37
Caution should be taken here. Similarity of behaviour does not necessarily entail that pronouns in fact are (hidden) definite descriptions. The latter view, though still controversial and far from universal, is indeed a long-standing and well-defended view in the philosophy of language and linguistics, but is a stronger claim than the mere observation of similarities of behaviour. See Elbourne () for a thorough defence of the stronger claim and further references. See Nouwen (forthcoming) for further discussion and a dynamic semantics defence of the weaker, non-conflationary claim that pronouns can behave like definite descriptions without literally being disguised descriptions.
38 A reviewer suggests that the implicit variable in () above (Every time John lights a cigarette, it rains) is a kind of donkey pronoun; such pronouns have received a lot of attention in both dynamic and nondynamic semantics (Geach , Groenendijk and Stokhof , Heim , Elbourne ):
(i) Every farmer who owns a donkey feeds it.
Here the interaction between the indefinite (a donkey) and the pronoun (it) is such that it has universal force, despite appearances (i.e., the sentence is true if and only if the relevant farmers all feed all their donkeys). It may well be that the mechanism involved on the implicit variable-binding view is whatever mechanism is involved in interpreting donkey pronouns (see Elbourne for a thorough overview and references and Nouwen, forthcoming, for a recent reappraisal). But notice that the essential problem remains how a cigarette binds a locational variable, which it could not otherwise do. This is quite distinct from what is happening in (i), where we have exceptional binding of a pronoun it which could otherwise corefer to or be bound by a donkey.
through the distinct lens of dynamic semantics, there is no doubt that the phenomena and problems discussed above would look different and suggest substantively different analyses and solutions.
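The basic dynamic idea can be sketched very simply if a context is modelled as a set of live possibilities and a sentence's context change potential as intersection; the toy worlds below are assumptions, and the sketch deliberately omits discourse referents, which are central to fuller dynamic systems such as file change semantics and DRT.

# A context is the set of worlds still compatible with what has been established;
# updating with a proposition eliminates the worlds it rules out.
context    = {"w1", "w2", "w3", "w4"}
it_rains   = {"w1", "w2"}
kim_smokes = {"w1", "w3"}

def update(c, p):
    """The context change potential of p: keep only the worlds where p holds."""
    return c & p

c1 = update(context, it_rains)     # {'w1', 'w2'}
c2 = update(c1, kim_smokes)        # {'w1'}: information growth as elimination of possibilities
print(c1, c2)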
. Conclusion
.................................................................................................................................. This paper has surveyed the three key areas of the linguistic study of meaning: lexical semantics, compositional semantics, and pragmatics. All three of these aspects of meaning interact with grammar, the first two uncontroversially so. The third, pragmatics, was classically viewed as primarily post-grammatical, operating on the output of semantics. However, this view has been facing challenges for quite some time and continues to do so. The challenges have been mounted both by those who view pragmatics as a crucial part of identifying the proposition expressed, i.e. those who advocate a greater role for pragmatics as inferential reasoning from world knowledge and context, and by those who would not necessarily accept this notion but view certain problems that have historically received pragmatic solutions as instead requiring grammatical solutions, i.e. those who advocate a lesser role for pragmatics.
......................................................................................................................
GRAMMAR AND DISCOURSE
......................................................................................................................
. Introduction
.................................................................................................................................. In this chapter we review the relationship between grammar and discourse. A discussion of this kind needs to elaborate both the notion of ‘grammar’ and the notion of ‘discourse’ itself. We will assume simply that grammar refers to regularities of linguistic structure (typically morphological and syntactic). Discourse has a range of definitions. Here we will present those that seem most relevant for our discussion. The term ‘discourse’ can be used to refer to regularities of language above the level of the sentence, i.e. regularities and patterns in the construction of spoken or written texts. It can also refer to the general domain of language use and interaction in various contexts and modalities. ‘Discourse’ can also be used to refer to ‘representations’, i.e. how entities and events are represented in language in some particular situation of use, what perspectives, viewpoints, and ideologies are expressed in particular texts in particular situations of use (for more detailed discussion see, for example, Gee (: –)). Just as there are a number of ways of viewing discourse, there are a number of ways of exploring the relationship between discourse and grammar. We can test our grammatical models against actual language used in actual communicative situations, interrogating discourse to see how it exhibits speakers’ knowledge of grammar. We can ask whether our grammatical models should or can include the level beyond the sentence. We explore this in section .. Another way of approaching the relationship is to enquire what role different grammatical choices play in specific situations of language use, i.e. how grammatical choices align with communicative aims and intentions. One aspect of this question relates to how grammatical choices can support a particular representation of reality, or help build a particular perspective, viewpoint, or ideology. We discuss this in section .. Yet another perspective is to explore how language use, that is, the production of texts in a range of situations, can drive changes
in the grammatical system, i.e. we can approach the relationship from the point of view of language change or grammaticalization. We take this perspective in section .. It is important to point out that discussions of grammar and discourse are often linked to particular theoretical conceptualizations of language structure. Cumming et al. (), for example, focus their discussion of grammar and discourse on the so-called discourse-functional approaches to grammar, which view grammar as fluid, constantly changing in response to the communicative needs of the language user, and ultimately emerging from discourse. In a similar vein, Hopper () outlines the basic properties of his emergent grammar approach in contradistinction to what he calls ‘fixed-code’, or formal grammar. Whereas the former traces language design ultimately to its functions, the latter sees it as autonomous of use. Functional approaches give priority to usage data, where fixed-code grammars mostly rely on introspection. Functional approaches take into account and consider important the larger context of use; formal (generative) approaches focus on the sentential level and consider sentence structure an independent domain. Whereas for the formal approaches grammar exists a priori and is deployed in discourse, for emergent or usage-based approaches grammar comes into being in discourse, in interaction. Most of the approaches that concern themselves with discourse are therefore functionalist and usage-based in orientation.1 We refer to a number of them in the following sections.
. G
.................................................................................................................................. Standard grammatical analyses tend to focus on the written ideal of a sentence, taken as comprising a complete main clause or a coordination of main clauses (where a main clause may or may not contain an embedded subordinate clause as a component). The domain of grammar is usually held not to apply beyond the sentence. For instance, Huddleston and Pullum (: ) write: ‘The sentence is the largest unit of syntax . . . the study of the relations between sentences within a larger text or discourse falls outside the domain of grammar. Such relations are different in kind from those that obtain within a sentence’. However, while there are good grounds for distinguishing the domains of grammar and discourse, the boundary between them is not always clear-cut. In this section, we will look at some of the problems encountered in drawing the boundary, and consider what grammarians can learn from looking at the way grammar is deployed in building discourse. First, in section .., we will look at how grammatical resources contribute to making a text ‘coherent’, a central topic in text linguistics, which has focused mainly on written texts. Then, in the remainder of section ., we will turn to interactive spoken discourse, as this poses the greatest challenges for grammatical analysis. 1
See, however, Guéron () for recent work on the interaction between grammar and discourse within a generative grammar framework.
.. Grammar and text coherence While syntactic principles determine what is a structurally well-formed sentence, it is generally agreed that the ‘well-formedness’ of a discourse or text is mainly to do with its ‘coherence’ (e.g., Sanders and Sanders ): the connectedness between parts of text which makes for a unified whole. This involves global constraints such as relevance, in contrast to the local constraints imposed by syntax (Ariel ). The connectedness of text has been an important topic in the field of text linguistics (e.g., de Beaugrande and Dressler , Sanders and Pander Maat ) and within functional approaches which concern themselves with text analysis, notably Halliday’s Systemic Functional Grammar (e.g., Halliday , ; see Mackenzie, this volume). Connections between textual elements may be explicitly marked by linguistic forms, but can also be left implicit, requiring inference to make the connection. Consider (): ()
Sam Jenkins was knocked off his bicycle by a bus. He broke his collarbone.
We are likely to infer from this sequence of sentences that it was the accident described in the first sentence that caused Sam’s collarbone to break, even though the causal relationship is not explicitly expressed (it could have been expressed, for instance, by adding as a result to the second sentence). Because connections between textual elements are often implicit rather than explicitly expressed, it has become widely accepted that coherence is more appropriately viewed as a cognitive phenomenon than as an inherent property of a text: ‘Language users establish coherence by relating the different information units in the text’ (Sanders and Sanders : ). The use of explicit linguistic devices to indicate connectedness is often labelled ‘cohesion’ in distinction from coherence as a cognitive phenomenon. An early, seminal work on this topic is Halliday and Hasan’s Cohesion in English (), which considers how grammatical (as well as lexical) resources are used to contribute to cohesion. The grammatical resources surveyed include anaphora, conjunction, ellipsis, and substitution. Quirk et al. () include in their English grammar a chapter ‘From sentence to text’, which focuses on the contribution to textual cohesion made by a broad range of grammatical devices (also including, for instance, adverbials, tense, and aspect). As an example of the cohesive role played by grammatical resources, consider the use of anaphora in (), a paragraph from a personal letter. The source of the example is ICE-GB, the British component of the International Corpus of English (Nelson et al. ), which comprises one million words of British English from the early s. ()
[Context: the writer has recently moved from the UK to Brussels] Paul was due to come out this weekend but, has had decided not to now. That’s a shame – I had been looking forward to his visit. I daresay he may make the trip sometime in April now. (ICE-GB, WB- #–; the strike-out indicates a deletion by the writer)
Referential continuity contributes to the cohesion of this passage. For example, his in the second (orthographic) sentence and he in the third are anaphors which relate back to the antecedent Paul in the first sentence. That in the second sentence is also interpreted anaphorically as referring to Paul’s decision, described in the first sentence. As noted by Huddleston and Pullum (: , ), many kinds of anaphoric relation can hold both within and across sentences—including various kinds of ellipsis which can also be treated as anaphoric relations. For instance, there is an anaphoric gap in the first sentence of () following not to which relates back to the antecedent come out this weekend (we understand ‘Paul has decided not to come out this weekend now’). This can be compared with a parallel example where the same kind of relationship holds across a sentence boundary, as in ()—or indeed, if we extend our discussion to dialogue, across different speakers, as in (): ()
Paul was due to come out this weekend. However, he has decided not to now.
() A: Isn’t Paul due to come out this weekend?
   B: No, he has decided not to now.
The fact that such relations hold both within and across sentences creates some difficulties in drawing the boundary between grammar and discourse, as pointed out by Ariel (). A further issue is that delimiting the sentence as a syntactic unit is in fact ‘quite problematic’, as noted by Huddleston and Pullum (: ) in their chapter on punctuation. For instance, clausal coordination need not be marked by any formal device. Consider their example, reproduced in ():
Some went to the concert, some stayed at home.
The chosen punctuation makes this a sentence in orthographic terms, but the syntax here does not distinguish between a coordination of clauses and a sequence of separate main clauses. Grammatical descriptions have tended to show a bias towards written language (see Linell , for discussion), and work in text linguistics has tended to focus mainly on written monological texts (Sanders and Sanders ). However, it becomes even more difficult to draw the boundary between grammar and discourse when we consider interactive spoken discourse. This area of study is especially valuable and challenging for grammarians. It is valuable in providing a source of evidence about speakers’ knowledge of grammatical structures, as we can observe how they build these structures in real time and respond to the structures being built by others. It is challenging as a testing ground for our grammatical models. For instance, such data are notoriously difficult to divide into grammatical units such as ‘sentence’ and contain frequent instances of ‘fragmentary’ structures which, although not integrated into sentential units, make complete and coherent contributions to the discourse. The remainder of section . will focus on the value and challenges to the grammarian of looking beyond the sentence in studying interactive spoken discourse.
In section .. we will briefly discuss several different lines of relevant research on spoken discourse. We will then look at initial problems in the delimitation of grammatical units such as ‘sentence’ in spoken data (..) before focusing in more detail on the challenges posed by ‘clause fragments’ (..).
.. Strands of research on grammar and spoken discourse Various lines of research have investigated the grammar of spoken English. Some of this research has been stimulated by the increasing availability over recent decades of computerized English corpora which include spoken data. Such research has often included some comparison of written and spoken genres in terms of the frequency of particular grammatical features (see Dorgeloh and Wanner, this volume, for more on this line of work). For example, the Longman Grammar of Spoken and Written English by Biber et al. () draws on corpus data to compare the four genres of fiction writing, news writing, academic writing, and conversation. Leech (a), one of the authors of that grammar, notes that ‘conversation . . . stands out clearly as being frequently very different, in terms of grammatical probabilities, from the written varieties. Some grammatical features (such as dysfluency phenomena, in so far as they are grammatical) are almost entirely restricted to the spoken variety, but in general the same descriptive framework applies to all four registers’ (p. ). Nonetheless the differences are not restricted to those of frequency of particular grammatical features; there are also differences in the way grammar is deployed in spoken interaction, more relevant to the topic of this chapter. Some of these are described in the chapter on ‘the grammar of conversation’ in the Longman Grammar; see also Miller and Weinert (), Miller (), and Quaglio and Biber (). A growing field of research is that of interactional linguistics (IL), surveyed in a recent textbook by Couper-Kuhlen and Selting (). IL gives serious consideration to spoken interaction as the natural ‘home environment’ in which knowledge of grammar is deployed. IL has developed from earlier functional approaches to linguistics, especially ‘West Coast functionalism’ (e.g., Givón b, Du Bois , Chafe ), and has also drawn heavily on conversation analysis (CA), a strand of sociological research that focuses on conversation as a form of social action (see, e.g., Sidnell and Stivers ). CA has made important contributions to the understanding of various aspects of conversational organization, including turn-taking (see Sacks et al. () for a seminal early discussion) and the advancement of social action through sequencing patterns (including ‘adjacency pairs’ such as request–acquiescence but also more complex multi-turn sequences). Whereas speech act theory (see König, this volume) has tended to focus on particular types of social action carried out by single utterances, CA examines how actions unfold in conversational sequences and covers responsive as well as initiating actions (see Levinson , for discussion). While IL focuses more on linguistic form than CA, it has continued CA’s strongly empirical methodology: attending carefully to audio or video recordings, making
transcriptions which include considerable prosodic detail, and often engaging in quite detailed analysis of unfolding interactions. This kind of methodology tends to be extremely ‘bottom-up’: generalizations emerge slowly, as researchers gradually build up collections of instances of similar phenomena encountered in the data. This contrasts with much corpus linguistic research, where recurrent formal patterns are often readily identified by computerized searches across a large database of spoken extracts, but where less attention is often paid to the extended contexts within which these patterns occur. Also unlike much corpus linguistic work, IL and CA work has tended not to be quantitative. However, some quantitative work in this line has started to be carried out. An example is the large cross-linguistic study of question–response sequences in conversation reported in Stivers et al. (), which includes a study of American English by Stivers (). A coding system was systematically applied to the data to allow quantitative analysis of the formal and pragmatic properties of questions and responses. Another, very different line of work that also stresses the real-time unfolding of dialogue is that oriented towards language processing. Research in this field often involves the computational modelling of dialogue for the practical purpose of developing dialogue systems (see, e.g., Ginzburg and Fernández ). This has presented huge challenges in dealing, for instance, with ellipsis and the incorporation of contextual information. Researchers in this field have also contributed theories of human language processing (e.g., Ginzburg ). A notable theory is dynamic syntax (e.g., Kempson et al. , Cann et al. , Kempson ). This is a formal model which aims to capture the real-time progression of language processing, with hearers incrementally building a semantic representation from the linguistic input and contextual information. In this model, knowledge of language amounts to knowing how to parse spoken language in context—a radical departure from most formal models since Chomsky where such ‘knowing how’ would be regarded as ‘performance’ and separate from the language system (‘competence’).
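To give a concrete, deliberately simplified impression of the corpus-based line of work mentioned at the start of this subsection, the following sketch compares the normalized frequency of a single conversational feature across two text samples. The file names and the chosen feature are purely hypothetical, and real register studies of the kind reported by Biber et al. rest on large annotated corpora and many coded measures rather than a raw regular-expression count.

    import re

    def per_thousand_words(path, pattern):
        # normalized frequency: hits per 1,000 running words in the file
        text = open(path, encoding="utf-8").read().lower()
        words = re.findall(r"[a-z']+", text)
        hits = len(re.findall(pattern, text))
        return 1000 * hits / len(words) if words else 0.0

    feature = r"\byou know\b"  # an illustrative 'conversational' feature
    for path in ("conversation_sample.txt", "academic_sample.txt"):  # hypothetical files
        print(path, round(per_thousand_words(path, feature), 2))

A comparison of this kind only becomes informative when it is repeated over many features and large, carefully sampled corpora, which is precisely what the register studies cited above do.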
.. Delimiting grammatical units in dialogue In this section and the next we use examples from English dialogue to illustrate some of the challenges it presents for grammatical analysis. The source, except where otherwise identified, is ICE-GB. As noted earlier, ICE-GB comprises one million words of British English from the early s. It includes written material and spoken monologues, as well as spoken dialogues. The spoken dialogues comprise around , words drawn from a range of text categories, with private face-to-face conversation being the largest. All texts in the corpus are fully parsed; they are divided into ‘parsing units’: sentences of written text or their rough equivalents in spoken texts. In the cited examples, a short pause (of a syllable’s beat) is marked by the symbol and a longer pause by , while self-corrections are indicated by a strikeout on text considered not to form part of the ‘finally achieved’ grammatical unit. Occasionally a slight amendment has been
made to a transcription after listening to the audio recording. We give identifying codes for the source text and specific units cited, but in multi-unit examples we have added simple sequential numbering of speaker units for reference purposes (retaining the letters used in the corpus to identify speakers, e.g. A, B, C). As examples in the literature often use different conventions, citations of these have been adapted, with some of the details of pronunciation and delivery omitted for simplicity. This section discusses some initial problems concerning the delimitation of grammatical units in spoken dialogue. The division of such data into sentences is notoriously difficult. Consider (), uttered by a single speaker, C, who is discussing (with two others) her involvement in a dance group for both able-bodied and disabled dancers. () C: There has to be a l a greater sensitivity to that person because it’s very easy uhm to hurt someonedy somebody who has a disability because they haven’t got so much control C: So you have to be very very sensitive to their particular uh disability in order to s stop them from damaging themselves C: And I think that really brings you closer (ICE-GB, SA- #–) The three units shown follow the division into parsing units (the rough spoken equivalent of ‘sentences’) in the corpus, where so and and (the initial words of C and C) are treated as markers of discoursal links that introduce new grammatical units. However, they might alternatively be analysed as coordinators that link the clauses they introduce to preceding material to form a larger grammatical construction, a clausal coordination. While prosodic factors such as pauses and intonation can be taken as a guide, they often do not provide clear-cut criteria. It can also be hard to distinguish subordination and discoursal linkage in some instances: for example, the relations marked by because (or its shortened form cos) seem to range from tighter subordinative relationships (as in the two instances in C above) to much looser discoursal links to preceding stretches of conversation (e.g., Burridge ). Such difficulties have led some analysts to abandon the sentence as a unit for the analysis of spoken language (e.g., Miller and Weinert , Biber et al. : Chapter ). Clausal structures often seem easier to identify. Consider (), which follows the line divisions used in the source; the full stops indicate a falling, or final, intonation contour. ()
[Context: A moves towards ending a long phone conversation with B, his girlfriend]
A: Okay. I sh- I shall leave you.
A: to get on with your hard studying.
A: that I know I interrupted.
A: rather rudely
B: (Oh yes.)
(cited in Couper-Kuhlen and Ono : ; adapted; parentheses indicate uncertain transcription)
Here speaker A utters a syntactically complete clause with final intonation in A, only to expand this initial structure several times on receiving no response from B, who finally responds in overlap with rudely in A. Despite the prosodic breaks, the clear grammatical dependencies here support a (retrospective) analysis of A–A as a unitary clausal structure (I shall leave you to get on with your hard studying that I know I interrupted rather rudely)—albeit one whose production was incremental and responsive to interactional contingencies.2 However, there are also difficulties in delimiting the clause. One reason is that various elements are often loosely attached at the start or end (discussed by Leech a as ‘pre-clause and post-clause satellites’), or interpolated within it. Examples () and () below show loosely attached nominals in final and interpolated positions respectively, each serving to clarify the reference of a preceding nominal (they in (), this girl in ()), while () shows an interpolated interrogative tag. () They’ve got a pet rabbit Laura and her boyfriend Simon (ICE-GB, SA- #) ()
Apart from that he tried to smuggle this girl back Vera in the train com compartment where you’re supposed to shove the luggage (ICE-GB, SA- #)
()
I mean your mother there was a large picture of your mother’s mother wasn’t there in a sort of (wig) looking as fierce as anything (ICE-GB SA- #; parentheses indicate uncertain transcription)
Some loose attachments involve recurrent structures which are recognized as constructions in standard grammars, for instance ‘left dislocation’ and ‘right dislocation’ (see Kaltenböck, this volume), the latter of which is arguably exemplified in () and (). Dislocations are often treated as involving extended clausal structures, though Leech (a), for example, prefers to see them as involving discoursal rather than grammatical links. It is probably best to acknowledge a fuzzy boundary between these two types of links. Note that in () there is a clear intonational break, as well as a pause, before the addition of the final nominal as a kind of ‘afterthought’; however, we have already seen in () that ‘standard’ elements of clausal structure can also be added as ‘afterthoughts’ following prosodic breaks. Example () shows that a dislocated NP need not occur at the right periphery of the clause but can be interpolated. Various kinds of loose attachments such as parentheticals, afterthoughts, and dislocations have been discussed in the literature, and are grouped together by Kaltenböck
[Footnote 2: This unitary clausal structure is a main clause which incorporates subordinate clauses at several levels of structure. The infinitival clause added in A is arguably a second complement of leave, so we might see this addition as not simply extending the structure in A, but changing it from a monotransitive to a complex catenative construction (to use Huddleston and Pullum’s (: Chapter ) term).]
et al. () as ‘theticals’ which show special properties not adequately captured in standard grammatical accounts (see also e.g., Dehé and Kavalova and Dehé on parentheticals). Huddleston and Pullum (: –) call such loose attachments ‘supplements’: ‘elements which occupy a position in linear sequence without being integrated into the syntactic structure of the sentence’ (p. ) in that they do not function as dependent to any head. For them ‘supplementation’ is a type of construction contrasting with both coordination and dependency constructions, and they take supplements to include a very broad range of formal types. It should be noted that loose attachments are by no means restricted to spoken discourse—some types (such as appositives and unintegrated relative clauses) are frequent in written texts. Thus, we have seen some initial difficulties in delimiting ‘sentence’ and ‘clause’ in spoken interaction. Nonetheless, the clause has generally been considered a useful unit in the analysis of spoken discourse. Of the , parsing units in the ICE-GB spoken dialogues (each of which is spoken by a single speaker), per cent take the form of a main clause while per cent are ‘non-clauses’ (with most of the remaining units being coordinations of clauses, or subordinate clauses parsed as independent units). Biber et al. (: –) found similar proportions in a much smaller sample of conversational data from British and American English that they divided exhaustively into ‘syntactically independent’ clausal and non-clausal units (treating coordinated main clauses as separate clausal units because of the practical difficulties we discussed earlier concerning the identification of clausal coordinations as units); of the , units they identified, per cent were clausal and per cent non-clausal. The data from these studies underline the importance of non-clausal or nonsentential units (NSUs) in dialogue. There are different kinds of NSU. Frequently found are ‘free-standing’ single-word constructions (e.g., Hi, Oh, Okay, Uh-huh, Wow). Many of the words involved can either stand alone, or be prosodically attached to other structures without being syntactically integrated into them (as in Oh that’s wonderful). They have a range of pragmatic functions. The boundaries of this group of words are hard to draw and various terms are used in the literature. For instance, such words are discussed by Biber et al. (: –) as ‘inserts’ and by Couper-Kuhlen and Selting (: Chapter ) as ‘particles’. They are sometimes subsumed under the heading of ‘discourse markers’, a category whose ascribed membership varies widely in the literature, often including also formulaic multi-word expressions such as in fact or you see (see, e.g., Heine ). NSUs also include free-standing multi-word utterances such as How about Friday afternoon after the meeting?; The more questions, the better; What a disappointing set of results! This type involves a range of conventionalized structures that do not conform to canonical sentence form (see, e.g., Culicover and Jackendoff : –). They are sometimes labelled as ‘minor sentences’ or ‘irregular sentences’. Of more interest here, however, is another type of NSU that we label ‘clause fragments’, to be discussed in the next section. We will see that these pose even more severe problems in the delimitation of grammatical structures, as well as difficulties in distinguishing between grammatical and discoursal links.
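As a rough illustration of how clausal/non-clausal proportions like those reported above might be approximated with an off-the-shelf parser, the sketch below classifies each utterance unit according to whether its syntactic root is a verb or auxiliary. This is a heuristic for illustration only: it is not the procedure used in ICE-GB or by Biber et al., and the example utterances are invented stand-ins for corpus parsing units.

    import spacy
    from collections import Counter

    nlp = spacy.load("en_core_web_sm")  # requires: python -m spacy download en_core_web_sm

    units = [  # invented stand-ins for spoken parsing units
        "So you have to be very sensitive to their disability.",
        "That really brings you closer.",
        "For my mum and dad.",
        "Twenty past eight.",
        "Okay.",
    ]

    counts = Counter()
    for unit in units:
        root = next(tok for tok in nlp(unit) if tok.dep_ == "ROOT")
        # heuristic: a unit counts as 'clausal' if its root is a verb or auxiliary
        counts["clausal" if root.pos_ in ("VERB", "AUX") else "non-clausal"] += 1

    total = sum(counts.values())
    for label, n in counts.items():
        print(f"{label}: {100 * n / total:.0f}%")

Such a heuristic will of course misclassify some units (verbless clauses, fragments with elided auxiliaries), which is one reason the published counts rely on manual or purpose-built parsing schemes.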
.. Clause fragments in dialogue Clause fragments are of particular interest here because they involve elements which have the grammatical potential to be clausal constituents (e.g., noun phrases, prepositional phrases) but which are not integrated into any clause. Again, terminology varies; for example, Biber et al. (: –) refer to this type as ‘syntactic non-clausal units’. They can involve single words, phrases, non-embedded subordinate clauses, or combinations of those. Like clausal structures, they can include more peripheral elements, such as the ‘inserts’ mentioned above or vocatives. The discussion in this section draws on examples from a study of clause fragments in ICE-GB by Bowie and Aarts (). Consider first B in the following sequence uttered by a single speaker: ()
B: My sister and I were going to get a picture of of she and I done
B: Well we’ve been meaning to do it through this friend of mine who’s a photographer for about four years
B: Just never got round to it
B: For my mum and dad
(ICE-GB, SA- #–)
Following B, which takes the form of a prepositional phrase, we understand the speaker to have conveyed something like ‘My sister and I were going to get a picture of she and I done for my mum and dad’, with the PP functioning like an adjunct which extends the clause uttered in B. There is considerable intervening material, so speaker and hearer are probably unlikely to have retained the exact form of B in their memories by the time B is uttered, but nonetheless a similar meaning is conveyed. We also find examples where a speaker adds to an initial structure after intervening material from another speaker, which may be a short response (e.g., Oh; Yeah) or a longer contribution. The amount of intervening material varies, so where do we draw the line between grammatical and discoursal links? The PP in B is certainly not presented as integrated into the clause in B, but our ability to interpret it in context seems to draw on our knowledge of how such PPs function within larger structures. An even greater challenge to standard analyses is posed by the phenomenon of coconstruction by speakers (see, e.g., Szczepek a, b, Sidnell ). Examples from the literature often involve ‘co-telling’ by two speakers who share knowledge of something to another who lacks that knowledge, as in (): ()
Cathy: She had this big hairy mole you know those kinds really gross ones
Cindy: on her neck
Terri: Oh how disgusting
(Lerner : ; some transcription details omitted)
Here Cindy’s contribution is a PP which extends Cathy’s initial clause so that we understand ‘She had this big hairy mole on her neck’. Even core elements of a clausal structure can be supplied by another speaker, as in extract () (where two speakers, A and C, each utter two numbered units): ()
[context: discussion of recording equipment]
A: That looks [unclear word] if somebody comes down and starts
C: What
C: Swearing and cursing
A: Which is the the off switch
(ICE-GB, SA- #–)
Here, A pauses after the utterance of starts, without supplying the complement that one would expect. C then supplies a complement (swearing and cursing), using present-participial verb forms to fit this grammatical context. Speaker A appears in A to be expressing concern about the recording equipment being on (a concern further pursued in A), and C evidently offers her completion as a guess about the nature of his concerns or to ridicule his concerns (her tone is dismissive). The units on her neck in () and swearing and cursing in () would in many analyses (including that of ICE-GB) be considered as non-clausal units uttered by their respective speakers, since each speaker’s contributions are treated separately; but a possible alternative analysis would see them as parts within a larger jointly built grammatical structure. The clause fragments described above link to other units in the sequence of turns, and it is these sequential links that enable them to be interpreted as making complete contributions to the discourse. Analysis of data from the ICE-GB dialogues suggests that fragments recurrently exploit just a few broad kinds of grammatical link to serve a wide range of discourse purposes (Bowie and Aarts ). The examples in () to () above involve linkage on the syntagmatic dimension, with fragments that extend or complete preceding structures. We also find fragments which link on the paradigmatic dimension, as ‘matches’, whereby the fragment is interpretable as an alternative constituent of an antecedent structure in context (to which it ‘matches’). (The distinction between ‘matches’ and ‘extensions’ draws on partially similar distinctions made by Culicover and Jackendoff (: ) and Couper-Kuhlen and Ono (), but generalized to cover links across both same-speaker and other-speaker utterances.) A common use of matches is to answer open (or wh-) questions, which involves a semantic relationship of ‘filling in’ the value of a variable. Example () provides a straightforward illustration; the underlining indicates the constituent to which the fragment matches, and we understand ‘It is twenty past eight’.
()
A: What time is it
B: Twenty past eight
(ICE-GB, SA- #–)
However, matches can fill many other discourse functions. Some examples are seen in (): ()
[Context: A has mentioned some family photos that were left behind when a former family house was sold; B is his wife and C their daughter]
B: I don’t want great big life-size photographs of relatives hanging on the wall thank you
C: Especially not that side of the family
A: Any side of the family thank you from that era
(ICE-GB, SA- #–)
Here, following B’s statement with clausal form, C and A respond by uttering nonclausal units. C expresses agreement with B’s evaluation of the photographs of relatives, but indicates that she finds it especially applicable to a particular side of the family. C’s contribution combines matching with extending: her NP that side of the family matches back to B’s relatives, her not matches B’s negation, and especially acts as an extension. Speaker A then gives the contrasting assessment that B’s point is applicable to any side of the family from that era: his NP any side of the family from that era (produced with an intervening discourse marker, thank you) can be seen as part of a chain of matching links, matching in the first instance to C’s that side of the family which links back to B’s relatives. The occurrence of numerous units with non-clausal form is problematic for theories which adopt a ‘strict ellipsis’ account of clause fragments, which hold that there is a ‘sentence’ or tensed clause underlying all such fragments and that the ‘missing’ material can be recovered directly from the preceding context. The correct analysis has been debated within the generative literature.3 For instance, Culicover and Jackendoff () argue against the strict ellipsis account and propose an alternative whereby ‘the unexpressed parts of the fragment’s interpretation are supplied not through underlying syntactic structure but via direct correspondence with the meaning of the antecedent sentence’ (pp. –). They use a mechanism of ‘indirect licensing’ to account for the semantic and syntactic relationship of the fragment to its antecedent. They point out that a strict ellipsis account runs into problems when the interpretation of the fragment requires ‘adjustment’ of aspects of the antecedent (e.g., illocutionary force, the use of you versus I/me, the embedding of clauses). These ellipsis debates rarely consider authentic examples, but an examination of spoken data readily provides examples requiring such ‘adjustment’, such as () and (). () B: I don’t know if it would be cheaper to do it in her name but I don’t think A: No not if not if we’re using your no claims bonus (ICE-GB, SB- #–)
[Footnote 3: We cannot offer detailed discussion of the ellipsis debates here. See for example van Craenenbroeck and Merchant () and chapters in van Craenenbroeck and Temmerman ().]
() [Context: two friends are discussing plans for an outing.] A: Well Xepe seems to love this idea of having a picnic but I’m not too sure about this B: Not if you’ve had lunch (ICE-GB, SA- #–) In () B indirectly expresses a question in a subordinate interrogative if-clause which functions as the complement of know. To arrive at the correct interpretation of the fragment, a strict ellipsis account would require considerable ‘adjustment’: ‘extracting’ this subordinate interrogative clause to make it a declarative main clause, and changing its polarity from positive to negative: ‘It would not be cheaper to do it in her name if we’re using your no claims bonus’ (with the fragment, a conditional if-clause, functioning as an added adjunct within the main clause). Example () requires even greater adjustment, pragmatic as well as syntactic. Here B seems to be supporting A’s objection to Xepe’s idea: the intended meaning is not ‘You’re not too sure about this if you’ve had lunch’, but rather something like ‘Having a picnic is not a good idea if you’ve had lunch’. When considering examples of fragments in spoken data, the sheer number of instances and the variety of ways in which they relate to their ‘antecedents’ make a ‘strict ellipsis’ account seem very hard to sustain. This section has explored the challenges of looking at grammar beyond the sentence, suggesting that we have much to learn from how grammar is deployed in building discourse, especially interactive spoken discourse. In the next section we point to some quite different research, which looks at discourse not with the aim of refining our understanding of grammar, but rather with a view to discovering how speakers and writers exploit grammatical resources when using language in order to construct a particular view of reality.
. G
.................................................................................................................................. Grammar is relevant to any use of language, but researchers’ foci can be different: when dealing with spoken data the focus is often on grammatical choices made by speakers in relation to the development of the interaction and the communicative aims of the participants. We saw some of this above and we return to some aspects of grammar in spoken interaction in section . below. In contrast, other discussions of discourse (very often written, but what is said below applies equally to spoken discourse) often take as a starting point the central observation that language supplies alternative ways of describing the same situation. Choosing amongst these different ways could be related to different representations of social reality and therefore to different systems of thinking and beliefs, or ideologies. This link between discourse and the representation of reality has been a central preoccupation for those working in the tradition of Critical Linguistics (Fowler et al. , Kress and Hodge ) and later, Critical Discourse Analysis (CDA; work by Fairclough, Van Dijk, Wodak, and others).
Since making choices about how to present situations also relates to how information is packaged, and often correlates with the genre or style of a given text, we should point out that what we discuss here has many overlaps with the discussion in the chapters in this volume on information structure, the relationship between grammar and genre, and grammatical variation in literary texts (Kaltenböck, Dorgeloh and Wanner, and Jeffries respectively). This section should be read as complementary to those three chapters. Any aspect of language can potentially be seen to have some ideological effects, but in practice major areas of enquiry in Critical Linguistics and CDA have been transitivity, modality, and nominalizations. A wider area of study related to modality—stance—has emerged more recently. We will provide some brief illustrations of how these aspects of grammar have been brought to bear on critical analyses of discourse. When constructing a text, speakers/writers choose linguistic structures that allow them to control how events are construed, e.g., what verbs (typically) are used to encode them, and which participants are included/excluded or foregrounded/ backgrounded as a result. Verb valency4 and how the arguments of verbs are linked in particular sentences are some of the aspects of grammar that are frequently invoked in discourse analyses that focus on how texts present the social world. This can be illustrated briefly with a few examples taken from different news items published recently in a range of newspapers, paying specific attention to the verb separate and its nominalization separation (emphasis ours): ()
Mexico’s foreign minister Luis Videgaray Caso, who has branded the policy ‘cruel and inhuman’, last night claimed US immigration agents had separated a Mexican mother and her ten-year-old daughter with Down syndrome. (Daily Mail, June )
() “Other governments have separated mothers and children,” he wrote on Twitter, above a photograph of Birkenau, part of the Auschwitz death camp complex. (The Times, June ) () When journalists were briefly admitted to the facility, one teenager explained how she had been looking after a toddler—no relation to her—who had been separated from her family for three days. (The Times, June ) ()
Parents are now being convicted through the criminal system, which means that they are imprisoned and separated from their children. (The Times, June )
()
President Donald Trump urged House Republicans to pass broad immigration legislation in a Tuesday evening meeting, but he stopped short of telling them he would immediately reverse a widely condemned policy that has separated thousands of migrant children from their parents. (Wall Street Journal, June )
[Footnote 4: For a discussion of verb valency see Herbst, this volume.]
Separate in this kind of use is a transitive verb which creates an expectation that there will be three participants: one participant who does the separating, and two (or more) participants who are being separated from each other. This is exactly what the bolded clause in () delivers, placing the NP US immigration agents in subject/ agent position and placing the two entities being separated from each other, a Mexican mother and her ten-year-old daughter, in a coordinated object NP, thus giving them equal prominence. This clause makes clear who, according to the foreign minister’s claim, has undertaken (and potentially should be held responsible for) the action of separating, and who has been affected by it. We see a similar structure in the bolded clause in (): the agent (other governments) is explicitly expressed as subject and so made more prominent, and the affected entities are expressed in a coordinated object NP: mothers and children. By contrast, in () the verb separate is used in a passive relative clause. The clause modifies a noun which represents one of the participants subjected to separation (the toddler), while the other participant (the toddler’s family) appears in a prepositional phrase which is a constituent of the relative clause. Crucially, the participant who is the agent of the separation remains unexpressed. Similar points can be made about (): the focus is on the participants subjected to separation, but the agents of the separation act remain backgrounded. Different verbs are associated with different types of situations. Some place specific requirements on their subjects or objects (e.g., they may admit only animate or sentient subjects or objects). Discourse analysts consider such properties important in terms of how the world is construed by language speakers, especially since language allows alternatives. The verb separate allows for an inanimate abstract subject. For example, in () above we see the NP a widely condemned policy as the understood subject of separate. Whereas () and () placed the agency of the separation with sentient agents, in () the agency is placed with an abstract entity, a policy, which allows the author not to name those responsible for the policy. A similar effect of deleting or backgrounding agency can be achieved via nominalization (Fowler et al. , Fowler , Fairclough ; see also Billig and van Dijk for some recent debates and further references). The events mentioned in the news reports above can also be referred to as follows: ()
Even better would be for Congress to pass the leadership’s compromise that legalizes Dreamers, ends the family separation fiasco, and gives Mr. Trump some of his priorities. (Wall Street Journal, June )
() The separations were not broken down by age, and included separations for illegal entry, immigration violations or possible criminal conduct by the adult. (Irish Independent, June ) In () and () the events previously named with the verb separate are now referred to with the nominalization separation. Again, this allows the writer(s) more choice of
which participants to name and which to background or leave out. In () similar flexibility is afforded by using modification with a participial adjective. () Dona Abbott, Bethany’s refugee program director, said that these newly separated children frequently have nightmares, anxiety and stomachaches. (The New York Times, June ) Of course, such choices become significant only if they are a part of a consistent pattern in a particular text or collection of texts. Where such consistent patterns are spotted, they are seen as patterns of representations of social actors and practices that can be thought to reflect and construct coherent systems of values and ways of thinking. One further area of grammar that merits mention in the context of discourse analysis of this kind is modality. Modality, discussed in this volume by Ziegeler, is a resource which allows the expression of speakers’ attitudes, states of knowledge, or relationships of obligation or permission between participants in the discourse. Modality can be linked to power and authority. For example, powerful and authoritative speakers can use some modal forms (e.g., conferring obligations upon others, expressing high degrees of certainty) to a greater extent than others. Modality is important for the construal of events, their participants, and the relationships between participants in discourse. For example, the use of the modal must in (–) below in statements that come from two sides of a current debate shows that on both sides there are strong perceptions of what the morally and ethically valid positions to take are. ()
The issue animated their weekly lunch and a consensus emerged that Congress must act, possibly as early as this week, leaders said. (The Guardian, June )
()
U.N. Secretary-General António Guterres said on Monday in a statement that “ . . . Children must not be traumatized by being separated from their parents. Family unity must be preserved.” (Wall Street Journal, June )
()
. . . Border Patrol officials say they must crack down on migrants and separate adults from children as a deterrent to others trying to get into the US illegally. (The Telegraph, June )
()
They also must prove that their home government is either participating in the persecution or unable or unwilling to protect them. (Wall Street Journal, June )
Here we have illustrated just some of the grammatical features of sentences in a text that might be highlighted as significant by researchers whose interest is in the link between discourse and ideology. (We gave examples from written discourse, but similar points can be made about speech.) This isn’t to say that all instances of such grammatical features have ideological effects, and of course an analysis will explore not
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
just these properties of the data, but many other aspects, such as vocabulary choices and genre characteristics, and will look for patterns rather than single instances. Another research strand related to subjectivity more generally is the study of how grammatical (as well as lexical and paralinguistic) devices can be used to express attitudes, emotions, as well as judgements and assessments of the validity of propositions, and so on. We will use the cover term ‘stance’ for these (for references to relevant scholarship, including that using different terminology, see Biber (a) and Gray and Biber (), for instance). A range of grammatical constructions used to express stance are laid out in the Longman Grammar of Spoken and Written English (Biber et al. : Chapter ). Biber (a) illustrates some common grammatical devices used in spoken and written academic discourse. Stance can be expressed with the help of adverbials (a and b) which convey some attitude or assessment towards the proposition expressed in the main clause.5 Another relevant grammatical structure is the so-called stance complement clause, a construction comprising a complement clause controlled by a stance verb or adjective, for instance, which indicates what stance is being expressed with respect to the proposition contained within the clause (see (c–g)).
()
a. Obviously you don’t have to come to class on May eighth.
b. Maybe someone mentioned this in speaking about it.
c. I know a lot of people avoid Sacramento because of the deathly smog there.
d. We are becoming increasingly certain that the theory has far reaching implications . . .
e. You think I did a bad job.
f. They needed to rebuild the entire government system.
g. It seems fairly obvious to most people that Watson tremendously oversimplified the learning process.
As (c) and (d) show, sometimes stance is explicitly ascribed to the speaker; it could also be explicitly ascribed to the addressee (e), or possibly a third person (f), or it can be left implicit (a, b).6 Modality, which we touched on already, is of course another grammatical manifestation of stance. A number of studies have shown not only the variety of grammatical and lexical resources for the expression of stance, but also how the expression of stance can respond to the physical mode (i.e. whether a text is spoken or written) and to the different communicative purposes of different registers. For example, in his study of university registers within both speech and writing Biber (a) finds that stance is expressed more frequently in speech, but also more frequently in what he calls the
[Footnote 5: The examples and analysis in () are from Biber (a: –), underlining in the original.]
[Footnote 6: The attribution of stance is a complex matter, which, unfortunately, we can’t give the attention it deserves. See remarks in Ziegeler (this volume) as well as discussions of stance in the literature for some of the issues.]
‘management’ registers (i.e. interactions and texts relating to classroom management and course management), whether written or spoken; see Biber (, b), as well as Gray and Biber () for further examples and references. Stance and modality, like other areas of language, can of course be studied in their own right, or in relation to aspects of social structure. Scholars whose aim is to uncover relationships between language use and power or ideology frequently deploy the conceptual tools of Systemic Functional Linguistics (SFL), associated with M. A. K. Halliday (see for example Halliday ). SFL is a functional approach to the study of language, which prioritizes attested data and the study of texts. An important aim of any investigation in SFL is to show how linguistic structure (conceived as one module, i.e. without a strict separation between the lexicon and grammar) contributes to the meaning of a text. Linguistic structure is conceptualized as a network of choices, and is linked to linguistic functions of reference (the ideational function), of relationship management (the interpersonal function), and of managing the information flow (the textual function). A detailed introduction to this framework is outside the scope of this work, so the reader is directed to Halliday (), or for shorter presentations see, amongst many others, Coffin et al. (), Martin (), Schleppegrell (), Martin (); see also the chapters by Mackenzie and Schönefeld, this volume. Recently, some researchers have extended CDA to link it to cognitive processes implicated in the interpretation of texts. To achieve this aim, they have adopted a Cognitive Linguistic approach (e.g. Langacker ) to the grammatical features of a text. Language is seen as a process of construal of events, experiences, etc. that can have an ideological basis or effect. This construal, as evidenced in the language forms used, is linked to cognitive processes of interpretation. Some extensions seek to demonstrate this link experimentally (for points along these lines and implementations of such approaches see Chilton , , Li , Hart , and references therein; see also Harrison et al. ). Other extensions have sought to provide CDA with sounder empirical coverage by looking for significant patterns in extended collections of discourse with the methods of corpus linguistics (e.g., Gabrielatos and Baker , Baker and Levon ). The focus on the process of subjective construal is not confined to studies of language and ideology or studies of written discourse. Subjectivity in language can be seen as a foundational property that affects language structure and function, as well as language change (see, for example, volumes like Athanasiadou et al. () or Davidse et al. (); see also some of the remarks in section . below).
. D
.................................................................................................................................. In the previous section our main aim was to show that grammatical resources are exploited in discourse by giving speakers choices that allow them to present reality in
different ways and express their subjective attitudes and beliefs about reality. Here we will refocus the discussion to highlight research which suggests that language use shapes and enriches grammatical resources, or influences how they can be used. For example, some research has shown interactions between discourse and clausal grammatical structure. Du Bois (, see also references therein) demonstrates that across speakers and in a number of languages in spontaneous face-to-face interaction there is a tendency to find no more than one full lexical NP per clause. What is more, such full lexical NPs, which tend to express new information, i.e. referents not previously introduced in the discourse, are much more likely to be found in the position of either the subject of an intransitive verb, or the direct object of a transitive or ditransitive verb. Conversely, the subject in a clause with a transitive/ditransitive verb and the indirect object in a clause with a ditransitive verb tend to be realized as reduced NPs, e.g., pronouns. This generalization, which Du Bois links to the relative cognitive costs of processing new versus old referents, he takes to show that information management in discourse, i.e. discourse pragmatics, ultimately shapes grammar: this discoursal pattern could be seen as the basis of an argument structure patterning like ergativity (see Du Bois ).7 In a somewhat similar vein, Englebretson () links the distribution of attributive versus predicative adjectives to their discourse function: attributive adjectives tend to help introduce new referents, whereas predicative adjectives tend to add information about already established referents (he follows observations by Thompson and Ferris ). Hopper and Thompson () argue for a more general link between lexical categories like nouns and verbs and their discourse functions, e.g., introducing discourse participants or events, respectively; see also Hollmann’s chapter on lexical categories in this volume. The role that discourse (i.e. the interactive use of language) plays in enriching the functional potential of language is also often discussed in the field of grammaticalization (Heine et al. , Traugott and Heine , Heine and Kuteva , Traugott and Dasher , Hopper and Traugott , Kuteva , Narrog and Heine , Smith et al. () and references therein). Grammaticalization relates to a set of language changes that create grammatical/functional elements out of lexical ones. It is often associated with a development presented on a cline like the one in () below (from Hopper and Traugott : ): ()
content item > grammatical word > clitic > inflectional affix
This shows that the emphasis in studying grammaticalization is often on the structural transformation from an independent lexical item to a syntactically more dependent or tightly fused function word, clitic, or morpheme, and the concomitant change from lexical meaning to (grammatical) function. Various semantic, pragmatic, and structural changes have been observed along the way. This can be exemplified with the
[Footnote 7: See Haig and Schnell () on some of the debates around ergativity and information management.]
English a bit of, given as a case study in Traugott (). The source of a bit of is a nominalized expression meaning ‘biting’. This was reinterpreted to mean not the act of biting, but the amount being bitten off, as in a bite of bread, i.e. it became a partitive. The partitive was extended further so that it could be used in expressions like a bit of a fool. Traugott () notes that this stage involved a pragmatic expansion, or enrichment of the meaning of the expression during its use, since the partitive was associated with negative speaker evaluations, that is at this stage we can see subjectification in the development of a bit of. A semantic/pragmatic expansion accompanied by a reduction can also be seen in the next move to a quantifier, as in a bit wiser, a bit richer. Further development allows a bit to be used as an adjunct (as in I don’t like it a bit) (for further details and examples see Traugott : –). As we can see, grammaticalization is driven by a number of semantic processes of reinterpretation, which happen in language use. Traugott () uses the pragmatic subjectification in the history of the development of a bit of and other examples to argue that theories of language change cannot ignore the role of the speaker and, more generally, the role of speaker–hearer interactions. The speaker innovates in the flow of speech, in the course of an inherently subjective speech event. In other words, if language change is seen to happen incrementally in language use, then it becomes intrinsically linked to discourse. This view of language change is contrasted with theories that attribute change to child language acquisition (Lightfoot , see also Waltereit and references therein; for a comparison and an attempt to reconcile the two views see also Öhl ). Some processes of grammaticalization, and language change more generally, have also been linked to frequency, both type and token (e.g. Bybee , ). Frequency effects, which can be taken to be responsible for phonological reduction, for example, or the entrenchment of some patterns, can only be understood when language in use, i.e. discourse, is taken into account. Thus grammaticalization research aligns more generally with functionalist and usage-based approaches to language (see Mackenzie, this volume). Not only can language change be seen to happen within language use, i.e. discourse, but languages also develop resources expressly for the purpose of discourse management: discourse particles or markers, i.e. elements like well, but, however, though. These markers are discussed in the literature as functional elements that help speakers and hearers manage interaction (e.g., express the relationships between different chunks of discourse, or express their attitudes to propositions expressed in discourse). As functional elements, they are considered by many researchers in the field to be part of the grammatical resources of the language—this is, however, a debated issue as such scholars are adopting an ‘extended’ view of grammar relative to more traditional approaches (see, e.g., Degand and Evers-Vermeul for discussion). There is also a considerable literature concerning how these functional elements arise in discourse. Barth-Weingarten and Couper-Kuhlen (), for example, discuss the development of though from a conjunct of concession to a discourse marker with concessive and (increasingly) textual uses (e.g., as a topic shifter, i.e. as a marker which contrasts two chunks of discourse in terms of topic). Example () can be used as an
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
illustration (their example on p. , with some adaptations, emphasis ours). It comes from an American English radio phone-in. The caller, Jim, praises his lesbian neighbours for helping with childcare whilst he was a single parent. Freddy Merts, the moderator, asks him whether he felt sexually attracted to them. ()
1  J:  i was too bUsy for women bUt,
2  FM: yeah RIGHT,
3  J:  yeah I WAS.
4  FM: [what an exCUSE,
5  J:  [(you know if you are) takin CARE of a kId and stuff
6      [(it'll keep you) BUsy.=
7  FM: [that's TRUE yEAh,
8      but the kId can be a great PROP though.
9      i know a lot of single Fathers who bring their kids to the pArk, (.)
10     like a MAGnet,
11 J:  oh(h)(h) ye(h)ah(h) [h, tha/
12 FM: [or a MAGgot.
13 J:  thAt's kind of sIck somehow though don't you think?
14 FM: [((laughs)) (h) (h) (h)
15 J:  [we:ll Using your kid to dAte-
Key to symbols:
ACcent   primary accent
Accent   secondary accent
.        final intonation falling to low
,        final intonation rising to mid
?        final intonation rising to high
-        final level intonation
:        lengthening
(h)      laugh particle
(.)      pause
/        break-off
=        latching
(( ))    meta-comment
()       suggested transcription
[        overlap
In line seven FM concedes the points Jim has made so far, and in line eight he puts forward a counterclaim. The though at the end of this turn both marks the concession and signals the move to a new topic (from the difficulties of being a single father to dating strategies for single fathers). The development of though with the function illustrated above does not match the understanding of grammaticalization in all respects—for instance, it does not meet some of the criteria laid down by scholars like Lehmann (), e.g., there is no reduction of scope, phonological reduction, or move to an obligatory marker. The authors argue, however, that if grammaticalization is conceptualized as a phenomenon exhibiting prototypicality, then the development of though as a discourse marker can be treated as non-prototypical grammaticalization. In the case of though, what we see is a bleaching of the concessive semantics (what is contrasted are not
propositions, but shifts of topic) and a concomitant increase of abstractness.8 There is also an increase in textual meaning, i.e. conveying the relationship between two chunks of text, which the authors designate as pragmatic strengthening. On the syntactic level there is an increase in scope (textual though connects larger chunks of text). Couper-Kuhlen () discusses though, as well as other phenomena like left dislocation and extraposition, as arising from conversational routines collapsed into single conversational turns. For debates in the literature over whether the development of discourse markers should be considered as grammaticalization, see for instance Degand and Simon-Vandenbergen (), Heine (), and Degand and Evers-Vermeul (). Mulder and Thompson () argue that grammaticalization processes similar to those described above can be seen in the development of but from a conjunction to a discourse particle in American and Australian English. Mulder et al. () argue that this process is fully completed in Australian English only. We illustrate the use of final particle but in Australian English with their example () (p. ) which we reproduce in () below. The example shows a football coach ending a practice session. ()
Coach: That’ll do it, lads.
    →  Good work but.
Used in this way, but is uttered with a final prosody, completes a turn, and marks contrastive content. In the example above the coach signals that the session is over but that he is satisfied with progress made. Some of the processes visible in the development of though have also been traced in the development of negative mental verb constructions like I don’t know, for instance, discussed in Lindstrom et al. () amongst others. In a paper with a cross-linguistic perspective the authors point out that know and similar verbs in similar constructions have moved away from their traditional transitive use with epistemic meaning to become discourse markers, which have interactional meaning (e.g., heading off sensitive topics) or indicate speakers’ stance (e.g., casting a contribution as a guess, or hunch). In the case of I don’t know in English, when used as an intransitive verb in a discourse marker-like construction there is often also morphophonological reduction: dunno. Studies of grammatical change like the ones we summarize above often focus on language used interactionally in speech. Recently some scholars have argued, however, that change can also originate in writing (see introductory chapters in Biber and Gray , as well as the concluding remarks in Fox ()). In a study of the historical developments in academic writing, Biber and Gray () find that specialist science writing has moved away from a style characterized by its reliance on verbs and dependent clauses, typical of the eighteenth century, and has evolved a new discourse style with innovative use of grammatical features. This discourse style is characterized 8
[Footnote 8: For an earlier seminal discussion of semantic and pragmatic changes accompanying grammaticalization see Brinton , for example.]
by complex phrasal syntax, namely by the increased use of nominalizations (consumption, comparison, sustenance), attributive adjectives (gradually expanding cumulative effect), nouns as nominal pre-modifiers (baggage inspection procedures), prepositional phrases as nominal post-modifiers (a high incidence of heavy alcohol consumption amongst patients), and appositive noun phrases (Dallas Salisbury, CEO of the Employee Benefit Research Institute) (examples from Biber and Gray : ). Such structures lead to greater compression of information. The possible degrees of compression are illustrated with the following examples (Biber and Gray : , underlining in the original): () a. And if his Computation, which was made for Greenwich, had been reduced to the Meridian of London, the Difference would have been still less. b. And if his Computation made for Greenwich had been reduced to the Meridian of London, the Difference would have been still less. c. And if his Computation for Greenwich had been reduced to the Meridian of London, the Difference would have been still less. d. And if his Greenwich Computation had been reduced to the London Meridian, the Difference would have been still less. In (a) above the first underlined NP is modified using a finite relative clause; in (b) the same information is expressed with a non-finite relative; in (c) it is compressed further into a post-modifying for-PP; and in (d) the compression is maximal: the information is now expressed via the noun pre-modifying another noun. The move away from clausal embedding resulting in this compression enables writers to give as much information as possible in as few words as possible; however, it also comes with a cost: loss of explicitness. In (a) we are told what the semantic relationship is between his Computation and Greenwich (the computation was made for Greenwich), whereas in (d) we can recover this semantic link only if we have the necessary background knowledge. The development of such a phrasal discourse style, especially in specialist science writing characterized by the increased use of complex phrasal structures, can be explained by adopting a functional linguistic perspective and relating it to the changing requirements of the respective linguistic community. The developments in science in the last centuries have led to increased communication within a greatly increased number of sub-disciplines and increasingly specialized fields. Compression responds to the need for economy of communication prompted by the sheer information explosion since the eighteenth century and especially in the course of the twentieth century. The lack of explicitness can be tolerated because specialist science writing is done by and for experts in narrow domains (see Biber and Gray and references therein). Biber and Gray () argue, however, that the developments they trace via quantitative corpus-based studies are not simply a matter of variation in the rate with which available grammatical resources are used. Rather, the increased use of some resources, e.g., nouns modifying other nouns (or NN structures), is accompanied by shifts in the grammatical characteristics of these structures, the range of elements that can enter
into them, and the semantic relations that are possible between them. Thus NN structures which in the sixteenth century are attested with only very restricted semantic relations between the two nouns (mostly titles in expressions like King David or Master George) gradually expand and in the course of the twentieth century come to be used very widely with almost any noun being able to modify any other noun (see Chapter of Biber and Gray for detailed descriptions of the functional extension of a range of structures). The examples discussed in this section illustrate a view of grammar and discourse that sees the relationship between them as mutual and dynamic. Speakers and writers avail themselves of linguistic resources in order to show their understanding of the social action being undertaken in a particular interaction and to achieve their interactional goals. In doing so, however, speakers (re)shape the grammatical tools at their disposal and create new ones. Where such uses are repeated by a number of speakers on a number of occasions, the new use may become part of the linguistic code. The studies we have noted try to capture the creation and remodelling of grammatical resources, and thus strive for a usage-based perspective on grammar.
. Conclusion
.................................................................................................................................. In this chapter we have reviewed the relationship between grammar and discourse from three different perspectives. First, we focused on the value of looking beyond the sentence to investigate how grammatical structures are used in building discourse. We noted problems in drawing a strict grammar–discourse boundary: in delimiting the sentence as a grammatical unit, and in analysing cohesive relationships of ellipsis and anaphora that can hold both within and across sentences. We illustrated particular challenges posed by spoken interaction, including the co-construction of grammatical units by different speakers, and the frequent use of clause fragments that are not syntactically integrated into sentential units. Our second perspective concerned the effect grammatical choices have on discourse. Here we highlighted some research in the tradition of discourse analysis that sees grammatical (as well as an array of other) choices as instrumental in presenting situations and events in different ways, including to suggest different ideologies and value systems, or to express different appraisals of and attitudes towards what is being talked about. Our discussion focused on choices that allow the foregrounding or backgrounding of participants, as well as on modality and the wider area of stance. Finally, our third perspective considered how discourse can shape grammar. Scholars adopting this perspective see grammar as malleable and responsive to the contexts in which language is used. In this approach, grammar is not something that speakers simply deploy—on the contrary, it can change in response to (frequent) patterns of use.
Given that generative grammatical theories have drawn a sharp distinction between competence and performance, and have prioritized introspection over usage data (see Sprouse and Schütze, this volume, for discussion and developments), our brief review here has focused on those theories and approaches that see structure as bound up with function and use. By considering a range of frameworks and perspectives, we have tried to show some of the richness of recent work at the interface of grammar and discourse.
Acknowledgements
We are grateful to Bas Aarts, Heidrun Dorgeloh, and two external reviewers for their valuable comments on drafts of this chapter.
.............................................................................................................
GRAMMATICAL VARIATION AND CHANGE .............................................................................................................
......................................................................................................................
......................................................................................................................
. Introduction
.................................................................................................................................. In the course of its history, English has undergone a series of—sometimes related—morphological and syntactic changes that have essentially resulted in a fundamental typological change, from a largely synthetic to a much more analytic language. It has (almost completely) lost certain inflectional properties commonly found in related Indo-European languages (such as case, voice, mood, most tense and person contrasts) while at the same time developing new means of marking grammatical categories or functions (such as word order, periphrastic verbal constructions or complex prepositions). An example of how some of these changes are related would be the loss of case, on the one hand, and, on the other hand, development towards a (fairly) rigid SVO word order and the subsequent emergence of new passive constructions as well as a remarkable flexibility with respect to the semantic roles that can fill the subject slot in active declaratives in English (see e.g., Hawkins , Hundt , or Dreschler ). A look at Old English verb paradigms will show that, by comparison with its Indo-European ancestors, English had already lost most of the inflectional marking for voice, tense, and aspect on the verb; in the course of its history, new, periphrastic means have evolved that nowadays allow speakers to combine a modal verb with a perfect, a progressive, and a passive in complex verb phrases such as the one illustrated in example ():

()
He might have been being skinned alive, or having his soul torn out of his body . . . (COCA, , FIC)1
Emphasis in examples has been added throughout. For details of COCA and other corpora cited, see the list at the end of the chapter.
In other words, the history of English grammar is one of both loss and (quite substantial) gain, where relatively small changes in one area of the language (such as a change in the basic stress pattern of words) have had repercussions in other areas (as for instance loss of inflectional case marking). Similar changes of loss and gain in the area of mood and modality will be discussed in more detail in section .. Different theoretical approaches to grammar have provided their own accounts of grammatical change in English, e.g., in terms of ‘catastrophic’ change between one generation and the next (the traditional generative account) or in terms of a series of incremental changes and new patterns seen as emerging, for instance, out of shifts in frequencies (usage-based and functional models). This chapter provides an overview of different types of change (morphological and syntactic) against the background of different models of grammar and the methodologies on which they rely (see section .). It looks at both grammatical loss (surprisingly still a fairly under-researched topic) and gain (fairly well studied within grammaticalization and—more recently—constructionalization). The case studies used to illustrate grammatical change focus on mood and modality.2 This area of English grammar is ideally suited to illustrate the general typological change (from inflectional to a largely analytic marking of grammatical categories) that characterizes long-term developments in English. Moreover, mood and modality is a field that allows us to look at both loss and rise of morphological and syntactic categories. Evidence comes from previous stages in the history of the language as well as more recent developments, and is often drawn from corpus data. The latter allow us to map both the (near) demise of categories as well as revival of a pattern that had almost faced extinction. The focus in this chapter is on language-internal processes, but underlying mechanisms such as analogy, reanalysis or ambiguity and their psycholinguistic underpinnings are not discussed in detail.3 External reasons for change will be touched on occasionally (see section .. on the revival of the mandative subjunctive in American English), but for an in-depth discussion of the role that language contact may have played in grammatical change, see the relevant chapters in Nevalainen and Traugott () or contributions in Schreier and Hundt ().
2 Since the case studies come from a relatively restricted area of grammatical change in the verb phrase, the reader is referred to other publications for changes in the noun phrase or word order. A good chronological overview of morphological change in English (and interaction with phonological change) is provided in Lass (). Fischer and van der Wurff () give an excellent and detailed description of the main syntactic changes from Old English to Modern English. Los () covers both morphological and syntactic change, as well as discussing pragmatic aspects of syntax. 3 See the contributions in Hundt et al. () for an in-depth treatment of these factors in the history of English as well as in psycholinguistic research.
. A
.................................................................................................................................. With respect to theoretical modelling of grammatical change, we can distinguish two main approaches: a formal (or generative) approach, on the one hand, and a functional one, on the other hand; the latter I understand to subsume cognitive, usage-based, and descriptive approaches (see the relevant chapters in Part II of this volume for background). As data, linguists have relied on manuscripts (and editions thereof) for earlier periods of the language; these have become efficiently searchable in the form of electronic corpora and text databases.4 For more recent and ongoing change in English, linguists can also rely on their intuition as native speakers or use experimentation (e.g., eliciting speaker judgements of acceptability/grammaticality, investigating priming effects, etc.). Since the advent of recording devices, it has been possible to include the spoken medium in corpora and investigate variation between speech and writing (e.g., by comparing evidence from the Brown family corpora with data from the Diachronic Corpus of Present-Day Spoken English, DCPSE). For periods where we lack recordings of actual speech data, spoken language has been studied by proxy, i.e. on the basis of speech-like, speech-based or speech-purposed writing (see Culpeper and Kytö () for the distinction and a detailed description of linguistic characteristics of these approximations).

A fundamental tenet of generative theories is that language is modular, in other words, the grammar and the lexicon are considered to be separate, autonomous components or domains (see also Los ), while most functional and usage-based approaches conceive of grammatical function and lexical meaning as closely related, if not inherently linked, aspects of language. Within the generative approach, grammars change from one generation of speakers to the next, and thus relatively abruptly, as Lightfoot (: ) points out: ‘ . . . while the grammar of a single adult could change only in minor ways in the course of a life-time, there could be greater differences between the grammar of a child and his parents or models’. Within usage-based approaches, on the other hand, change typically proceeds gradually (slowly at first, then picking up ‘speed’ before slowing down again—in the typical ‘slow-quick, quick-slow’ pattern of the S-curve).5 The following sections look in more detail at how these fundamentally different notions of language and linguistic knowledge affect models of grammatical change. These will then be taken up in the case study on changes in the English mood and modality system in section .. 4
Background on corpus methodology is given in Wallis (this volume). For the distinction between a ‘corpus’ and ‘text database’, see e.g., Hundt (: , ). For a thorough discussion and re-evaluation of different kinds of evidence in historical linguistics, see the chapters in Part I of Nevalainen and Traugott (). 5 The rhythmical allusion to the foxtrot is from Aitchison (: ). For a recent critical discussion of the adequacy of the S-curve model, see Nevalainen (). For a detailed discussion of the maths behind this finding, see Sean Wallis at https://corplingstats.wordpress.com////logistic-multinomial/.
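To make the ‘slow-quick, quick-slow’ trajectory mentioned above concrete, the following minimal Python sketch traces the share of an incoming variant under a simple logistic model. It is purely illustrative and is not taken from the chapter or from Wallis’s analysis; the midpoint year and growth rate are invented values.

```python
# Illustrative only: a logistic 'S-curve' for the spread of an incoming variant.
# The midpoint year and growth rate below are invented, not estimated from data.
import math

def incoming_share(year, midpoint=1800.0, rate=0.05):
    """Proportion of the incoming variant in a given year (slow-quick, quick-slow)."""
    return 1.0 / (1.0 + math.exp(-rate * (year - midpoint)))

for year in range(1700, 1921, 20):
    share = incoming_share(year)
    print(f"{year}  {share:.2f}  {'#' * int(share * 40)}")
```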
.. Abrupt or ‘catastrophic’ change As pointed out above, generative models of language and grammatical change build on the assumption that change takes place from one generation to the next, and that—due to changes in the input—one generation of learners may end up with different parameter settings from the ones of the preceding generation. These settings concern, for instance, whether the grammar allows the dropping of pronouns (as in Italian Vedi questo libro? ‘Do you see this book?’) or not (as in English, where the pronoun subject is taken to be obligatory). According to the seminal work on generative diachronic syntax by Lightfoot (: ), ‘piecemeal changes’ successively make the grammar more complicated and marked: . . . these piecemeal changes are followed by a catastrophe . . . , a major re-analysis of the grammar eliminating the markedness and complexity which had been gradually accumulating. The symptoms of such a cataclysmic re-structuring will be a set of simultaneous but apparently unrelated changes.
The key word in this passage is the ‘simultaneous’ occurrence of a set of changes linguists can observe in the data. In section .., we will look at a case study that, according to Lightfoot (), illustrates such a set of simultaneous (apparently unrelated) changes, leading to the emergence of modals, a new class of auxiliary verb.
.. Stepwise (incremental) change In functional, usage-based accounts of grammatical change, there is no a priori need for change to occur between one generation and the next, and therefore no theoryinherent requirement for any abrupt changes in grammatical representation. Instead, change is conceived of as a gradual process by which lexical items in the process of grammaticalization, for instance, become increasingly bleached with respect to their original semantic content and take on more and more grammatical meaning (see e.g., Hopper and Traugott ). Moreover, old and new uses of a grammaticalizing item exist side by side, both in the use of individual speakers and in the language community. As Traugott and Trousdale (: f.) point out, [m]icro changes are discrete . . . and, as conventionalizations, cognitively abrupt (in a tiny way) for individual speakers. However, on the assumption that innovation is not change, only consolidation of an innovation via transfer to a community is, changes at the level of the community are not discrete/abrupt.
In section .., we will therefore contrast the generative account of the evolution of modal auxiliaries with that given in grammaticalization theory. To sum up, grammatical change takes place in the individual’s acquisition of a language (i.e. between generations) in the generative approach whereas functional, usage-based models conceive of innovation as being initiated by the individual, but grammatical change as happening on the level of the language community.
. Mood and modality in English
.................................................................................................................................. Modality, in a broad definition, is a cover-term for linguistic strategies used to express speaker attitudes towards the proposition expressed in the clause, typically encoded on the lexical verb or within the verb phrase, but also elsewhere in the sentence, e.g., through modal adverbs like possibly. The focus here is on verbal expression of speaker attitudes. This can be done through inflectionally marking the lexical verb, i.e. grammatical mood, or through periphrastic means, i.e. the combination of a base form of the lexical verb with a modal auxiliary. For more details on mood and modality in English, see Ziegeler, this volume.
.. Mood: the (near) loss of an inflectional category

Old English had inherited a (fairly) systematic distinction between indicative and subjunctive mood from Proto West Germanic. Subjunctives were mostly used in subordinate clauses to express counterfactuals and following verbs of command and desire (where they continue to be used in Modern English) but also in concessive clauses and (variably) in reported speech or after certain verbs such as þencan ‘to think/consider’.6 The inflectional paradigm for the strong verb bīdan ‘await’ (Table .) can be used to illustrate the different forms for early West Saxon; distinct subjunctive forms have been italicized.

Table . Inflectional paradigm for OE bīdan ‘await’ (based on Hogg and Fulk : )

             Present                       Past
             Indicative   Subjunctive      Indicative   Subjunctive
Sg. 1        bīde         bīde             bād          bide
    2        bīdst        bīde             bide         bide
    3        bītt         bīde             bād          bide
Pl.          bīdað        bīden            bidon        biden

6 Traugott (: ) points out, though, that ‘[c]hoice of mood (indicative vs. subjunctive) in complements is extremely complex, and is not adequately understood. . . . there appear to be no or at least few absolute rules.’ On semantically ‘empty’ subjunctives, see Fischer and van der Wurff (: f.).

Towards the end of the Old English period, the system began to collapse, however, with the generalization of plural -on to the subjunctive, thus neutralizing the mood distinction in the preterite plural (Hogg : ). The ending weakened to schwa in
late Old English (Lass : ). Further neutralization took place during the Middle English period with the loss of /-n/ in the plural; once word-final schwa disappeared, eventual loss of the ending occurred (Lass : ). Finally, when the second person singular pronoun fell into disuse in the Early Modern Period, one of the few remaining mood distinctions disappeared, leaving Present-Day English with only residual use in very restricted contexts. A remnant of a historical present subjunctive is identifiable in mandative contexts (following expressions of command, suggestion, and so on) in instances where the bare form of the verb occurs with a third person singular subject (see ()). The bare form of be is also used in mandative and conditional contexts (see () and (), respectively). In hypothetical conditional clauses, finally, were is a remnant of a historical past subjunctive with first and third person singular subjects (see examples () and ()).7

() The rigid command structure of the Tenth Empire demanded that he deal with Chufu, the bureaucrat in charge of the slave mines. (FROWN, N)

() Lunacharskii also stipulated in this contract that his wife, Natalia Rozenel, be given the female lead – at a time when directors were being fired for nepotism. (FROWN, F)

() If the truth be told, . . . (FROWN, D)

() “I felt that if I were kind to you, maybe they wouldn’t make me ride back in the luggage car.” (FROWN, N)

() It sounded as if the man were calling him: . . . (Brown, N)
With respect to the use of subjunctive forms, López-Couso and Mendez-Naya (: –), on the basis of evidence from the Helsinki Corpus, show that clearly marked subjunctive forms as complements to requests and commands decrease from per cent at the beginning of the Old English period to just per cent at the end of the Middle English period. The inflectional subjunctive, with its drastically reduced paradigm, survives in formulaic contexts (such as Long live the Queen) and as a relic form after certain conjunctions (e.g., Lest there be any doubt). In addition, it is still regularly used after certain mandative triggers (such as important, request, advise) where its use even increases in the twentieth century (see section ..), whereas past subjunctive were in hypothetical counterfactuals is on the decline (see Leech et al. : –). While English thus (largely) lost the inflectional means to express modality on verbs, it gained a new class of verbs that still allow it to express modal meaning in the verb phrase. The focus in the next section will be on the development of the core modals, i.e. 7 There have been debates in the literature over how the subjunctive in Present-Day English should be analysed (e.g., whether it should be considered an inflectional category). See Aarts () for a discussion of the issues.
verbs like will, would, can, could, rather than semi-modals used to, dare to, want to, etc. (see Krug ) or modal verbal idioms like try to/try and (see e.g., Hommerberg and Tottie ) or (had) better (see section ..).
.. Modality: the rise of a new class of verbs

The development of the majority of modal auxiliaries from preterite present verbs8 has been used as a paradigm case to model grammatical change within both generative and functional, usage-based approaches. The focus in what follows will be on the syntactic changes involved in this grammatical change; for details on semantic changes that accompany the development, see Ziegeler (this volume) and section .. below. Common ground for all accounts of the modal story is that the ancestors of verbs like will, might and can (i.e. Old English willan, magan, and cunnan) were (full) lexical verbs taking nominal elements (an NP or clause) as their complements, as in the following example (with sibbe ‘peace’ as the direct object of wolde ‘wanted’):

() þæt he geornor wolde sibbe wið hiene þonne gewinn
   that they sooner wanted peace with him than conflict
   ‘that they wanted peace rather than conflict with him’ (Alfred, Oros. (Sweet) , –)

But as early as Old English, these verbs could also combine with the infinitive of other lexical verbs, as with forspillan (‘destroy’) in the following example:

() þa wolde he hiene selfne on þæm gefohte forspillan
   then wanted he him self on that battle destroy
   ‘then he wanted to destroy himself in the battle’ (Alfred, Oros. (Sweet) , )

The inflectional properties of Old English pre-modals, too, resembled those of full verbs in that they mostly agreed with the subject for person (with some exceptions) and number.9 In Present-Day English, modal verbs no longer take complements of the type illustrated in () but exclusively combine with the base form of lexical verbs in complex verb phrases; moreover, they have all the NICE properties typical of auxiliary verbs (see Huddleston and Pullum : –), as illustrated in the following examples:10

8 Preterite present verbs are verbs that combined a historical past tense form, which had acquired present-tense meaning, with a new (weak) past tense marker.
9 The main exception, for historical reasons, involves present tense third person singular forms.
10 Note that negation and inversion are not innovative features of core modals. These properties used to be shared by all verbs, but the grammaticalization of do-support for lexical verbs in the history of English restricted their use to auxiliaries.
() She wouldn’t move. (Negation)

() Would she move? (Inversion)

() She would move, and he would, too. (Code)
() She woúld move (if you asked her nicely). (Emphasis) In Present-Day English, core modals take direct negation without do-support and with negative contraction on the verb (N), and they allow interrogative formation by simple inversion (I); in addition, they can be used in reduced constructions with ellipsis of the main verb (C), and carry nuclear stress in positive declaratives (E). For all of these properties to apply, substantial changes had to take place between Old English and Present-Day English, for which generative and functional accounts differ substantially. The changes (loss of their original syntactic and morphological characteristics) are described in Lightfoot (: –) and summarized in Fischer and van der Wurff (: ). In the following, the focus shall be on the general line of argumentation in generative and usage-based accounts rather than on the details of the change itself. Fischer and van der Wurff () also provide a detailed critique of Lightfoot’s account,11 which distinguishes two phases for the change: an early one of apparently unconnected losses that stretches over a longer period of time (from Old English to the end of the Middle English period), and a second phase of connected changes that occur more or less simultaneously in the sixteenth century. According to Lightfoot (: ), [t]he reanalysis was provoked by a number of changes which made it unclear whether the pre-modals were verbs or a unique category. . . . Thus the category membership of pre-modals became opaque and the grammar moved to avoid such opacity.
The criticism of Lightfoot’s account mainly focuses on the difficulty of dating syntactic loss, with scholars providing examples of patterns during the second phase that should have been extinct after the first phase (e.g., instances of the pre-modals where they take a nominal complement after ; see e.g., Warner and Fischer and van der Wurff ). Moreover, Lightfoot’s account builds on the assumption that the preterite present verbs were ‘normal’ full verbs in the Old English period, an assumption which has also been questioned: most of the verbs that developed into modals showed some unusual morphological properties and syntactic behaviour as early as Old English (one of the morphological peculiarities being that the historical strong preterite forms had developed present-tense meanings, and new, weak past tense forms had evolved, which in turn lost their past tense meaning as they developed into modal verbs). In addition, functional accounts of the development draw on the observation that the difference between full verbs and auxiliaries is not a categorical one but rather of a gradient nature: core modals display all of the NICE properties, with marginal modals 11
For critical discussions of Lightfoot () up until the early s, see Denison (: –).
and semi-modals like need (to) and have to showing variability with respect to the criteria that are used to distinguish between full verbs and auxiliaries (see e.g., Leech et al. : –). The gradient nature of the distinction also holds for the history of English: Warner (: ) points out that variable do-support in the seventeenth century makes for category overlap of the two kinds of verb during that period. We could argue that, to the extent that do-support is still variable with full verb have in formal contexts (see examples in ()), and in view of the fact that a marginal modal like dare occasionally takes do-support in interrogatives and negated sentences (see examples in ()),12 N(egation) and I(nversion) remain somewhat problematic as diagnostics for auxiliarihood.

() a. But I haven’t anything worth leaving! (BNC, A)
   b. They can teach me so much about the important things in life I don’t have anything so important to teach them. (BNC, HU)

() a. She dared not wait more than an hour or so, . . . (FLOB, P)
   b. I did not dare to breathe the words ‘Civil Aviation’. (FLOB, G)
According to Barber (: f.), dare is even moving towards full-verb syntax in interrogatives and negated sentences (i.e. occurring with do-support rather than inversion/bare negation); while dare is a low-frequency item and infrequently attested in the Brown-family corpora, the little evidence that we do find does not support Barber’s view (see Table .).
Table . Full-verb, mixed, and auxiliary syntax of dare in interrogative and negated sentences in the Brown-family corpora of American and British English (AmE and BrE)
[Table structure: rows for B-Brown (AmE, ), Brown (AmE, ), Frown (AmE, ), AE (AmE, ), B-LOB (BrE, ), LOB (BrE, ), FLOB (BrE, ), BE (BrE, ), and Total; columns for Full verb, Mixed, Auxiliary, and Total.]

12 Note that dare also allows for mixed patterns that combine, e.g., do-support with a bare infinitive, as in the following example: ‘To me, all premarital sex was immoral; if I did not dare have sex with my own girl-friend, I could scarcely imagine it with anyone else’ (Frown, G). For variable do-support with semi-modal got to, see Mair (a).
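A rough sense of how tokens of dare in interrogative and negated clauses can be sorted into the three syntactic types of Table . can be given with a few string patterns. The sketch below is not the procedure behind the table; the patterns and the example sentences are simplified illustrations only.

```python
# Rough sketch (not the chapter's method): sorting dare tokens in negated/interrogative
# clauses into full-verb, auxiliary, and mixed syntax. Patterns and sentences are
# simplified, invented illustrations.
import re

PATTERNS = [
    ("full verb", re.compile(r"\b(?:do|does|did)(?:n't| not)? .*\bdare to\b", re.I)),
    ("mixed",     re.compile(r"\b(?:do|does|did)(?:n't| not)? .*\bdare\b(?! to)", re.I)),
    ("auxiliary", re.compile(r"\bdare(?:d)?(?:n't| not)\b", re.I)),
]

def classify(clause: str) -> str:
    for label, pattern in PATTERNS:
        if pattern.search(clause):
            return label
    return "unclassified"

for s in ["She dared not wait more than an hour.",
          "I did not dare to breathe a word.",
          "I did not dare have sex with my own girl-friend."]:
    print(classify(s), "-", s)
```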
The semantic changes involved in the grammaticalization of the new category of verbs are analysed as gradual rather than abrupt by scholars working within the functional paradigm (see Ziegeler, this volume, for details) and therefore do not support the idea of a catastrophic change, either. Another example of a gradual rather than abrupt development is the loss of non-finite forms of pre-modals which, according to Lightfoot (), however, was part of the ‘catastrophic’ change in the sixteenth century. Warner (, ) and Fischer () maintain that the development is more complicated, differs from verb to verb, and, most importantly, is gradual rather than abrupt.
.. Modals: grammaticalization and constructionalization Modal verbs in English have been discussed as a prime example of grammaticalization (e.g., Heine )13 but recently also as an instance of constructionalization (e.g., Traugott and Trousdale ). Grammaticalization and constructionalization are closely related concepts: according to Fried (: ), grammaticalization aims to account for the series of changes involved in the change from lexical items into grammatical patterns, whereas construction grammar’s focus is on capturing the incremental nature of the change in that it specifically also takes frequency changes into account. But the case of the modal idiom had better shows that constructionalization is different from grammaticalization in that it also covers partial developments, as we will see. The syntactic and morphological changes involved in the grammaticalization/ constructionalization of the modals were discussed above (section ..). In this section, we will briefly look at some of the semantic changes connected with the emergence of modal verbs before moving on to an account of the modal idiom (had) better as an example of constructionalization. The study of the semantic changes that are part of the grammaticalization of modal auxiliaries provides a good argument for grammatical change that can stretch over centuries and be characterized by long periods where different meanings exist side by side. Moreover, ambiguity or semantic underspecification for individual occurrences of modal verbs is highly relevant, as various studies on English modals have shown (see e.g., Traugott , Palmer ). However, in terms of their general developmental paths, most scholars agree that the pre-modals first took on deontic meaning, i.e. expressing permission or obligation (so-called ‘root’ modality), and that epistemic modality evolved later.14 In other words, the semantic change involved is from more ‘objective’ towards more ‘subjective’ meaning. For sculan/shall, Traugott () provides the following examples of a clearly deontic () and a (somewhat weaker) epistemic use (); the fact that these uses are already attested in Old English has 13
For additional references, see the chapter on modality by Ziegeler, this volume. For an alternative view, according to which deontic and epistemic modality may evolve simultaneously, see Narrog (b). 14
been taken as further evidence of the long-drawn out process of change from lexical to auxiliary verb. ()
Utan nu brucan þisses undernmetes swa þa sculon þe hiora
Let-us now enjoy of-this breakfast as those must/shall that their
æfengifl on helle gefeccean sculon.
supper in hell receive must
‘Let us now enjoy this breakfast as befits those who must eat their supper in hell’ (Or p. ; , lines ; –; from Traugott : )
()
& to þam Pentecosten . . . wæs gesewn blod weallan of eorþan.
And at that Pentecost was seen blood to-well-up from earth.
Swa swa mænige sæden þe hit geseon sceoldan.
As many said that it see should
‘and at that Pentecost . . . blood was seen welling up from the ground, as many said who supposedly saw it.’ (ChronE (Plummer) .; from Traugott : )
Traugott (: f.) provides evidence for other modals, such as must, to argue that the semantic development is from deontic via weak subjective epistemic towards clearly epistemic meaning. Epistemic uses were typically supported by an epistemic adverb such as nedes in the context in Middle English, as in (), and only later occurred without such adverbial support, as in (), which can be taken as evidence that epistemic meaning has clearly grammaticalized: ()
He that dooth good & doth not goodly . . . must nedes be badde. ‘Whoever does good, but does not do it with good intentions . . . must necessarily be bad.’ ( Usk, Testament of Love (Skeat) , ; from Traugott : )
()
This must have been a sad shock to the poor disconsolate parent. ( Goldsmith, Cit. World lxxi; from Traugott : )
Importantly for a usage-based grammaticalization account, the process involved overlap and did not occur simultaneously for all verbs: ‘ . . . many earlier meanings coexisted with later ones, and doubtless constrained the development of later ones’ (Traugott : ). According to Traugott and Trousdale (: ; : ), the emergence of the modal schema can also be seen as a process of constructionalization that gave rise to a new form–meaning pairing, which in turn is part of the auxiliary schema (i.e. a subschema in a larger network; see also Hilpert, this volume, section ..). Once the schema exists, it can acquire new members, also at the periphery, as the case of (had) better will show. This view of the process fits in with one of the basic tenets of construction grammar, i.e. that our knowledge of language amounts to a network of constructions (i.e. form–meaning pairings) or the ‘construct-i-con’ in Goldberg’s (: ) words.
[Diagram: ‘it is better’ proverbs and subjectless better proverbs (Øsubject Øhad better) feed into the reduction chain SUBJECT had better > SUBJECT ’d better > SUBJECT Ø better.]
. The grammaticalization chain for the modal idiom (had) better in the Late Modern Period (after van der Auwera et al. : )
Let us now turn to the modal idiom (had) better. Denison and Cort () detail its development from the Old English period, looking at syntactic as well as semantic and pragmatic changes involved in the process. They refrain from identifying better constructions unequivocally as an instance of grammaticalization, however, mainly because, according to them, the change is not necessarily unidirectional towards modalhood. Within a constructionalization account, this would not pose a problem since inheritance of properties from the schema can also be partial. For the Late Modern Period, van der Auwera et al. () provide empirical evidence from the Corpus of Late Modern English Texts (CLMET) substantiating the later developments, which are summarized in Figure .. In a nutshell, in a first step, the full form had better shows phonological reduction to ’d better. A parallel reduction of the subject and verb in so-called better proverbs of the type (It is) better (to) XP (than XP) (e.g., Better the devil you know than the devil you don’t; see Denison and Cort : ) contributes to further reduction in form of the better modal idiom (see also Krug : ). This aspect of the development also makes sense within a construction grammar framework, where the modal idiom better would be seen as inheriting the option of a zero subject and zero verb from the better proverbs. In Present-Day English, there is some spurious evidence from child language that better might be on its way to becoming fully verbal (as in ()), and thus offering the potential for further reanalysis.15
15 Van der Auwera et al. (: ) attribute this example to Sturtevant (), but it is typically quoted as evidence for grammaticalization discussed in Palmer (: ) and later analyses. Cruttenden (: ) maintains that instances like You’d better go, bettn’t you? are examples of a ‘false modal’ use attested in child language; whether errors in acquisition may lead to language change is still a somewhat controversial issue (see the discussion in Lieven ).
() I better go now, bettn’t I?
(quoted from van der Auwera et al. : )
A search in the eighty-million-word Oxford Children’s Corpus did not yield a single instance of bettn’t,16 however, despite the fact that there is evidence of the use of spoken features in children’s writing (see Hundt ). In the course of its history, (had) better has come to share a number of important syntactic characteristics with core modals: it lacks non-finite variants and has an invariable verbal element had, which is historically a past tense form but does not express past tense meaning any more. In addition, (had) better always combines with the base form of another verb rather than a to-infinitive; shows contraction of the verbal element or even ellipsis (see examples () and (), respectively); and takes simple not in negation (and not do-support, like lexical verbs). The negator can attach to or follow the verbal base (as in () and (), with (b) showing subject ellipsis); it can also follow better (see ()), typically (but not exclusively) with verbal ellipsis: () I said, ‘You’d better take it off, or else you’ll get took off ’ . . . (OBP, ) ()
You better go and look for the things under some trees in the corner of a garden three or four gardens away from the house where they took the things. (OBP, )
()
a. When in their cups, beer-drinkers were heard to mutter that the bugger hadn’t better come back, neither . . . (COCA, , FIC) b. “Hadn’t better try the boots on, the way my feet are swollen.” (COCA, , FIC)
() I call’d Wood into a Corner, and ask’d him whether he had not better save his own Life and make a Discovery? (OBP, ) ()
a. . . . then these two men said, you had better not meddle with him, you may be brought into bad bread. (OBP, ) b. Well, let me tell Willis, that he’d better not try that trick on me. (TIME MAGAZINE, August ) c. “Better not go in there, buddy,” said the cop. (TIME MAGAZINE, October ) d. Still, one better not dwell on Piddlington. (COHA, , FIC) e. They better not put him on the heater because he would just melt away. (COCA, SPOK, )
In terms of frequency developments, van der Auwera et al. (: ) show that the full form had better declines significantly in British English during the twentieth century, 16 The corpus contains a collection of writing by children aged thirteen or younger from a BBC writing competition in /. Thanks go to Nilanjana Banerji (Oxford University Press) for running the search in the corpus for me.
whereas the reduced form ’d better increases to the same extent; the more fully grammaticalized bare form better is attested from the eighteenth century in their data but only slowly increases during the twentieth century. With respect to semantic changes, their evidence further corroborates the shift from earlier deontic meanings to later epistemic uses, which remain rare even in Present-Day English (van der Auwera et al. : –). The earliest epistemic (what they call ‘optative’) use from their data is ():17 ()
I haven’t any energy left. I don’t understand things. This had better be the end of it. Let’em sell the stock and take him down,” said the old man, . . . (CLMET , Dickens, Dombey and Sons; quoted from van der Auwera et al. : ).
The fact that the (had) better construction shares some, but not all, properties with the core modals is amenable to modelling modality within the construction grammar idea of a network of constructions.18
.. Syntactic demise: being to V

The story of mood and modality in English is not only one of inflectional loss. The language also lost some of the periphrastic constructions that had developed from Old English onwards. One example of such syntactic loss is the demise of non-finite variants of the semi-modal be to, which enjoyed a rather short lease of life between the end of the sixteenth century and around (see () for the earliest attested example in OED; the last, according to Visser –: §, is from ). However, in a corpus-based study of the demise of non-finite being to V, Hundt () found that the construction is attested considerably later: into the early s and, occasionally, even in the twenty-first century in (American) speech (see examples () and (), respectively).

()
He him selfe being to iudge all men, is to bee iudged of no man. ( FULKE in Confer. III. (), OED s.v. be v. IV. c.)
()
“ . . . What is the usual meaning of the word impersonal?” “Indefinite.” “Indefinite to the point of being non-existent, you mean?” “Perhaps not quite so much as that. Indefinite to the point of not being to be bothered about.” (COHA, NON-FICTION )
Denison and Cort (: ) quote an earlier example from , but also point out that this is ‘unusually early’. 18 Similarly, Krug’s (: ) gravitation model of emerging English modals could be integrated conceptually into a constructionalization framework and the notion of the construct-i-con. 17
()
They can’t change their orientation; and here they are, being to be close to children and it’s abominable. (COCA, CNN transcripts )
The finding of the corpus-based study on the (near-complete) demise of being to V is that English lost an extremely low-frequency pattern. Chronologically, this development is linked to the grammaticalization of the progressive passive (e.g., was being watched), which became available in the second half of the eighteenth century. Data from large corpora thus mainly corroborate the accounts given in earlier studies (e.g., Warner : ; for the earliest attested progressive passive, see van Bergen ). Warner () argues for a parametric change, though. As Hundt (: ) points out, the fact that being to V is still occasionally attested beyond the period when the progressive passive developed speaks against treating this change within a strict (parametric) formalist approach, where the rise of the construction that brings about the systemic change (i.e. the progressive passive) should block the ‘old’ pattern from appearing. Corpus evidence does not corroborate an alternative hypothesis, namely that being to V was ousted by its near-synonym having to V. Even though the two constructions are not fully equivalent in all contexts (see Hundt for details), they could be used with the same meaning in some contexts, as the following examples illustrate: ()
Having to meet Arabella here, it was impossible to meet Sue at Alfredston as he had promised. (Thomas Hardy, , Jude the Obscure, quoted from NCF)
()
Being to seek his food he would hunt for it. (OED, , Holcroft Procopius I. )
Data from the Old Bailey Proceedings (OBP) corpus show that being to V had already declined significantly before the non-finite alternative having to V started spreading in the second half of the eighteenth century (see Figure .). Corpus evidence thus does not provide an alternative explanation to the one offered by earlier studies (e.g., Warner ), but adds important details to the chronology of the development, and, crucially, evidence of the occasional use of the obsolescent pattern beyond the s, which is relevant from a theoretical point of view.
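Frequencies such as those plotted in Figure . are normalized rates rather than raw counts. The following minimal sketch, which is not the chapter’s own code, shows the usual per-million-words normalization that makes sub-corpora of different sizes comparable; all counts and corpus sizes below are invented placeholders.

```python
# Illustrative only: normalizing raw corpus counts to frequencies per million words,
# as in comparisons of being to V vs. having to V across sub-periods.
def per_million(count, corpus_size_words):
    return count / corpus_size_words * 1_000_000

subcorpora = {
    "1600s": (12, 800_000),    # (raw hits, corpus size in words) - hypothetical figures
    "1700s": (9, 1_400_000),
    "1800s": (2, 6_900_000),
}
for period, (hits, size) in subcorpora.items():
    print(f"{period}: {per_million(hits, size):.2f} per million words")
```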
.. Grammatical revival: the case of the mandative subjunctive After mandative or suasive expressions such as ask, stipulate, proposal, or essential, a that-clause in English can contain either a modal auxiliary like should or a verb inflecting for subjunctive mood (be for all persons but limited to third person singular subjects and present tense for all other verbs; for subjunctives, see examples () and ()
[Figure: bar chart of frequencies per million words for being to V and having to V in the 1600s, 1700s, 1800s, and 1900s.]
. Being to V and semi-modal having to V (frequency per million words) in the OBP corpus (based on data from Hundt : )
in section .. above).19 Various corpus studies have shown that the subjunctive, after decreasing to the point where it had become the minority variant in these contexts (e.g., Hundt : ), sees a revival in American English in the twentieth century, with other varieties (e.g., New Zealand, Australian, and British English) following suit (see Övergaard , Hundt a, Leech et al. ). Hundt and Gardner () provide additional data from the s prequels to the standard reference corpora of British and American English (BrE and AmE), i.e. B-LOB () and B-Brown (), respectively, which allows them to corroborate the change in twentieth-century English on both sides of the Atlantic. Figure . shows that by the s, AmE is clearly ahead of BrE in the revival of the mandative subjunctive.20 In fact, this is one of the changes where BrE is lagging behind significantly in a development that has reached near completion in written AmE: the only significant diachronic change emerging from a pair-wise comparison of the data in Figure . is the one between the s and s BrE corpora (LOB and F-LOB). In other words, the increase that, according to Övergaard (: –), occurred in AmE 19
In a sentence such as I suggest that we base our study on the analysis of representative corpus data, the verb base is underspecified with respect to mood (i.e. indeterminate between a subjunctive or indicative interpretation). Such examples are not included in the data set on which Figure . is based. The indicative (e.g., I suggest that he leaves earlier today) is not attested in the American written corpora and was therefore also excluded from the British data in Figure . to allow for more direct comparison of the two options. Leech et al. (: –) show that it is very infrequent in BrE writing but is regularly used in spoken BrE. 20 Preliminary evidence from DCPSE indicates that the increase of mandative subjunctives in BrE is limited largely to the written medium (see Waller ).
[Figure: bar chart showing, for B-Brown (AmE, 1930s), Brown (AmE, 1961), Frown (AmE, 1992), B-LOB (BrE, 1930s), LOB (BrE, 1961), and FLOB (BrE, 1991), the proportions (0–100%) of subjunctive versus should-periphrasis.]
. Proportion of periphrastic constructions and mandative subjunctives in the Brown family of corpora (based on Hundt and Gardner : ); total number of variable contexts: B-LOB = ; LOB = ; FLOB = ; B-Brown = ; Brown = ; Frown =
between and had already made the subjunctive so frequent that further significant increases after the s were unlikely. Övergaard () attributes the kickstart that the mandative subjunctive received in the US to language contact between English and languages with subjunctive forms (such as German, French, Spanish, and Italian) spoken by immigrants in the Mid-West. Interestingly, the loss of the were-subjunctive in the protasis of conditional clauses (e.g., if I were kind to you; see () above) is much more advanced in written BrE than in AmE (see Leech et al. : – and Hundt and Gardner : ): frequencies in the LOB corpora drop from about per cent in the s to just above per cent in the s, whereas those in the AmE corpora start from the slightly higher level of . per cent in the s (B-Brown) and fall to just under per cent in the s (Frown). Together, the revival of the mandative subjunctive and the retarded change towards indicatives in conditional clauses indicate that (written) AmE, overall, has been a relatively ‘subjunctive-friendly’ variety in the twentieth century.
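As a purely illustrative sketch of how variable contexts of this kind can be retrieved and classified, the following Python fragment looks for a handful of mandative triggers followed by a that-clause and checks whether should occurs in it. The trigger list, patterns, and sentences are invented simplifications and do not reproduce the retrieval procedures used in the studies cited above; in particular, indeterminate cases of the kind discussed in footnote 19 would require further manual analysis.

```python
# Illustrative only: retrieving mandative that-clauses and splitting them into
# should-periphrasis vs. potential subjunctive/other readings.
import re

TRIGGERS = r"(?:demand|suggest|insist|request|stipulat|recommend|propos)\w*"
MANDATIVE = re.compile(rf"\b{TRIGGERS}\b[^.?!]*?\bthat\b([^.?!]*)", re.I)

def classify(sentence: str):
    match = MANDATIVE.search(sentence)
    if not match:
        return None
    clause = match.group(1)
    return "should-periphrasis" if re.search(r"\bshould\b", clause, re.I) else "subjunctive/other"

examples = [
    "The committee demanded that he resign immediately.",
    "She suggested that the report should be rewritten.",
]
for s in examples:
    print(classify(s), "-", s)
```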
.. Recent change in core modals and modal constructions One of the most noticeable recent grammatical changes that emerged from Leech et al.’s () study of BrE and AmE on the basis of the Brown-family corpora is the significant decline in core modal verbs and a substantial rise in the frequency of semimodals; while Leech et al. () use written data, the same general trend has also been confirmed in spoken English (see e.g., Close and Aarts or Bowie et al. ). This section discusses the overall development in these two related areas of English grammar.
[Figure: bar chart of raw frequencies in 1961/2 and 1991/2 for the core modals would, will, can, could, may, should, must, might, shall, ought (to), and need(n’t).]
. Diachronic development of core modals in the Brown-family of corpora (based on Leech et al. : ; raw total frequencies in BrE and AmE pooled together; all corpora are approximately ,, words in size)
Figure . presents data on the core modals. The decrease of could is the only one that does not prove statistically significant in a log-likelihood test, and can is the only core modal that shows a slight (but non-significant) increase (see Leech et al. : ).21 In addition to the decline in frequency, some core modals also exhibit what Leech et al. (: ) refer to as ‘paradigmatic atrophy’. All modals are anomalous (for historical reasons) in that they lack person marking in the third person singular present indicative. Like the verbs they evolved from, they have largely lost the original tense opposition (e.g., the main difference between shall and should is a semantic rather than a temporal one).22 Additionally, a verb like shall is restricted with respect to the persons it can take as its subject: in BrE, especially, it almost exclusively takes first person subjects (as in Shall I close the window or Shall we start?).23 The few instances where it 21 Strictly speaking, two data points are not sufficient to verify diachronic developments, but subsequent studies (e.g., Smith and Leech ) have verified the trend. 22 There is, however, variation among the modals in this regard, with past-time uses of could and would still quite frequent. 23 Note that in some regional varieties of English (notably Scottish and Irish English, and occasionally in regional varieties of New Zealand English where Scottish influence was strong), will has encroached on the domain of shall even with first person subjects in offers and suggestions, giving rise to sentences like ‘Babe, will I ring us a taxi and take you to the doctor?’ (Wellington Corpus of Written New Zealand English, K).
is still attested with a second or third person subject are stylistically marked in BrE: they are either instances of archaic or specialized usage from legal and administrative writing (see Leech et al. : ). The situation, at first glance, appears to be somewhat different in the International Corpus of English (ICE), which also includes spoken data: Collins (: f.) finds that, in ICE-GB, ICE-Aus, and comparable data for AmE, speakers regularly use deontic shall with third person subjects: . per cent or instances of all deontic uses of shall in his Australian, British, and American data violate the apparent ‘rule’ that shall is only used with first person subjects. This observation does not distinguish between evidence from the spoken and written parts of the corpus and pools the results for all three varieties. A closer look at ICE-GB, only, reveals that of the ninety-four instances of shall in the written part of the corpus, almost half ( or . per cent) illustrate co-occurrence with a third-person subject. However, on closer inspection, these turn out to be very unevenly distributed, with / from only two texts in the administrative writing section of the corpus. The remaining instances are from similarly formal contexts (such as the formal letter from which example () is taken) or from quotations of legal texts (as illustrated in example ()): ()
I am writing simply to confirm with you (and, by copy of this letter, with all the other parties involved) that it has been agreed between the inventors and the College’s exploitation staff that all ongoing work on and ideas as to their possible application shall be treated as a team effort. (ICE-GB, WB-)
() Local authority residential homes were established by Section of the National Assistance Act which stated that: it shall be the duty of every local authority . . . to provide residential accommodation for persons who by reason of age, infirmity or any other circumstances are in need of care and attention which is not otherwise available to them. (ICE-GB, WA-) The ICE-GB written data thus confirm rather than contradict Leech et al.’s observation of an uneven distribution of shall across text types, which they refer to as ‘distributional fragmentation’ and take to be another characteristic of the obsolescence of core modals (Leech et al. : ). The question is whether any of these changes in the area of core modals are related to frequency developments of semi-modals. Figure . shows the development of the semi-modals in British and American English between and /. An important detail that emerges from the comparison of Figures . and . is that the changes do not happen on the same scale: modals are, overall, much more frequent to start with and the decrease (over , instances) is very substantial (most notably the drop in usage that must and may display). The increase in semi-modals (a total of instances) does not compensate for the dramatic loss in core modals in writing.24 24
There are three notable exceptions to the general development of semi-modals: BE to, (had) better, and (HAVE) got to decrease rather than increase, the first of these even significantly so.
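The log-likelihood test mentioned above in connection with Figure . compares an item’s frequency in two corpora of known size. The following minimal sketch is not Leech et al.’s own code, and the counts are invented placeholders; it shows only the standard two-corpus G2 calculation familiar from corpus linguistics.

```python
# Illustrative only: the 2x2 log-likelihood (G2) statistic used in corpus linguistics
# to test whether an item's frequency differs between two corpora.
import math

def log_likelihood(freq1, size1, freq2, size2):
    """G2 for an item occurring freq1/freq2 times in corpora of size1/size2 words."""
    expected1 = size1 * (freq1 + freq2) / (size1 + size2)
    expected2 = size2 * (freq1 + freq2) / (size1 + size2)
    g2 = 0.0
    for observed, expected in ((freq1, expected1), (freq2, expected2)):
        if observed > 0:
            g2 += 2 * observed * math.log(observed / expected)
    return g2

# e.g. a modal attested 1,200 times in a one-million-word 1961 corpus and 950 times
# in a one-million-word 1991/2 corpus (hypothetical figures):
print(round(log_likelihood(1200, 1_000_000, 950, 1_000_000), 2))  # > 3.84 means p < 0.05
```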
[Figure: bar chart of frequencies in 1961/2 and 1991/2 for the semi-modals BE able to, BE going to, BE supposed to, BE to, (had) better, (HAVE) got to, HAVE to, NEED to, and WANT to.]
. Diachronic development of semi-modals in the Brown-family corpora (based on Leech et al. : )
Leech et al. (: –) observe that evidence from corpora of spoken English (DCPSE, the spoken component of the British National Corpus and the Longman Corpus of Spoken American English) points towards a much more frequent use of semi-modals in the spoken medium than in writing, which they take as a tentative argument for a development where . . . the competitive relation between core modals and grammaticalizing semimodals in spoken English is an explanatory factor in accounting for the decline of the one and the ascendancy of the other in both spoken and written English. The balance of gain and loss in the spoken language, it seems, has a knock-on effect in the written language, even where that gain/loss equation does not (yet) materialize in the written language. (Leech et al. : )
In a detailed study on variation between must and have (got) to, Close and Aarts () find that it is in the function of root modality that have to has been replacing must in spoken English. In terms of the semantics, Leech et al. (: ) point out that there is no simple one-to-one relation between core and semi-modals: for the expression of necessity and obligation, for instance, English has a set of four modals (must, should, ought (to), and needn’t) and four semi-modals (HAVE to, (HAVE) got to, need to, and (had) better) sharing the semantic space. The story of the loss of modals is further complicated by
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
the fact that the envelope of variation is not simply one where we find core modals on the one hand and semi-modals on the other hand, but one in which modal adverbs such as necessarily, the mandative subjunctive (see section ..) and modal lexical expressions like be obliged to offer additional means to express the modal semantics (in this case, necessity and obligation). But even taking all these alternative means of expression into account does not fully explain the drastic decline in core modals in the recent history of the language, prompting Geoffrey Leech () to conclude that ‘I still don’t know where those modals have gone!’
. Conclusion
.................................................................................................................................. The aim of this chapter has been to give examples of changes in morphology and syntax, including recent change in English grammar, comparing their treatment in different grammatical models, notably the generative versus functional, usage-based approaches. The case studies from the area of mood and modality were used to illustrate instances of both morphological and syntactic loss as well as gain. Moreover, we saw that residual forms of formerly fully functional inflectional paradigms can be revived in a functional niche, as has been the case with the mandative subjunctive in the twentieth century. With respect to the recent decline in the frequency of core modals (see Leech et al. ), we might wonder whether this might eventually lead to a loss of this new category of verbs. Traugott and Trousdale (: –) reinterpret the changes in frequency in the modal and semi-modal inventory within a construction grammar model: The obsolescence of the marginal core modals [such as shall, M.H.] has not yet led to loss of schematicity of the core modal construction, since all members are still used. However, the individual trajectories of each micro-construction suggest that alignment within the larger macro-envelope of core modals is becoming rather weak during obsolescence, and that many core modals are becoming restricted within the system. These are all constructional changes. (Traugott and Trousdale : f.)
In other words, once a constructional schema such as the modal auxiliary has developed, individual members of the category may be lost and the category may be weakened. Alternatively, we could conceive of the modal schema in a more abstract way and envisage the core modal schema as a sub-schema alongside a more recent emergent modal sub-schema (see Krug : ), where micro-constructions (should, shall, be to, (had) better) cluster around these more abstract prototypes, opening up the possibility of migration or attraction of individual members from one sub-schema to another or the hybridization of a member, as seems to be the case with dare (to), which allows both for bare negation (typical of core modals) and negation with do-support (also possible with some semi-modals like have to or used to). In a construction
grammar view of grammatical change it is also possible that a construction simply remains at the margin (e.g., (had) better) and never fully constructionalizes to the sub-schema of the core modals.
Acknowledgements
I dedicate this chapter to the memory of Geoffrey Leech ( January – August ), who(m) I last met when he gave a guest lecture at the English Department in Zürich with the title ‘Where have all the modals gone?’ We are still awaiting a conclusive answer to this puzzling question, Geoff!
List of corpora
ARCHER = A Representative Corpus of Historical English Registers .. –/// /. Originally compiled under the supervision of Douglas Biber (Northern Arizona University) and Edward Finegan (University of Southern California); currently managed by a consortium of participants at fourteen universities; http://www.alc.manchester.ac.uk/subjects/lel/research/projects/archer/.
AE = American English . (period –) Compiled by Paul Baker, Lancaster University. (Parallel in design and sampling principles to BE.) http://www.helsinki.fi/varieng/CoRD/corpora/BE/index.html.
BE = British English . (period –) Compiled by Paul Baker (University of Lancaster). http://www.helsinki.fi/varieng/CoRD/corpora/BE/index.html.
B-Brown = The s BROWN Corpus. –. Compiled by Marianne Hundt (University of Zurich). http://www.es.uzh.ch/Subsites/Projects/BBROWN.html.
B-LOB = The BLOB- Corpus. –. Compiled by Geoffrey Leech and Paul Rayson (University of Lancaster). http://www.helsinki.fi/varieng/CoRD/corpora/BLOB-/index.html.
BNC = British National Corpus. Created by the BNC Consortium led by Oxford University Press. Accessed with Corpus Navigator at Zürich University.
Brown = A standard corpus of Present-day edited American English, for use with digital computers. . Compiled by W. Nelson Francis and Henry Kučera (Brown University). http://www.helsinki.fi/varieng/CoRD/corpora/BROWN/.
Brown family of corpora (Brown, LOB, Frown, FLOB, B-Brown, B-LOB, AE, BE)
COCA = Corpus of contemporary American English (–). Compiled by Mark Davies (Brigham Young University). https://www.english-corpora.org/coca/.
COHA = Corpus of historical American English (–). s.a. Compiled by Mark Davies (Brigham Young University). https://www.english-corpora.org/coca/.
DCPSE = Diachronic Corpus of Present-Day Spoken English (see http://www.helsinki.fi/varieng/CoRD/corpora/DCPSE/).
FLOB = Freiburg-LOB corpus. . Compiled by Christian Mair (University of Freiburg). http://www.helsinki.fi/varieng/CoRD/corpora/FLOB/.
Frown = Freiburg-Brown Corpus. . Compiled by Christian Mair. http://www.helsinki.fi/varieng/CoRD/corpora/FROWN/.
HC = The Helsinki corpus of English texts: Diachronic and dialectal. . Compiled by Matti Rissanen, Merja Kytö, Leena Kahlas-Tarkka, Matti Kilpiö, Saara Nevanlinna, Irma Taavitsainen, Terttu Nevalainen, and Helena Raumolin-Brunberg (University of Helsinki). http://www.helsinki.fi/varieng/CoRD/corpora/HelsinkiCorpus/.
LOB Corpus = Lancaster-Oslo-Bergen corpus. . Compiled by Geoffrey Leech (Lancaster University), Stig Johansson (University of Oslo) (project leaders), and Knut Hofland (University of Bergen, head of computing). http://www.helsinki.fi/varieng/CoRD/corpora/LOB/.
TIME MAGAZINE = Time Magazine Corpus compiled by Mark Davies. https://www.english-corpora.org/time/.
OBP = Old Bailey Proceedings Corpus (–). http://www.helsinki.fi/varieng/CoRD/corpora/OBC/.
WWC = Wellington Corpus of Written New Zealand English. http://www.victoria.ac.nz/lals/resources/corpora-default#wwc.
Non-standard grammatical features
Introduction
The purpose of the present chapter is to discuss non-standard grammatical features of regional varieties of English in relation to their Standard English functional equivalents, focusing on those varieties used primarily by native speakers. The term ‘grammatical variation’ will be used to address differences in form and function between non-standard and standard Englishes. I here approach regional varieties of English from a cross-linguistic, typological perspective. The discussion will for the most part exclude grammatical variation found within Standard English, such as that between the of-genitive and the s-genitive (e.g., the decision of the committee versus the committee’s decision); the influence of genre on such variation is discussed in Chapter , this volume. The term ‘Standard English’ refers to a difficult concept, as is well known. According to Trudgill (b), Standard English is essentially a sociolect—a ‘social dialect’ in his terms—used by the educated middle and upper classes, on the media, in education, and in academia. Crucially, it can be combined with different accents, and is found on different stylistic levels and in different registers (or genres). Widely used grammars of English like Quirk et al. () or Huddleston and Pullum () provide descriptions of Standard English, thus codifying and further standardizing it. We may in principle distinguish several Standard Englishes, as, for example, Standard British and American English, Standard Australian English, Standard Singapore English, Standard Nigerian English, and so on, since in all the relevant territories we find a norm-conscious educated middle class. It is a matter of some debate if some of these Standard Englishes represent separate standards, or if they can be subsumed under British and American English respectively. Australia and New Zealand are traditionally considered territories
in which the British English norm prevails, but we also find positions that view them as Standard Englishes in their own right or as more strongly influenced by North American English, especially more recently. Canadian English is generally subsumed under the North American standard, though in this case, too, we can observe endeavours to recognize it as a separate norm (compare the contributions to Hickey b). Quite typically, we do not find categorical differences between the various Standard Englishes, but distributional differences that are captured by the term ‘gradient grammar’ in the research literature (see Hundt b on New Zealand English; Bresnan and Hay on New Zealand and American English; Hinrichs et al. on American and British English). Such gradient differences will not be the main focus here, though some results of this research strand will be included. Even though the term ‘Standard English’ is very difficult to define, there is widespread consensus that Standard British and North American English represent two norms that speakers and learners of English worldwide use as a point of orientation. What the associated territories have in common is that English is primarily learnt and used there as a first language (referred to as ‘L varieties’ in Chapter , this volume). In Kachru’s () terms, they represent the inner circle of English, although the number of speakers who learn and use English there as a second or additional language has vastly increased over the past decades. In some urban contexts, such as certain parts of London, Toronto, and Vancouver, non-native speakers outnumber native speakers or the situation is close to parity. There is reason to believe that the inner circle Englishes are changing due to these external influences (Cheshire et al. ). Inner circle Englishes contrast with those of the outer circle, which comprise the varieties spoken in the former British and US American colonies, such as India, Singapore, Nigeria, and the Philippines. These will only marginally be of interest here and only to the extent that they preserve non-standard regional features of the inner circle territories (see Chapter , this volume, where these Englishes are called ‘L-varieties’). As the more recent discussion of New Englishes as well as English-based Pidgins and Creoles shows, the historical diffusion of inner circle non-standard features into other parts of the world had long been underrated (Mufwene , Davydova ). Ireland represents a special case since it ceased being a British colony long before the other colonies, lies very close to the British mainland, and shows a substantially higher use of English in comparison to its indigenous language Irish. Accordingly, it will here be treated as an inner circle area. The discussion of regional variants in relation to Standard English can be pursued from three different perspectives. Firstly, we could process the different inner circle territories one by one, offering descriptive surveys of the standard and non-standard features encountered. The main disadvantage of this approach is that it produces considerable overlap, as several locales in the old and new world manifest the same features for reasons of historical migration patterns. Secondly, one could adopt an altogether historical perspective and chart the correspondences between current standard and non-standard features and their respective historical precursors. This approach is quite illuminating, since some of today’s non-standard regional features
can be traced to historical standard features and vice versa (e.g., rhoticity, negative concord, etc.). And thirdly, we can adopt a phenomenological perspective and explore the distribution and variation of specific grammatical features. This approach will be pursued here for a selection of morpho-syntactic features. It is consistent with a cross-linguistic or typological approach to language variation, the main difference being that we here investigate regional varieties of the same language, whereas typology’s prime interest lies in the charting of grammatical variation across genetically different languages. I will introduce the overall approach and its application to the study of regional variation in more detail in section .. Needless to say, in an overview article like this it is impossible to provide a comprehensive and minutely detailed account of English regional variation. There is simply too much out there (compare the electronic World Atlas of Varieties of English, Kortmann and Lunkenheimer ). Instead, I will here focus on a selection of morpho-syntactic variables that are per se informative, come from different grammatical domains, and can be related to the respective standard variants in a systematic way. They come from pronominal systems, tense and aspect, negation, subject–verb agreement, and clause structure.
The typological approach
.. Exploring the patterns and limits of language variation Typological research shows that languages are constrained with respect to their formal means of expression, their functions, and especially the interplay between form and function. In other words, the logically conceivable space of variation is heavily curtailed so that clusters of form types and functions emerge as well as prototypical mappings of form and function. To give an example, languages typically offer dedicated clause types (i.e. form types) for expressing questions and requests (interrogatives and imperatives; see Chapter , this volume, on clause types), but there seem to be no languages with a dedicated clause type for promises or apologies (‘promissives’ or ‘apologetics’). Similarly, we often find dedicated plural and dual morphology, but are less likely to encounter grammatical marking for ‘trials’ or ‘paucals’—though these do exist. Certain phenomena are less likely to occur than others or do not exist at all. The taxonomic goal of charting variation is common to both language typology and variation studies, though they clearly differ in the object of their study. While language typology—at least in the ideal case—considers a representative selection of genetically independent languages from the pool of six to seven thousand attested languages in the world, variation studies is interested in the extent of heterogeneity found in one language. The roots of variation studies can be traced back to the kind of traditional
dialectology pursued in the nineteenth and twentieth century in Europe that saw the advent of comprehensive dialect atlases, primarily for French, English, and German (e.g., the Survey of English Dialects directed by Harold Orton; Orton and Dieth –). Since the boundaries between languages and dialects are fluid, typology and variation studies necessarily overlap. For example, typological studies within the group of Germanic, Romance, or Malay languages could easily be subsumed under variation studies, at least from a cross-linguistic perspective, since the relative differences within these language groups are comparatively small. Language typology and dialectology also differ in relation to the phenomena they are primarily interested in. The main objective of traditional dialectology consisted in the charting of lexical differences and differences in pronunciation. For instance, British dialects show rhotic and non-rhotic pronunciations as well as different realizations of certain segments (voiced initial fricatives versus voiceless ones, as in /zevn/ versus /sevn/). There are many lexical differences (e.g., bairn versus child). More recent sociolinguistic studies investigate phonological and grammatical variants, though the focus is more on the social determinants in urban areas and less on traditional dialect areas. For example, the distribution of the discourse marker like heavily depends on the age and sex of the speakers in certain conurbations (Tagliamonte ). Typological studies are also concerned with the variants of certain variables, although we here talk about categories and their values. Typologists aim to investigate, firstly, whether a certain grammatical category exists in a language; secondly, which values it can assume; and thirdly, in which ways it is encoded. For example, we may be interested in studying the category of number cross-linguistically: whether languages realize it at all; which distinctions are drawn (singular, dual, plural, etc.); and by means of which strategy it is encoded (suffix, prefix, modifier, etc.; see Corbett ). Variation studies define a certain variable and its values at the outset to see which external and internal factors determine their distribution.
.. Language universals If, as outlined above, the logically conceivable space of language variation is constrained, we may ask which points or areas in this space are invariably attested and whether logical connections exist between otherwise independent areas. This brings us to a brief discussion of language universals that can be conceptualized as the crisscrossing of either absolute (i.e. exceptionless) or statistical universals, on the one hand, and conditional versus unconditional universals on the other. Table . shows the resulting four basic types of universals in an overview fashion. True absolute universals are rare and perhaps not even especially interesting. We can here list the fact that all (spoken) languages possess and distinguish between consonantal and vowel sounds, even though they differ heavily in the ratios of consonants to vowels (Maddieson ). Similarly, it is widely believed that all languages chunk the speech signal into constituents at some underlying level of representation, and that
Table . Logical types of universal statement (following Greenberg), taken from Evans and Levinson (: )
Unconditional (unrestricted), absolute (exceptionless): Type . ‘Unrestricted absolute universals’: All languages have property X
Unconditional (unrestricted), statistical (tendencies): Type . ‘Unrestricted tendencies’: Most languages have property X
Conditional (restricted), absolute (exceptionless): Type . ‘Exceptionless implicational universals’: If a language has property X, it also has property Y
Conditional (restricted), statistical (tendencies): Type . ‘Statistical implicational universals’: If a language has property X, it will tend to have property Y
syntactic operations (like passivization, topicalization, etc.) work on constituents rather than individual words. Other widely discussed absolute universals concern recursion, i.e. the stacking of structurally identical syntactic material (John believes that Mary thinks that Sue said that . . . ), and parts of speech systems, where a minimal contrast between nouns and verbs is widely believed to be universal. Most universals, nevertheless, are statistical in nature, meaning that they are attested in the overwhelming majority of languages or at least have several exceptions. For example, although word order systems in which the subject precedes the object definitely represent the norm, there are several languages that show object–subject order in their basic word order pattern. Conditional (or implicational) universals try to capture the observation that certain properties in the logical space of variation occur or tend to occur with other properties. Put differently, some properties come to be predictors of other properties. The abstract phrasing of such conditional universals is shown in (), with (a) describing the case of absolute and (b) that of statistical conditional universals. ()
a. If a language has property X, it also has property Y. b. If a language has property X, it will tend to have property Y.
Such conditional universals may also be stacked, thus forming chains of implicational dependencies referred to as ‘hierarchies’. As for parts of speech systems, certain word classes serve as predictors for others, as illustrated in ()—to be read from right to left (with the presence of one class implicating the presence of the class(es) to its left: e.g., if adverbs are found in a particular language, then adjectives (and nouns and verbs) will also be found there). Here, the implicational connections exist between different values of the same category, i.e. word class. ()
verb > noun > adjective > adverb
(Hengeveld : )
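The predictive logic of such an implicational hierarchy can be made concrete with a small sketch. It is purely illustrative (my own, not drawn from the typological literature): it simply checks that whenever a word class is attested in an inventory, every class to its left on the hierarchy is attested as well. The function name and the sample inventories are invented for illustration.

# Illustrative check of the hierarchy verb > noun > adjective > adverb:
# a class may only be present if every class to its left is present too.
HIERARCHY = ["verb", "noun", "adjective", "adverb"]

def conforms_to_hierarchy(inventory):
    """Return True if the attested word classes respect the implicational hierarchy."""
    present = [cls in inventory for cls in HIERARCHY]
    # Once a class is missing, no class further to the right may be present.
    first_gap = present.index(False) if False in present else len(present)
    return not any(present[first_gap:])

print(conforms_to_hierarchy({"verb", "noun", "adjective"}))  # True
print(conforms_to_hierarchy({"verb", "adjective"}))          # False (adjectives without nouns)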
We can also observe implicational connections between different categories. For example, basic word order is a fairly good predictor of adposition–head order, provided
a language has adpositions. Closely related to conditional universals is the notion of semantic maps. These are graphic representations of related cognitive domains where cognitive relations are rendered in terms of topographic vicinity. The closer the cognitive relations, the nearer they are placed on the semantic map. It is understood that grammatical markers cover adjacent areas on these maps. The territorial connections may be interpreted in various ways, inter alia, grammaticalization paths, historical developments, polysemy patterns, or conditional universals. Applying the typological approach, I will in what follows explore a number of non-standard grammatical subsystems of English. In particular, I will try to show that non-standard does not mean idiosyncratic, but that there can often be detected some systematic relationship to the corresponding standard features. Typological generalizations help to make these systematic relationships explicit.
Pronominal systems
.................................................................................................................................. Let us now turn to a discussion of selected grammatical subsystems of English, starting with pronominal systems. Within this broad domain, I will here be concerned with reflexive pronouns, pronominal gender, and pronominal case.
.. Reflexive pronouns and reflexive marking In comparison to the standard English paradigm of reflexive pronouns, the one found in regional varieties of Great Britain and the United States, though also in other parts of the world (Cheshire et al. : , Kortmann and Szmrecsanyi : ), is shown in example (). The main difference concerns the forms of the third person that involve possessive pronouns just as do those of the first and second person. ()
a. myself, yourself, hisself, herself, itsself b. ourselves, yourselves, theirself/theirselves
Accordingly, the paradigm is more regular than the standard paradigm where object forms of the pronouns appear in the third person. Some illustration can be found in (). ()
a. [He] put his hand to steady hisself on top of the winch. (Southeast of England, Anderwald : ) b. And then they went and locked theirselves out of the trailer. (Colloquial American English, Murray and Simon : )
Changes in the paradigm from myself to meself can also be observed, as shown in (), as the relevant dialects use me as a possessive pronoun (Beal : –, Edwards : , Burridge : , Kortmann : –).
()
a. I thought to meself “Possession’s nine points of the bloody law.” (Australian Vernacular English, Pawley : ) b. I had ten bob. Two bob for meself and eight bob for the board and lodging. (Southeast of England, Anderwald : )
There are also differences in the distribution of reflexive pronouns. In the standard varieties, co-referential noun phrases subcategorized by the same verb obligatorily trigger reflexive pronouns in object positions. Some regional varieties admit simple pronouns in such positions. Example () illustrates this for indirect objects, with first person pronouns being most likely to participate. The distribution of dedicated reflexive pronouns by and large follows the implicational hierarchy in (), i.e. if a variety requires them in the first person, they will also be obligatory in second and third person. This predicts the higher frequency of simple pronouns in the first person. ()
a. I’m going to buy me biscuits and chocolates. (Cape Flats English, McCormick : ) b. He was looking to buy him a house for his family. (Appalachian English, Christian : –)
()
third person > second person > first person
(Faltz : )
In rare cases, we can also find simple pronouns in co-referential direct object positions. These represent archaic cases that reflect Old English usage, where dedicated reflexive markers did not exist. The parallels are illustrated in examples () and (). ()
a. He has cut him. [= himself] b. He went to bathe him. [= himself ] (Yorkshire English, Wright –: volume iii, )
()
hine he bewerað mid wæpnum
him he defended with weapons
‘he defended himself with weapons’
(Old English, Aelfric’s Grammar ., Zupitza )
Especially in Irish English, but for reasons of historical migration also in Newfoundland English (through Irish settlers), reflexive pronouns can be found in subject position. Consider the examples in (). This is a highly unusual pattern, though as argued in Siemund (), these occurrences of seeming reflexives are more appropriately analysed as intensive self-forms, which characterize the relevant referent as important or central (König and Siemund ). ()
a. I’m afraid himself [the master of the house] will be very angry when he hears about the accident to the mare. b. Is herself [i.e. the mistress] at home yet Jenny?
c. A: Could fairies get back to Heaven? – B: Well, I wouldn’t know now. Could yourself imagine they would? (Irish English, Filppula : –)
Neither the standard nor the regional varieties typically draw a formal distinction between intensive self-forms and reflexive markers, as shown in example (). The self-forms in the Irish English examples in () semantically pattern with the intensive forms in (a), though the nominal or pronominal head is missing. ()
a. The professor himself criticized the student. (intensive)
b. The professor criticized himself. (reflexive)
.. Pronominal gender A second area of pronominal variation concerns gendered pronouns, i.e. the use of masculine he and feminine she in contrast to neuter it. Standard usage is based on a semantic split between animate (human) and inanimate (non-human) referents, and in the domain of humans and (certain) animals between male and female. The gender system is purely referential, with pronominal forms not being triggered by the lexical properties of some seemingly controlling noun, but by the properties of the referents themselves. This is illustrated in example (). Some extensions of the animate pronouns into the domain of inanimates are attested (ships, cars, moon, sun, etc.), though these represent marginal, colloquial, or literary usage. ()
a. he: John, man, boy, etc.
b. she: Mary, woman, girl, etc.
c. it: stone, table, water, grass, etc.
Some varieties—and we are here again talking about traditional dialectal systems— offer strikingly different parameters of organization. For example, pronominal gender in the traditional dialects of Southwest England (Somerset, Devon) is organized around the mass/count distinction of nominals, with masculine he covering the countable domain—both animate and inanimate—and neuter it the domain of masses, substances, liquids, as well as abstract concepts. Feminine she is by and large restricted to females. The mass/count system was transported to Newfoundland and Tasmania as a result of migration. There, the split between he and she is based on different principles, with she being used for mobile entities as well as instruments of various kinds. Less regular, though, frequent extensions of he and she to inanimates can also be observed in Orkney and Shetland English, Irish English, and some varieties of American English (Kortmann : , Pawley : –, Schneider : , Wagner , Siemund ). Some examples of such extension can be found in (). ()
a. [What’s the matter with your hand?] Well, th’old horse muved on, and the body of the butt valled down, and he [the hand] was a jammed in twixt the body o’ un and the sharps (bran-pollard). (Southwest of England, Siemund : –)
b. Put the cover an the chest again an’ . . . locked un up screwed un up, however they had done with un. (Newfoundland English, Wagner : , ; un = him) c. That timber gun, she splits the log open. (Australian (Tasmanian) English, Pawley : , , )
The usage of neuter it in relation to liquids and abstract concepts is illustrated in example (), there being a contrasting pair of pronouns in (a). () a. Thick there cask ’ont hold, tidn no good to put it [the liquid] in he [the cask]. (Southwest of England, Siemund : ) b. I sure you, mum, ’twas a terble awkard job, and I widn do it ageean vor no such money. (Southwest of England, Siemund : ) At first sight, the dialectal mass/count systems appear completely unrelated to the system of pronominal gender found in the standard varieties. Here, typological studies can help to see important connections. One construct that has turned out to be especially useful is the so-called ‘hierarchy of individuation’, as provided by Figure .. We can view it as either an implicational hierarchy or a semantic map in which noun classes are ordered according to their degree of individuality or boundedness. Descriptions of humans range at the top (left), while abstract and mass nouns are placed at the bottom (right). Nouns describing animals and inanimate objects go in between. Several grammatical subsystems have proven sensitive to this hierarchy, including person and number marking, word order, and case marking (Croft ). The general observation is that if some grammatical distinction is available for a certain noun class, it will also be available for all noun classes further to its left. Regarding English pronominal gender, we can see that the split between he and she on the one hand, and neuter it, on the other, varies between dialects and standard varieties on this hierarchy. Whereas the standard varieties place the split between humans and animals (with some extensions into the domain of animals), the dialectal systems simply move it further to the right.
PROPER NAMES > HUMANS > ANIMALS > INANIMATE TANGIBLE OBJECTS > ABSTRACTS > MASS NOUNS
[The original figure additionally groups these classes into animates versus inanimates, proper nouns versus common nouns, and count nouns versus mass nouns.]
Figure . Morphosyntactic distinctions along a continuum of ‘individuality’ (Sasse : , Siemund : ). By permission of Mouton de Gruyter
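The idea that varieties differ only in where they place the he/she versus it split can be rendered as a movable cut-off point on this continuum. The sketch below is purely illustrative (my own simplification, not taken from Sasse or Siemund); the class labels follow the figure, the default cut points are rough approximations, and collapsing he and she into one value ignores the special status of feminine she in the Southwest system.

# Illustrative 'cut point' model: gendered pronouns cover all classes up to the
# cut, neuter it covers everything further to the right.
INDIVIDUATION = ["proper name", "human", "animal",
                 "inanimate tangible object", "abstract", "mass noun"]

def pronoun_domain(noun_class, cut_after="human"):
    """Return 'he/she' or 'it' for a noun class, given where the variety cuts the continuum.

    Standard English roughly cuts after 'human' (with some extension to animals);
    the traditional Southwest dialects cut after 'inanimate tangible object',
    reserving it for masses, liquids, and abstract concepts.
    """
    cut = INDIVIDUATION.index(cut_after)
    return "he/she" if INDIVIDUATION.index(noun_class) <= cut else "it"

print(pronoun_domain("inanimate tangible object"))  # 'it' (standard varieties)
print(pronoun_domain("inanimate tangible object",
                     cut_after="inanimate tangible object"))  # 'he/she' (Southwest dialects)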
.. Pronominal case A third area of pronominal variation is furnished by case contrasts, relating to differences in both form and distribution. Consider the examples in () illustrating a split between subject and object case forms in the second person that was lost in the standard varieties. Arguably, the dialectal systems are more regular than standard English in this respect. ()
a. Du minds me aafil o dee grandfaider.
   You remind me awful of you grandfather.
   ‘You remind me awfully of your grandfather.’ (Shetland English, Melchers : )
b. Set dee doon.
   Sit you down.
   ‘Sit down.’
(Shetland English, Melchers : )
Case contrasts typically follow the animacy hierarchy shown in () that can be viewed as a slightly different formulation of the hierarchy of individuation introduced above. Again, the observation is that if some noun class on this hierarchy shows case marking, all noun classes further to its left will also show case marking. It is noteworthy that the standard varieties run counter to the predictions made by this hierarchy, since there is no case contrast on second person pronouns. ()
1st > 2nd > 3rd > proper nouns > human > animate > inanimate (adapted from Silverstein : )
As far as the distribution of case-marked pronouns is concerned, we can first of all observe the extension of object forms to possessive contexts, as in example (). ()
a. I sat down to have me tea as usual. (Southeast of England, Anderwald : ) b. We like us town. (North of England, Trudgill and Chambers : )
Moreover, there is a more general phenomenon frequently referred to as ‘pronoun exchange’ whereby subject forms can appear in object position and vice versa, even though the former process is more widespread. Some illustration is provided in () and (). Again, we are here talking about a traditional dialectal feature (Kortmann : –, Wagner : –). ()
a. . . . they always called I ‘Willie’, see. b. I did give she a ’and and she did give I a ’and and we did ’elp one another. (Southwest of England, Wagner : )
()
a. ’er’s shakin’ up seventy. ‘She is almost seventy.’ b. Us don’ think naught about things like that. (Southwest of England, Wagner : )
Pronoun exchange has been most widely observed in the Southwest of England, but it can also be found in several other varieties, including those of East Anglia, the Southeast of England, the North of England, and Newfoundland (Beal : –, Clarke : , Trudgill : –). Pronoun exchange is not categorical, though there seems to be a functional contrast to the effect that subject forms in object position can be used for emphasis, as emphatic forms so to speak.
Tense and aspect
.................................................................................................................................. Having surveyed a nominal subsystem, let us now turn to the verb phrase, focusing on tense and aspect marking. Again, these are domains that offer considerable variation both in terms of form and function. Moreover, there are substantial regional differences.
.. Tense marking The English domain of tense marking covers categories such as present, perfect, past, and future, the perfect here being subsumed under tense for expository reasons. Its status is clearly different from the others, since it can be combined with past, present, and future. Non-standard tense uses—here focusing on the differences—can be related to the standard distributions as shown in Figure .. In addition, we can observe differences in the formal encoding, to which we will turn later. Using different varieties, the examples below illustrate these correspondences, namely the present tense in present perfect contexts (a), the present perfect in past tense contexts (b), the past tense in present perfect contexts (c), and the interchangeable use of the past tense and the past perfect (d). Such non-standard uses of the standard English form types are by no means restricted to the varieties mentioned in (), but relatively widespread.
[The figure shows two columns, ‘Non-standard varieties’ and ‘Standard varieties’, each listing present tense, present perfect, past tense, and past perfect, with lines indicating the correspondences described in the text.]
Figure . Correspondences between non-standard and standard tense use (Siemund : )
()
a. I’m not in this [caravan] long [ . . . ] Only have this here a few year. ‘I haven’t been [ . . . ] have had this here for a few years.’ (Irish English, Filppula : ) b. It’s been twenty year ago they offered me a house and land. (Appalachian English, Montgomery : ) c. Were you ever in Kenmare? ‘Have you ever been . . . ?’ (Irish English, Filppula : ) d. . . . he . . . was angry I didn’t stay in the café. ‘He was angry that I hadn’t stayed in the café.’ (Scottish English, Miller : )
Non-standard form types can be more clearly associated with particular regions. For example, the so-called ‘after-perfect’ can be analysed as a calque on an underlying Irish construction and hence only occurs in Irish-English contact areas. It is one of the most conspicuous features of Irish English (), replacing present perfect (a) and past perfect (b). Its occurrence in Newfoundland English is due to Irish settlers exporting it to the New World (). ()
a. You’re after ruinin’ me. ‘You have (just) ruined me.’ b. And when the bell goes at six you just think you were only after going over, and you get out and up again. (Irish English, Filppula : )
()
a. I’m after havin’ eleven rabbits eaten [by dogs] this last three months. b. I’m after burning now [in the sun] about three times. (Newfoundland English, Clarke : )
Perhaps as a consequence of historical retention and innovative language contact, Irish English (and its descendant Newfoundland English) is especially prolific in perfect constructions. As well as the after-perfect and, of course, the standard have-perfect, we find the so-called ‘medial object perfect’ in which the participle is placed after the transitive direct object, as in example (). Moreover, these as well as some other traditional dialects use two perfect auxiliaries, namely be besides have. This is illustrated in (). () When your letter came to hand we had a letter prepared to send to you. (Irish English, Pietsch : ) ()
a. They’re already left. (Newfoundland English, Clarke : ) b. I’m been dere twartree [‘a couple of ’] times. (Shetland English, Melchers : )
The proliferation of perfect constructions in Irish English is quite unexpected from a cross-linguistic perspective, as the perfect as such is not an especially widespread category. In Dahl and Velupillai’s () sample of languages, no fewer than languages lack this category, i.e. more than fifty per cent of the languages sampled
express the relevant semantic territory differently. It stands to reason that either the present or the past takes over in these cases, as evidenced by the more extensive past tense use in North American Englishes (Elsness : ). All traditional varieties of English, though, possess have-perfects. These are practically restricted to European languages (German, Swedish, Spanish, etc.). They typically originate in resultative constructions (e.g., I have the letter written ‘I possess the letter and it is in a written state’) and may continue to grammaticalize into narrative tenses, as in German and French. Such developments may be behind the observed use of the present perfect in past tense contexts in Australian English (Then he’s hit her on the head; Collins and Peters : ).
.. Aspect marking While tenses relate the time of situation to the moment of speaking (or some other reference point), aspectual distinctions permit the speaker to portray the situation time in different ways—whether it is over, still ongoing, regularly repeated, etc. As grammatical marking, aspectual distinctions practically require the speaker to take up a certain perspective on situation time. For example, for every English sentence that we produce we need to decide whether we want to portray the situation as ongoing or not (progressive versus non-progressive/habitual). Regional varieties mainly offer differences along the progressive/habitual dimension, especially regarding the encoding of these aspectual distinctions. Habitual aspects are frequently expressed by non-emphatic do (Wales, Southwest England), but there are also other structural options of which perhaps Irish English displays the greatest array. Depending on how one counts, this variety has no fewer than six ways to express habituality. The examples in () offer a summary of the structural options. The verbs do and be appear in various constructions and also in combination with one another (Filppula ). ()
a. do be + V-ing
b. do be + adjective
c. do + INF
d. be + V-ing
e. be’s/bees + V-ing
f. be’s/bees + adjective
The examples in () illustrate these structural options, though it remains difficult to tell whether there are differences in meaning. ()
a. Yeah, that’s, that’s the camp. Military camp they call it . . . They do be shooting there couple of times a week or so. ‘They shoot there a couple of times per week or so.’ b. They does be lonesome by night, the priest does, surely. ‘They are lonesome . . . ’
c. Two lorries of them [turf] now in the year we do burn. ‘ . . . we burn.’ d. A: Where do they [tourists] stay, and what kind of pastimes do they have? – B: Well, they stay, some of them, in the forestry caravan sites. They bring caravans. They be shooting, and fishing out at the forestry lakes. e. A: And who brings you in [to Mass]? – B: We get, Mrs Cullen to leave us in. She be’s going, and she leaves us in, too. f. A: And what do you do in your play centre? Do you think it’s a good idea in the holidays? – B: It’s better, because you be’s bored doing nothing at home. (Irish English, Filppula : –) The second area of variation in the aspectual domain concerns the progressive. Concerning its ways of encoding, we need to mention ‘a-prefixing’, which is a rather archaic strategy of participle formation, mainly found in Appalachian English, Newfoundland English, and the traditional dialects of Southwest England. Examples can be found in () below. Historically, the prefix originates in a locative preposition (on). ()
a. It just took somebody all the time a-working, a-keeping that, because it was a-boiling. ‘It just took somebody all the time working, keeping that, because it was boiling.’ (Appalachian English, Montgomery : )
b. They wasn’t a-doin’ nothin’ wrong. ‘They weren’t doing anything wrong.’ (Ozark English, Murray and Simon : )
Language contact has also contributed to the stock of progressive constructions in English, notably the busy-progressive found in White South African English. Consider the example in (), which represents a structural calque on an underlying Dutch construction (bezig ‘busy’). ()
a. I’m busy relaxing. ‘I am relaxing.’ b. I was busy losing my house. ‘I was losing my house.’ c. When I got to the car, he was busy dying. ‘ . . . he was dying.’ (White South African English, Bowerman : )
In terms of distribution, there is hardly any non-standard variety of English for which a more extended use of the progressive aspect has not been observed. This concerns varieties of both the inner and outer circle. A more extended use of the progressive aspect with stative verbs has been reported from Scottish English, Irish English, New Zealand and Australian English, and perhaps others. This appears to be a fairly general phenomenon. Example () illustrates stative verb usage with data from Irish English. ()
Well, of course, Semperit is a, an Austrian firm . . . They are not caring about the Irish people, they are only looking after their own interest, . . . (Irish English, Filppula : )
We need to bear in mind, though, that standard and non-standard usage may be difficult to tell apart here, since the progressive shows a tendency to encroach upon the domain of habituals even in Standard English (e.g., We are going to the opera a lot these days). Therefore, apparent regional non-standard uses as in example () can probably be subsumed under Standard English. ()
a. Today, educational establishments are still trying to teach a standard. Many schoolchildren are not learning the standard outwith school. (Scottish English, Miller : ) b. He’s going to the cinema every week. ‘He goes to the cinema every week.’ (Welsh English, Penhallurick : )
Negation
.................................................................................................................................. A cross-linguistically stable observation is that negative sentences are marked, whereas affirmative sentences are unmarked (Miestamo ). This is clearly visible in the standard varieties. Regional varieties follow this trend, but offer more variation regarding sentential negation. In addition, they show different behaviour in the use of multiple negative expressions, namely negative concord. We will consider these areas one by one.
.. Sentential negation Everybody is aware of invariable ain’t as a sentential negator. It is attested in various dialects of England, in Newfoundland English, and in the Englishes of Norfolk Island and the Caribbean. It is also a prominent feature of African American Vernacular English and widely used in colloquial English. The form ain’t is completely invariable and substitutes for negative have or be, partially also negative do. Some illustration is provided in (). There are also main verb uses of have, as shown in example (). ()
a. Well, ain’t you the lucky one? ‘Aren’t you the lucky one?’ (Colloquial American English, Murray and Simon : ) b. She ain’t been here lately. ‘She hasn’t been here lately.’ (Southeast of the US, Wolfram a: ) c. You ain’t asked me about makin’ butter yet. ‘You haven’t asked me . . . ’ (Newfoundland English, Clarke : ) d. She ain tell um. ‘She didn’t tell him. / She hasn’t told him.’ (Gullah, Mufwene : )
()
a. Well ain’t you nothing? What if I have! b. She said you’ve a daughter ain’t ya or summat. (British National Corpus, Anderwald : )
A second non-standard sentential negator is the adverb never that occurs in some traditional dialects of England, Scottish English, Appalachian English, Newfoundland English, and Australian Vernacular English. Example () provides some illustration. The crucial point here is that never in its use as a sentential negator does not negate the entire past time sphere (I’ve never been to Hawaii), but only the occurrence of a specific event.
I never went to school today. ‘I didn’t go to school today.’ (Southeast of England, Anderwald : )
The invariant form ain’t is only one example of paradigm simplification under negation. Others include what is known as ‘third person singular don’t’ where positive do and does fall together in negative don’t, and the levelling of negative past tense be to either wasn’t or weren’t (Anderwald : –). Table . summarizes these asymmetries.
Table . Asymmetrical paradigms (adapted from Anderwald : )
Positive: am, is, are, has, have | do, does | was, were
Negative: ain’t | don’t | wasn’t/weren’t
.. Negative concord In sharp contrast to standard English (I didn’t see nobody ‘I saw somebody’), several negative expressions in one clause do not cancel each other out in various regional Englishes, but achieve an overall negative interpretation. This is known as ‘negative concord’, also referred to as ‘multiple negation’ in the literature. Negative concord may be preverbal or postverbal, as illustrated in (a) and (b). ()
a. Yes, and no people didn’t trouble about gas stoves then. (Southeast of England, Anderwald : ) b. We didn’t have no use for it noways. (Appalachian English, Montgomery : )
As a non-standard feature, negative concord is very common. It can be encountered in many traditional vernaculars, though it is probably best known from African American Vernacular English (especially preverbal negative concord). Based on corpus evidence sampled from British dialects, Anderwald (: ) documents negative concord in rates of up to thirty per cent—counted against the positions where it could occur in principle.
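How a rate ‘counted against the positions where it could occur in principle’ can be computed may be illustrated with a rough sketch. Everything below is invented for illustration, including the word lists, the definition of a potential site, and the sample clauses; it does not reproduce Anderwald’s actual procedure.

# Illustrative only: a potential site is (simplistically) a negated clause
# containing an indefinite; concord is realized when that indefinite is negative.
NEGATORS = {"not", "n't", "never", "ain't"}
NEG_INDEFINITES = {"no", "nothing", "nobody", "nowhere", "noways"}
ANY_INDEFINITES = {"any", "anything", "anybody", "anywhere"}

def concord_rate(clauses):
    """clauses: list of token lists; returns realized concord / potential sites."""
    sites = realized = 0
    for tokens in clauses:
        words = {w.lower() for w in tokens}
        if words & NEGATORS and words & (NEG_INDEFINITES | ANY_INDEFINITES):
            sites += 1
            if words & NEG_INDEFINITES:
                realized += 1
    return realized / sites if sites else 0.0

sample = [["we", "did", "n't", "have", "no", "use", "for", "it"],
          ["I", "did", "n't", "see", "anybody"]]
print(concord_rate(sample))  # 0.5, i.e. concord in one of two potential sites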
Negative concord is rather common in the languages of the world. For example, Haspelmath (a) finds no fewer than languages with negative concord in an overall sample of languages. The co-occurrence of predicate negation and negative indefinites, as in Standard English, is ruled out by a mere eleven languages. Illustration of negative concord from (Standard) Spanish and Italian is provided below. ()
Nadie dice nunca nada a nadie.
nobody says never nothing to nobody
‘Nobody ever says anything to anybody.’
(Spanish, Christoph Gabriel)
()
Non è venuto nessuno.
NEG is come nobody
‘Nobody came.’
(Italian, Miestamo : )
Today’s dialectal negative concord clearly has historical precursors, with the prohibition of negative concord in the standard varieties even being due to prescriptive efforts (see Curzan ). Example () offers such a case of historical negative concord. ()
I thinke ye weare never yet in no grownd of mine, and I never say no man naye. (Early Modern English, Corpus of Early English Correspondence, cited in Nevalainen : )
Subject–verb agreement
.................................................................................................................................. Standard English subject–verb agreement is restricted to the third person singular present tense that marks the finite verb with the suffix -s. Hardly any other grammatical feature is more prone to regional and social variation. The dialects of East Anglia, for instance, tend not to use it at all, using bare verb stems instead (zero marking). Trudgill (: ) states that this pattern is characteristic of and restricted to East Anglia. Examples are provided in (). ()
a. He like her. b. She want some. c. That rain a lot there.
(East Anglia, Trudgill b: )
Most other dialects of Britain, however, tend towards an extended use of the -s suffix. Such paradigm regularization is common in many western and northern dialects of England, but it can also be found in the South of Britain, as example () shows. Using the -s suffix consistently perhaps helps to differentiate between past tense, present tense, and infinitival forms.
()
I gets out of the car and walks down the street for a few yards before I sees them boys coming towards me. (Reading, Edwards : )
A very special and cross-linguistically rather unique agreement system is the Northern Subject Rule that, as its name suggests, can be encountered in northern English and Scottish dialects. A common formulation is that in (). Its most curious property is the adjacency condition between finite verb and pronominal subjects, leading to data such as (). Adjacency as a condition on agreement is very rare cross-linguistically (Corbett : Chapter ). ()
The Northern Subject Rule (A) (Pietsch : ) Every agreement verb takes the -s form, except when it is directly adjacent to one of the personal pronouns I, we, you or they as its subject.
()
a. They peel ’em and boils ’em. b. They go in and cuts em down.
(Lancashire, Pietsch : ) (Yorkshire, Pietsch : )
The formulation of the Northern Subject Rule in () predicts a categorical system, but such systems are rarely found. There always seems to be variation. Therefore, the alternative formulation in () captures the empirical reality more adequately. ()
The Northern Subject Rule (B) (Pietsch : )
a. All third singular subjects (and, where preserved, the old second singular thou) always take verbal -s.
b. Type-of-Subject Constraint: All other subjects except the personal pronouns I, we, you, they (and, where it exists, youse) take verbal -s variably.
c. Position-of-Subject Constraint: Non-adjacency of subject and verb favours verbal -s.
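The interaction of these constraints can be sketched schematically. The following is a deliberately crude, purely illustrative rendering (the function name, labels, and treatment of variability are my own), not an implementation of Pietsch’s model.

# Illustrative rendering of formulation (B): report whether verbal -s is
# categorical, variable, or disfavoured for a given subject and verb position.
PLAIN_PRONOUNS = {"I", "we", "you", "they", "youse"}

def verbal_s(subject, third_singular=False, adjacent=True):
    if third_singular:
        return "always -s"                 # (a) third singular subjects
    if subject not in PLAIN_PRONOUNS:
        return "variable -s"               # (b) Type-of-Subject Constraint
    # (c) Position-of-Subject Constraint: pronoun subjects may still take -s
    # when the verb is not adjacent to them.
    return "variable -s" if not adjacent else "-s disfavoured"

print(verbal_s("the birds"))               # variable -s
print(verbal_s("they", adjacent=True))     # -s disfavoured
print(verbal_s("they", adjacent=False))    # variable -s, cf. 'They peel 'em and boils 'em'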
Pietsch (: ) views the formulation of the Northern Subject Rule in () as the result of competition between two categorical systems, namely the traditional Northern Subject Rule (), on the one hand, and the agreement pattern found in Standard English, on the other. Similarly, Godfrey and Tagliamonte (: ) conceptualize the distribution of the -s suffix as a statistical problem, since verbs can take it variably in practically identical contexts, also in other non-standard varieties. The examples in () provide evidence. ()
a. Her gives me a hug and a kiss, when I comes in and one when I go. b. People says ‘yeah but look at your weather, you gets it freezing cold in the winter, you get all the rain.’ c. He comes every- three times a week he come. d. Kiddies come over . . . and they’m talking to the animals and that. And the animals looks down, you know. And there’s a fantastic thing—animals and kiddies. (Devon English, Godfrey and Tagliamonte : )
The observable variation is due to an intricate mix of phonological, lexical, semantic, and syntactic conditioning factors. For example, the verb say, habitual contexts, full noun phrase subjects, as well as subject non-adjacency favour the appearance of the -s suffix. In other words, the Northern Subject Rule describes only a subset of the relevant factors. A second area of subject–verb agreement concerns the past tense forms of be (was/ were), whose distribution in the standard varieties follows the singular–plural opposition, except for the second person singular (you were). Regional varieties here manifest the analogical extension of one form into the domain of the other. This is known as ‘was/were-levelling’ or ‘was/were-generalization’. Some examples illustrating was-generalization can be found in (); those in () show were-generalization. The phenomenon is rather common and can be encountered in the British Isles, several North American dialects, South African English, and (non-standard) Australian English. ()
a. Shops was open while ten. (Yorkshire, Pietsch : )
b. Them pidgeons was there. (Cheshire, Pietsch : )
c. They was very cheap, clay pipes was. (Cheshire, Pietsch : )
()
a. That were a game we invented.
b. I were broke on a Monday.
c. It weren’t very satisfactory.
(York English, Tagliamonte : )
On the whole, was-levelling seems more widespread than were-levelling, but the levelling process is sensitive to negation such that were becomes tied to negative and was to positive clauses, as shown in the examples in () and (). Again, we are here dealing with a statistical generalization and not a categorical rule (Nevins and Parrott : , ).
a. I weren’t able to answer. b. She weren’t that close to you.
()
a. Aye, I thought you was a scuba diver. b. We played on the beach until we was tired.
A third area of non-standard subject–verb agreement is known as ‘third person singular don’t’ and involves missing agreement on the verb do in negative contexts. It appears to be related to the omission of verbal -s in East Anglia, though it is by no means restricted to this area (Anderwald : –). Some examples can be found in (). ()
a. Well, it sure don’t help things none when we get hail like that. (Colloquial American English, Murray and Simon : )
b. He don’t live in there. (Southeast of the US, Wolfram a: ) c. If ’e don’t work, ’e don’t eat. (Australian Vernacular English, Pawley : ) d. My husband don’t like this district. (Cape Flats English, McCormick : ) Anderwald (: –), in her data sample drawn from the British National Corpus, posits the implicational connection that all dialects that allow she don’t also permit he don’t, though not vice versa. In addition, she generalizes that third person singular don’t is most frequent in declarative clauses, then tag questions, and least frequent in (full) interrogative clauses.
Clause structure
.................................................................................................................................. The final section of this chapter will be dedicated to clause structure issues. Again, a reasonable selection needs to be made, since the observable variation is considerable. I will here focus on ditransitive constructions, embedded interrogatives, and relative clauses.
.. Ditransitive constructions In standard English ditransitive or double object constructions, the object encoding the recipient is positioned before the theme object (indirect versus direct object), as in example (a). This is known in the literature as the ‘canonical double object construction’. Northern English dialects offer the ‘alternative double object construction’ (b), with the theme located before the recipient (Gast : ), although it ‘is not especially common’ (Hughes et al. : ). ()
a. She gave the man a book. (canonical)
b. She gave a book the man. (alternative)
Variation with pronominal objects is more pervasive. According to Hughes et al. (: ), the alternative ordering in (a) is ‘very common in the north of England, but is not found in the south’. The opposite pronominalization in (b) is less common, but can ‘be heard in the north of England, particularly if there is contrastive stress on him’. Due to information structuring, the same holds for canonical examples like She gave the man it (see Chapter , this volume). ()
a. She gave it the man. (alternative)
b. She gave the book him. (alternative)
With two pronominal objects, the alternative ordering appears just as common as the canonical construction (though some speakers accept neither). This is shown in example (). Hughes et al. (: ) say that the alternative ordering ‘is very common indeed [in the North of England, PS], and is also quite acceptable to many southern speakers’. ()
a. She gave him it. (canonical)
b. She gave it him. (alternative)
We may note that variation really only concerns the double object construction, as the reversal of recipient and theme in the prepositional construction is rejected by speakers of all varieties (Siewierska and Hollmann : –): ()
a. *She gave to the man it. b. *She gave to him it.
According to Gast (: –), we can distinguish three major types of varieties in relation to object ordering in ditransitive constructions, namely (see also Gerwin ):
i. varieties that have only the canonical (but not the alternative) double object construction, but that do not use it when both objects are pronominal (*gave me it, *gave it me, gave it to me; e.g., standard British English);
ii. varieties that have only the canonical double object construction and that do allow it in sentences with two pronominal objects (gave me it, *gave it me, gave it to me; e.g., some north-eastern varieties of British English);
iii. varieties that have both the canonical and the alternative double object construction and that use the latter when both objects are pronominal (*gave me it, gave it me, gave it to me; e.g., some (north)western varieties of British English). (Gast : –)
To be sure, the above rules try to capture variation in British English, but say nothing on the distributions in North American dialects or Australian English. Very little is known about that. Figure . offers more fine-grained dialectal information for England. Interestingly enough, the recipient–theme order found in standard English is consistent with cross-linguistically attested ordering patterns. Animate noun phrases generally precede inanimate ones (Haspelmath b). This ordering preference is weakened for pronominal objects and clitics, and again, such variation is not restricted to English (Gensler , Siemund : –).
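These acceptability patterns for two pronominal objects can be restated schematically; the sketch below is purely illustrative, the type labels are invented shorthand, and it adds nothing beyond the three-way classification just given.

# Illustrative summary of the three variety types for two pronominal objects.
ACCEPTED = {
    "type i (standard British English)":        {"gave it to me"},
    "type ii (some north-eastern varieties)":   {"gave me it", "gave it to me"},
    "type iii (some (north)western varieties)": {"gave it me", "gave it to me"},
}

def acceptable(variety_type, phrase):
    """Is the ordering acceptable in the given variety type?"""
    return phrase in ACCEPTED[variety_type]

print(acceptable("type i (standard British English)", "gave me it"))        # False
print(acceptable("type ii (some north-eastern varieties)", "gave me it"))   # True
print(acceptable("type iii (some (north)western varieties)", "gave it me")) # True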
. Map ‘Give it me’, taken from An Atlas of English Dialects by Clive Upton and J. D. A. Widdowson (: ) By permission of Oxford University Press.
.. Embedded inversion Standard English, as is well known, has subject–auxiliary inversion in main clause interrogatives, though not in embedded interrogatives. The latter show the same word order as declaratives. These differences are illustrated in examples () and (). ()
a. Can you prepare sushi? b. I wonder if you can prepare sushi.
()
a. What do you need to prepare sushi? b. I wonder what you need to prepare sushi.
Especially in the context of the so-called ‘Celtic Englishes’, we find data that show inversion of subject and auxiliary in embedded clauses, widely referred to as ‘embedded inversion’. It can be found with embedded polar () and constituent (or open) interrogatives (). ()
a. I asked her could I go with her. (Urban African American Vernacular English, Wolfram b: ) b. I don’t know was it a priest or who went in there on time with a horse-collar put over his neck. (Irish English, Filppula : ) c. I asked him did he want some tea. (North of England, Beal : ) d. He asked could he get there about fifteen minutes late. (Colloquial American English, Murray and Simon : )
()
a. I wonder what is he like at all. The leprechaun. I don’ know what is it at all. (Irish English, Filppula : ) b. If they got an eight they had to decide where was the best place to put it. (Scottish English, Miller : ) c. I don’t know what time is it. (Welsh English, Penhallurick : )
Davydova et al. (: –) compare embedded inversion across several non-standard varieties and document particularly high rates of it for embedded polar interrogatives in Irish English, suggesting an analysis in terms of Irish substrate influence. There remain important methodological problems, though, as embedded inversion needs to be distinguished from main clause inversion in direct speech (Siemund : –).
.. Relative clauses As a type of noun modification, relative clauses in English are placed after the noun that they modify and are typically introduced by a relative marker. The most widely used relative marker in regional varieties is that, especially in northern English dialects
and the English of Northern Ireland (Tagliamonte et al. : ), but we also find what, at, as, and possessively used that (where Standard English would have whose). The form at can be analysed as a reduced variant of that. The examples in () illustrate these points (see Hinrichs et al. on the rise of that in written registers, especially in American English). ()
a. He’s the one what done it. (East Anglia, Trudgill : ) b. I know a man at will do it for you. (North of England, Beal : ) c. Tom Sparks has herded more than any man as I’ve ever heard of. (Appalachian English, Montgomery : ) d. The girl that her eighteenth birthday was on that day was stoned. (Scottish English, Miller : ) e. We need to remember a woman thats child has died. (Appalachian English, Montgomery : )
In addition to the overt marking of the relative clause with a relative marker, head noun and relative clause may also be simply juxtaposed. This ‘gap strategy’ is widely used in Standard English, except for forming relative clauses on subjects (*the girl spoke first meaning ‘the girl who spoke first’). Non-standard regional Englishes, by contrast, widely employ subject gapping, as shown in (). ()
a. You know anybody Ø wants some, he’ll sell them. (Southwest of England, Wagner : ) b. They is people Ø gets lost in these Smoky Mountains. (Appalachian English, Montgomery : ) c. There’s no one Ø pays any attention to that. (Newfoundland English, Clarke : ) d. I knew a girl Ø worked in an office down the street there. (Australian Vernacular English, Pawley : )
Tagliamonte et al. (: ) show, however, that subject gapping is contextually constrained, and typically occurs in existential constructions (a), cleft sentences (b), and possessive constructions (c). ()
a. There’s no many folk Ø liked going to the pit to work. b. It was an earthen floor Ø was in that house. c. I have a woman Ø comes in on a Thursday morning. (Tagliamonte et al. : )
Another area of variation in the domain of relative clauses concerns so-called ‘resumptive pronouns’ or ‘pronoun retention’, in which pronominal copies of the head noun of the relative clause fill the gap that would otherwise be left by relativization. Resumptive pronouns are usually not available in standard English, but as the examples in ()
make clear, they can be found in British regional varieties. Their frequency of occurrence, however, seems to be rather low (Herrmann ). ()
a. They jumped banks that time on the race-course that they wouldn’t hunt over them today. (Irish English, Filppula : ) b. They’re the ones that the teacher thinks they’re going to misbehave. (Scottish English, Miller : )
The choice of relativization strategy, i.e. gapping, a relative marker, or pronoun retention, interacts with and is constrained by the so-called ‘accessibility hierarchy’ in () below (Keenan and Comrie ).
Accessibility Hierarchy (Keenan and Comrie : ) Subject > Direct Object > Indirect Object > Oblique > Genitive > Object of Comparative
The accessibility hierarchy is a typical implicational hierarchy. It predicts that if noun phrases in a certain syntactic function can be relativized on, this will also be possible for noun phrases in all functions further to its left. Interestingly, the standard English restriction on subject gapping violates the accessibility hierarchy, though non-standard Englishes conform with it. Moreover, Keenan and Comrie (: –) argue that the hierarchy is reversed for pronoun retention.
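Since the hierarchy is implicational, the positions a variety can relativize on should form an unbroken stretch from the left edge. The following sketch, with invented judgement data and under the simplifying assumption that relativizability is a plain yes/no matter per position, makes that prediction concrete; it is an illustration, not a claim about any particular variety.

```python
# Keenan and Comrie's Accessibility Hierarchy, highest (most accessible) position first.
HIERARCHY = ["Subject", "Direct Object", "Indirect Object",
             "Oblique", "Genitive", "Object of Comparative"]

def conforms_to_hierarchy(relativizable: set[str]) -> bool:
    """True if the accessible positions form an unbroken prefix of the hierarchy,
    i.e. everything to the left of an accessible position is also accessible."""
    seen_gap = False
    for position in HIERARCHY:
        if position not in relativizable:
            seen_gap = True
        elif seen_gap:
            # An accessible position follows an inaccessible one: implicational violation.
            return False
    return True

# Invented example: a gap strategy that works on subjects and direct objects only.
print(conforms_to_hierarchy({"Subject", "Direct Object"}))          # True
# A pattern that skips subjects (like the Standard English gapping restriction) violates it.
print(conforms_to_hierarchy({"Direct Object", "Indirect Object"}))  # False
```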
. Summary
.................................................................................................................................. In the present article, I have argued for a cross-linguistic, typological approach to the study of non-standard grammatical features. After exploring the methodology of language typology and the major generalizations resulting from this research endeavour, I examined the non-standard features of English found in five domains of grammar, namely pronominal systems, tense and aspect, negation, subject–verb agreement, and clause structure. I have tried to show that the cross-linguistic approach offers valuable insights into the relationship between non-standard and standard features and helps to position them in a universal linguistic categorial system. An overview article like this must necessarily remain incomplete, unfortunately. We should also bear in mind that I have focused here on inner circle varieties, excluding the huge array of language contact phenomena found in outer circle varieties (see Chapter , this volume). Further information concerning these issues can be found in Kortmann et al. () as well as Siemund (). The major grammatical domains not included here are the determiner system and modal verbs. Both offer remarkable non-standard form types and distributions, as, for example, three distance contrasts in the system of demonstratives, or the famous double and triple modals of especially
Scottish and Southern US English. Again, Kortmann et al. () and Siemund () offer useful starting points to explore these domains. The grammatical domains treated here also allow for more fine-grained observations, often relating to specific regions of the English-speaking world, notably benefactive datives in Southern US English, invariant interrogative tags in British dialects, and markers of future time reference, among others.
......................................................................................................................
......................................................................................................................
. Introduction
..................................................................................................................................
The present chapter will be concerned with morphosyntactic variation in the English-speaking world. Like Siemund’s chapter in this volume (see Chapter ), it is informed by a typological approach to the study of dialects and World Englishes (cf. e.g., Anderwald and Kortmann , Siemund ). However, while Chapter  is feature-oriented and offers a qualitative discussion of features selected from individual domains of non-standard grammar in native speaker varieties of English, the present chapter has a different primary purpose: it seeks to identify the areal and global reach as well as the degree of distinctiveness of morphosyntactic properties for individual types of varieties and Anglophone world regions. Almost all of the features discussed in Chapter  form part of the feature set analysed for this purpose; likewise, the native speaker varieties considered in Chapter  form a subset of the varieties investigated here. The following account is based on the data set in the electronic World Atlas of Varieties of English (Kortmann and Lunkenheimer ; eWAVE version .: http://www.ewave-atlas.org) as well as on analyses of this data set as offered most comprehensively in Kortmann and Lunkenheimer (a). The eWAVE data set includes ratings, examples, and interactive maps for morphosyntactic features covering twelve domains of grammar in seventy-six different varieties of English. The features covered are non-standard in the sense that they are not normally considered to be part of the ‘common core’ of (typically written) English, i.e. of ‘what is left when all regional and other distinctions are stripped away’ (McArthur : ). None of these features would normally be used in the EFL (English as a Foreign Language) classroom.
There are hundreds of candidates for such non-standard features in spontaneous spoken Englishes and English-based pidgins and creoles around the world.1 For details on the selection of the features in WAVE, compare Kortmann and Lunkenheimer (b: f.). The language varieties covered in the data set comprise a range of different types: thirty-one L (native-speaker) and nineteen L (non-native) spontaneous spoken varieties of English as well as twenty-six English-based pidgins and creoles from all Anglophone world regions (cf. E. W. Schneider : f. for a brief discussion of the arguments whether or not to consider English-based pidgins and creoles as varieties of English, as is done by himself and in the present chapter). By L varieties we understand native speaker (or mother-tongue) varieties, which also stand at the centre of Peter Siemund’s Chapter in this volume. According to Trudgill (a), L varieties can broadly be subdivided into low-contact varieties (largely, traditional regional dialects) and high-contact varieties, i.e. varieties of English with a distinct contact history, be it contact with other languages or with other English dialects. So Englishes with a colonial background typically qualify as high-contact L Englishes (e.g., Australian English, New Zealand English, Bahamian English, but also Irish English). The category of L varieties includes predominantly ‘indigenized non-native varieties of English that have a certain degree of prestige and normative status in their political communities’, such as Pakistani English or the East African Englishes, ‘but also non-native varieties that compete with local L varieties for prestige and normative status’, such as Chicano English in the US or Black South African English (Kortmann and Lunkenheimer b: ). In terms of Kachru’s three-circles model of World Englishes developed in the s, L and L varieties form the Inner and the Outer Circle respectively (cf. E. W. Schneider : –). For more information on L and L varieties from a typological perspective, see section .. This uniquely comprehensive data set, collected in a uniform way by a team of more than eighty specialists for the relevant varieties and thus securing a high degree of comparability, allows us to answer many of the big questions in the study of languageinternal structural variation on a global scale. Of these, the following will primarily be addressed here (with the relevant section in parentheses): • Which major typological patterns emerge when examining morphosyntactic variation across the Anglophone world? What is more powerful in explaining the observable patterns of variation: the typological signal (i.e. variety type) or the geographical signal (i.e. Anglophone world region)? (See section ..) • Which are the most widespread non-standard morphosyntactic features in the Anglophone world, and can they be attributed to the globalization of English? (See section ..) 1 The notions standard and non-standard are notoriously difficult to come to grips with, which is why some people prefer to define standard in terms of what it is not (e.g., Trudgill a; for a recent more detailed discussion of the notions standard and Standard English, see Hickey b).
• Are there distinctive, or even diagnostic, features for the different variety types (Ls, Ls, pidgins, and creoles)? Is it possible to identify systematic correlations between variety type and different degrees of structural complexity? (See section ..) • Which are the most widespread or even diagnostic features for the different parts of the English-speaking world (i.e. for the Anglophone world regions Africa, South and Southeast Asia, Australia, the British Isles, the Caribbean, North America, and the Pacific)? (See section ..) The final and most challenging question to be tackled is the following (section .): How are we to interpret the different types of diagnostic features that will be identified in sections . and ., on the one hand, and the most widespread features in the Anglophone world (section .), on the other hand, against Mair’s (b) suggestion of a World System of (Standard and Non-Standard) Englishes, with Standard American English as its hub? The major drift of this chapter thus is to offer a global perspective on morphosyntactic variation across the highly diverse range of spontaneous spoken Englishes and English-based pidgins and creoles. First, however, Tables . and . provide more information on the (e)WAVE database, i.e. the varieties and domains of grammar covered. The distribution of the different variety types across Anglophone world regions in Table . shows (see the columns labelled ‘L’, ‘L’, and ‘P/C’ on the right) that we can distinguish between two broad classes of world regions: rather homogeneous ones (the British Isles and North America as pervasively L regions, the Caribbean as pervasively a creole region), as opposed to rather heterogeneous ones, where one variety type may predominate (such as L in Africa and Asia, pidgins and creoles in the Pacific, L in Australia), but where there is still a sizeable number of representatives of at least one other variety type to be found. Table . shows how the features covered in WAVE are distributed across different grammatical domains. For a complete overview of the features (including definitions and examples), see http://ewave-atlas.org/parameters or, for the bare list of features, Foldout in Kortmann and Lunkenheimer (a).
. P A
.................................................................................................................................. In this section we will look at the major results which emerge when applying to the entire WAVE data set (presence versus absence of all features in all seventy-six varieties) the NeighborNet algorithm, a clustering method originally developed in bioinformatics, but by now well-established in representing and exploring variation
Table . Seventy-six varieties in WAVE: world regions
British Isles/ Europe
L: Orkney and Shetland E (O&SE), North of England (North), SW of England (SW), SE of England (SE), East Anglia (EA), Scottish E (ScE), Irish E (IrE), Welsh E (WelE), Manx E (ManxE), Channel Islands E (ChlsE); L: Maltese E (MltE); P/C: British Creole (BrC)
North America
L: Newfoundland E (NfldE), Appalachian E (AppE), Ozark E (OzE), Southeast American Enclave dialects (SEAmE), Colloquial American E (CollAmE), Urban African American Vernacular E (UAAVE), Rural African American Vernacular E (RAAVE), Earlier African American Vernacular E (EAAVE); L: Chicano E (ChcE); P/C: Gullah
Caribbean
L: Bahamian E (BahE); L: Jamaican E (JamE); P/C: Jamaican C (JamC), Bahamian C (BahC), Barbadian C (Bajan), Belizean C (BelC), Trinidadian C (TrinC), Eastern Maroon C (EMarC), Sranan, Saramaccan (Saram), Guyanese C (GuyC), San Andrés C (SanAC), Vincentian C (VinC)
Africa
L: Liberian Settler E (LibSE), White South African E (WhSAfE), White Zimbabwean E (WhZimE); L: Ghanaian E (GhE), Nigerian E (NigE), Cameroon E (CamE), Kenyan E (KenE), Tanzanian E (TznE), Ugandan E (UgE), Black South African E (BlSAfE), Indian South African E (InSAfE), Cape Flats English (CFE); P/C: Ghanaian Pidgin (GhP), Nigerian Pidgin (NigP), Cameroon Pidgin (CamP), Krio, Vernacular Liberian E (VLibE)
Asia
L: Colloquial Singapore E (CollSgE), Philippine E (PhilE); L: Indian E (IndE), Pakistan E (PakE), Sri Lanka E (SLkE), Hong Kong E (HKE), Malaysian E (MalE); P/C: Butler E (ButlE)
Australia
L: Aboriginal E (AbE), Australian E (AusE), Australian Vernacular E (AusVE); P/C: Torres Strait C (TorSC), Roper River C (RRC [Kriol])
Pacific
L: New Zealand E (NZE); L: Colloquial Fiji E (CollFijiE), Acrolectal Fiji E (FijiE); P/C: Hawaiian C (HawC), Bislama (Bisl), Norfolk Island/Pitcairn E (Norf’k), Palmerston E (PalmE), Tok Pisin (TP)
Isolates/South Atlantic
L: St. Helena E (StHE), Tristan da Cunha E (TdCE), Falkland Islands E (FlkE)
Note: P/C = pidgins and creoles
Table . Domains of grammar covered in WAVE ( features in all): Pronouns; Noun phrase; Tense and aspect; Modal verbs; Verb morphology; Negation; Agreement; Relativization; Complementation; Adverbial subordination; Adverbs and prepositions; Discourse organization and word order
in linguistics. The basic information needed when looking at the resulting networks is that ‘the distances were measured by determining presence (i.e. attestedness . . . ) and absence of features, then calculating the proportion of mismatches between varieties’ (Kortmann and Wolk : f.).2 The shorter the distance between any two varieties, the more typologically similar they are, i.e. the higher is the number of co-presences and co-absences for the -member feature set. Vice versa, the longer the distance between any two varieties, the more typologically dissimilar they are. The global network in Figure . allows us to answer the first two key questions concerning patterns of grammatical variation in the English-speaking world: which major typological patterns emerge when examining morphosyntactic variation across the Anglophone world, and is it the typological signal (i.e. variety type) or the geographical signal (i.e. Anglophone world region) that is more powerful in explaining the observable patterns of variation?

. Global network for the entire WAVE feature set (N = ): Cluster 1 (L1), Cluster 2 (L2), Cluster 3 (P/C, creoloids), Cluster 4 (high-contact L1, P/C)

Figure . reveals four major clusters, numbered counterclockwise Cluster –, beginning with the bottom right cluster. It emerges that the morphosyntactic typological profiles of the seventy-six WAVE varieties pattern rather neatly according to variety type. Thus Cluster consists almost exclusively of mother-tongue varieties of English (some high-contact (Lc) and all low-contact L varieties (Lt) in the sample), Cluster of L varieties of English, Cluster of pidgins, creoles, and creoloids,3 and Cluster of pidgins and creoles, on the one hand, and the majority of high-contact L varieties, on the other hand. The major division between the four clusters is the one between the two righthand and two lefthand sets of clusters: on the right L and L varieties and on the left all pidgins/creoles, the four creoloids (AbE, PalmE, CollFijE, ButlE), and the African, American, and Caribbean high-contact L varieties.4 There are only four apparent outliers: Chicano English (an L patterning with the L Cluster ), White South African and White Zimbabwean English (L varieties patterning with the L varieties in Cluster ), and Jamaican English (an L placed in the P/C Cluster ). For an elaborate discussion of these outliers and the four clusters in general, see Kortmann and Wolk (: –).

2 For further details of this algorithm, especially the more refined ‘clustering with noise’ method as developed in dialectometry and applied to the WAVE data set, cf. Kortmann and Wolk : f. It is there that the reader will also find a sophisticated explanation for the parallel lines forming box-like shapes. Basically, what they indicate is that a given variety is somewhat ‘torn’ between two major clusters: in the top right part of Figure ., for example, the Southern African Englishes, especially White South African English and Cape Flats English, dominantly pattern with L Englishes (Cluster ), but also exhibit quite a number of morphosyntactic features shared by the L varieties in Cluster .
3 A creoloid is a creole-like language whose development has not involved a pidgin stage and which, despite many creole features, also exhibits structural properties typical of indigenized L and high-contact L varieties.
4 Note that the class of high-contact L varieties is rather heterogeneous. It includes so-called transplanted L Englishes or colonial standards, L varieties with a pronounced contact history (e.g., the African American and White Southern African varieties of English), as well as language-shift Englishes, i.e. those currently shifting from an L to an L status for an increasing part of the relevant speaker community (e.g., Colloquial Singaporean English, also known as Singlish). L varieties which underwent this process in the past include Irish English and Welsh English. For further discussion see section ..

Table . Geographical composition of the clusters in the global network
Cluster (Pidgins, creoles, creoloids): ButlE (As); Aus/Pac: CollFijiE, PalmE, AbE; HawC (Pac); Car: JamE, BrC, SanAC, BelC; Aus/Pac: Norf’k, RRC, TorSC, Bislama, TP; Af: Krio, CamP, NigP, GhP; Car: Saramaccan, Sranan, EMarC
Cluster (L varieties): Af: WhSAfE, CFE, WhZimE; As: SLkE, PakE, FijiE; Af: TznE, UgE, NigE, KenE, GhE, CamE; As: CollSgE, MalE; Af: InSAfE, BlSAfE; As: IndE, PhilE, HKE
Cluster (Pidgins, creoles, Lc): Gullah (Am); Car: GuyC, VinC, JamC, BahC, Bajan, TrinC; Af: LibSE, VLibSE; Am: RAAVE, UAAVE, SEAmE; TdCE (Isolate), StHE (Isolate), EAAVE (Am), BahE (Car)
Cluster (L varieties): BrIs: IrE, NfdlE, WelE, SW, North; Am: AppE, OzE, CollAmE; BrIs: ManxE, SE, FlkE, EA; Aus/Pac: AusVE, AusE, NZE; BrIs: ScE, O&SE, ChIsE; ChcE (Am)

The geographical composition of the four clusters is outlined in Table .. Note that the clusterings within the geographical labels in Figure . can include historically related varieties that are not physically in the same area (e.g., Newfoundland English is
part of the BrIs cluster, right next to its two major historical input varieties Irish English and dialects of Southwest England). When trying to explain the overall structure of this network, it becomes obvious at a glance that, even though geographical groupings are perceptible, they are clearly secondary to the cluster groupings according to variety type. The clustering in Figure . thus convincingly shows that the sociohistorical conditions under which varieties of English emerged and are currently used correlate with their overall morphosyntactic profiles. For example, regardless where in the world L Englishes are spoken, their grammars tend to be more like each other than the grammars of Englishes spoken in the same Anglophone world region. Analogously, this applies to L Englishes and English-based pidgins and creoles. There is no Anglophone world region which is found in only one of these four major clusters. Even the British Isles, although represented by three groups of L varieties in Cluster , have as an outlier British Creole (BrC), which is neatly positioned right next to Jamaican English in the Car sub-cluster in Cluster . The number of geographical sub-clusters per Anglophone world region varies between two (North America) and five (Africa). The global network in Figure . is based on the entire set of WAVE features, but of course other such networks can be generated for different subsets of features, thus possibly revealing other interesting clusterings. One obvious choice for selecting subsets of features are the twelve grammar domains in Table .. The network in Figure ., for example, shows the clustering for the thirty-three Tense and Aspect features in WAVE. As in the global network, the L and L clusters clearly pattern on the right5 and a solid, even purer pidgin/creole cluster patterns on the left. Differently from the global network, however, Figure . shows a separate cluster (loosely labelled ‘AAVE et al.’) consisting of largely high-contact L varieties. This cluster is positioned nearer to the main L cluster and notably includes all three AAVE varieties as well as varieties which are either historically (LibSE, BahE) or geographically (SEAmE) related to AAVE. So in individual domains of grammar, as shown here for Tense and Aspect, subsets of varieties may cluster differently, moderately but distinctly, from their clustering based on the entire -feature set used in WAVE. In a global take on morphosyntactic variation in the Anglophone world, the following three sections will identify three sets of particularly prominent features out of the WAVE features. The most widespread features across all seventy-six WAVE varieties, so-called angloversals (cf. Szmrecsanyi and Kortmann b, Kortmann and Szmrecsanyi ), will be identified in section .. In the two sections that follow prominence is to be understood in the sense of distinctiveness: which morphosyntactic
features are most distinctive, or even diagnostic, of (a) individual variety types (so-called varioversals; section .) and (b) individual Anglophone world regions (sometimes called areoversals; section .).6

. Tense and Aspect network in WAVE (clusters: P/C, L1, L2, AAVE et al.)

5 With regard to the L cluster in Figure ., the following is reassuring to note when relating it to the Tense and Aspect section in Chapter (this volume): (a) the properties discussed there form a proper subset of the thirty-three-member set of Tense and Aspect features in WAVE; more importantly, (b) since Chapter reports exclusively on L varieties of English, it is nice to see that all of the relevant varieties cluster together in Figure . (and, I may add, in the vast majority of the corresponding networks for the other domains of grammar covered in WAVE).
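The distance measure quoted above from Kortmann and Wolk, the proportion of feature mismatches between two varieties, amounts to a normalized Hamming distance over presence/absence vectors; a matrix of such pairwise distances is what a NeighborNet implementation takes as input. The sketch below is not the published eWAVE or SplitsTree code; the variety names and ratings are invented purely for illustration.

```python
# Minimal sketch of the pairwise distance underlying the networks:
# the proportion of features on which two varieties disagree (present vs absent).
from itertools import combinations

# Invented toy ratings: 1 = feature attested, 0 = not attested.
ratings = {
    "VarietyA": [1, 1, 0, 1, 0],
    "VarietyB": [1, 0, 0, 1, 1],
    "VarietyC": [0, 0, 1, 0, 1],
}

def mismatch_distance(a: list[int], b: list[int]) -> float:
    """Proportion of features on which the two varieties differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

distances = {
    (v1, v2): mismatch_distance(ratings[v1], ratings[v2])
    for v1, v2 in combinations(ratings, 2)
}
for pair, d in distances.items():
    print(pair, round(d, 2))
```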
. Angloversals
..................................................................................................................................
For identifying the most widespread non-standard morphosyntactic features in the Anglophone world, a threshold of per cent of varieties was defined. Table . lists and illustrates those six features which are attested in the largest number of varieties of English and English-based pidgins and creoles worldwide. The ‘total’ column shows the number of varieties in which each feature is attested, while the rightmost column shows the attestation rate (AR) out of a total of seventy-six WAVE varieties. The AR of the top four features ranges between per cent and per cent, the least expected of these perhaps being the second most widely attested angloversal, F (special forms or phrases for the second person plural pronoun like youse/yinz/y’all or you guys/you ones/you lot). It should be noted that each of the angloversals in Table . is widely used, or at the very least of medium-frequency use, in the individual WAVE varieties. Figuring prominently among those few varieties where these angloversals are not found (see the column ‘absent in’) are L varieties (e.g., Nigerian English, Ugandan English, Acrolectal Fiji English) and especially (Pacific) pidgins and creoles (notably Tok Pisin and Bislama). If we lower the threshold to per cent, we arrive at the next most widely attested features in the Anglophone world. These are listed in Table ..

The question of what we can learn from the WAVE data set about global varieties and the globalization of English will be addressed in section . below. It is obvious that the set of vernacular angloversals should play a prominent role in this discussion. It thus seems justified to state at this point already that the vast majority of the eleven features in Tables . and . have gained wide social acceptance among native L speakers and can be considered part of what may be called standardizing non-standard English. In fact, for many speakers, especially in the US, quite a few of these features may already have reached the status of spontaneous spoken Standard English. The only angloversals to which this does not appear to apply are F (multiple negation) and F (double comparatives and superlatives) since both are widely stigmatized. They are clearly ‘above consciousness’ features attracting much explicit negative comment, especially in school settings (cf. also Kortmann : f.).

Table . Vernacular angloversals: top (≥ %)
F no inversion/no auxiliaries in main clause yes/no questions (You get the point?); absent in: North, SE, AppE, ChcE, NigE, TP
F forms or phrases for the second person plural pronoun other than you (e.g. youse, yinz, y’all, you guys, yufela); absent in: O&SE, SE, TznE, UgE, HKE, FijiE, Saramaccan
F adverbs other than degree modifiers have the same form as adjectives (Come quick!); absent in: PakE, SLkE, HKE, FijiE, BelC, Bislama, TP
F me instead of I in coordinate subjects (me and my brother); absent in: FijiE, EMarC, Saramaccan, TorSC, PalmE, Bislama, Norf’k, TP
F never as preverbal past tense negator (She never came this morning); absent in: O&SE, ChcE, JamE, UgE, NigE, IndE, SLkE, EMarC, Saramaccan, Sranan, RRC, Bislama, TP
F multiple negation/negative concord (He won’t do no harm); absent in: O&SE, WhSAfE, CollSgE, GhE, NigE, TznE, PakE, SLkE, MalE, FijiE, RRC, TP, Bislama, GhP, NigP

Table . Vernacular angloversals: top runners-up (≥ %)
F degree modifier adverbs have the same form as adjectives (This is real healthy)
F was for conditional were (if I was you)
F double comparatives and superlatives (That’s so much more easier to follow)
F existential/presentational there’s/there is/there was with plural subjects (There’s three people in the garden)
F no inversion/no auxiliaries in wh-questions (What you doing?)

6 Compare Anderwald and Kortmann (: –) for a survey of how the concept of linguistic universals in language typology has been applied in dialectology and the study of World Englishes. Compare also Siemund (: –) and in Chapter (this volume).
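In computational terms, selecting angloversals is a matter of counting, for each feature, the varieties in which it is attested and keeping the features whose attestation rate clears a chosen threshold. A minimal sketch follows; the feature identifiers, the toy attestation sets, and the 0.75 threshold are placeholders rather than actual eWAVE values.

```python
# Sketch: rank features by worldwide attestation rate and keep those above a threshold.
attestation = {
    # feature id -> set of varieties in which it is attested (toy data, invented names)
    "F_2pl_pronoun": {f"v{i}" for i in range(1, 70)},
    "F_multiple_negation": {f"v{i}" for i in range(1, 62)},
    "F_rare_feature": {"v1", "v2"},
}
N_VARIETIES = 76
THRESHOLD = 0.75  # e.g. 'attested in at least 75 per cent of varieties' (placeholder value)

def attestation_rate(feature: str) -> float:
    return len(attestation[feature]) / N_VARIETIES

angloversals = sorted(
    (f for f in attestation if attestation_rate(f) >= THRESHOLD),
    key=attestation_rate, reverse=True,
)
print(angloversals)
```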
. V :
..................................................................................................................................
Table . Top diagnostic features of L varieties (AR ≥ %, AR difference ≥ %), sorted by AR difference
F like as a focussing device (Yeah, it was like dollars); L exceptions: O&SE, OzE, AppE, LibSE
F use of us + NP in subject function (Us kids used to play in the barn); L exceptions: O&SE, LibSE, CollSgE
F she/her used for inanimate referents (She’s a nice bike); L exceptions: SE, EAAVE, AppE, LibSE, WhZimE, CollSgE
F ain’t as the negated form of be (They’re all in there, ain’t they?); L exceptions: O&SE, IrE, ManxE, WhSAfE, CollSgE, PhilE, AbE, NZE

The global network in Figure . visualized that the crucial factor explaining the observable clusterings of the WAVE varieties is variety type. In other words, the
grammars of those varieties that belong to the same variety type resemble each other most. The primary aim of the present section is to identify the diagnostic, i.e. most distinctive, features for each of the three major variety types represented in the WAVE data set (L, L, and pidgins/creoles). The central metric that will be used is the attestation rate (AR) of a given feature for a given variety type vis-à-vis the attestation rate of the same feature in all other varieties (i.e. the attestation rate difference = AR difference). For the first measure we will use a per cent threshold, for the AR difference a per cent threshold. In other words, a given feature qualifies as diagnostic (a) if it is attested in at least per cent of the varieties belonging to a certain variety type and (b) if its attestation rate is at least per cent higher than in all varieties belonging to other variety types.7 For the thirty-one L varieties in WAVE these thresholds yield the four top diagnostic features in Table ., which are sorted by AR difference (as in all of the tables in this section).8 Listed in Table . are top features whose presence is most characteristic of the ten traditional (or low-contact) L (Lt) varieties in WAVE, in general, but specifically in contrast with the high-contact L (Lc) varieties. All of these traditional dialects are spoken in the British Isles and North America. In the British Isles these are Orkney and Shetland English, Scottish English, the dialects of East Anglia, the North, the Southwest and Southeast of England; in North America these are Newfoundland English (as the only representative of English in Canada in WAVE), Appalachian English, Ozark English, and the Southeast American enclave dialects. 7 These may be called convenience thresholds. Like the ones in the previous and the following section, these thresholds were not predetermined, but decided on as the ones with the highest distinctive power in light of the percentages which emerged from the relevant data analyses. 8 For detailed accounts of the typological profiles of the different variety types see Szmrecsanyi () for L varieties, Lunkenheimer (b) for L varieties, and A. Schneider () for pidgins and creoles.
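The diagnosticity test just described combines two attestation rates per feature: the rate within the variety type of interest and the rate across all remaining varieties. The sketch below illustrates the comparison; the variety names, feature identifiers, and threshold values are invented placeholders, since the chapter’s actual cut-off percentages are not reproduced here.

```python
# Sketch: flag features as diagnostic of a variety type if (a) their attestation rate
# within that type is high enough and (b) it exceeds the rate elsewhere by a margin.
varieties = {
    # variety -> (type, set of attested features); toy data only
    "dialect1": ("L1t", {"f_relativizer_at", "f_thou"}),
    "dialect2": ("L1t", {"f_relativizer_at"}),
    "colonial1": ("L1c", {"f_quotative_like"}),
    "indigenized1": ("L2", {"f_resumptive_it"}),
}

def attestation_rate(feature, variety_type=None):
    pool = [feats for vtype, feats in varieties.values()
            if variety_type is None or vtype == variety_type]
    return sum(feature in feats for feats in pool) / len(pool)

def is_diagnostic(feature, variety_type, min_ar=0.8, min_diff=0.4):
    within = attestation_rate(feature, variety_type)
    others = [feats for vtype, feats in varieties.values() if vtype != variety_type]
    outside = sum(feature in feats for feats in others) / len(others)
    return within >= min_ar and (within - outside) >= min_diff

print(is_diagnostic("f_relativizer_at", "L1t"))  # True with this toy data
```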
Table . Top diagnostic features of traditional L varieties (AR ≥ %, AR difference ≥ %), sorted by AR difference
F relativizer at (This is the man at painted my house)
F agreement sensitive to subject type (e.g., birds sings versus they sing)
F forms or phrases for the second person singular pronoun other than you (e.g., ye, thou, thee)
These three features, at least two of which (F, F) are rather archaic, are closely followed by F ‘he/him used for inanimate referents’ (e.g., I bet thee cansn’ climb he [a tree]) and F ‘either order of objects in double (pronominal) object constructions’ (e.g., He couldn’t give him it). For both features the AR difference is per cent ( per cent AR in traditional L varieties versus per cent in high-contact L Englishes). Most of these five features can be considered as instances of what has been called ornamental rule complexity in the recent complexity debate triggered by McWhorter (): they add morphological or syntactic contrasts, distinctions, or asymmetries without providing a communicative or functional bonus (cf. Szmrecsanyi and Kortmann : –, f.). Following Trudgill’s (a) suggestion of a new, contact-based typological split of varieties of English, the following (types of) Englishes qualify as high-contact L varieties: transplanted L Englishes or colonial standards (e.g., Australian English, New Zealand English, White South African English), language-shift Englishes (by virtue of being a shift or even shifted variety, e.g., Irish English or Colloquial Singapore English), and the (spoken) standard varieties of British and American English. Within the WAVE universe, the following twenty-one varieties have been classified as high-contact L (Lc) varieties. As this list shows, Lc varieties are found in all Anglophone world regions:
Africa: Liberian Settler English, White South African English, White Zimbabwean English
America: Colloquial American English, Urban AAVE, Rural AAVE; additionally, as a historical variety: Earlier AAVE
Asia: Colloquial Singapore English, Philippine English
Australia: Aboriginal English, Australian English, Australian Vernacular English
British Isles: Irish English, Welsh English, Manx English, Channel Islands English
Caribbean: Bahamian English
Pacific: New Zealand English
Isolates: St Helena English, Tristan da Cunha English, Falkland Islands English
Table . Top diagnostic features of high-contact L varieties (AR ≥ %, AR difference ≥ %), sorted by AR difference
F alternative forms/phrases for referential (non-dummy) it (When you off the thing [‘switch it off’] you press that one; FijiE)
F indefinite article one/wan (Longa Kildurk gotta one stumpy-tail horse. ‘At Kildurk there is a stumpy-tailed horse’; AbE)
F zero past tense forms of regular verbs (My grandfather belong to Thomas Jefferson; EAAVE)
F deletion of auxiliary be: before progressive (Togba, you laughing; LibSE)
The top diagnostic features of this variety type, specifically in contrast with traditional L varieties, are listed in Table .. The following three deletion features are worth mentioning, too. Although they are attested in only per cent of all Lc varieties, they are completely absent in the traditional L varieties:
F deletion of copula be: before NPs (But this one Ø not your car; CollSgE)
F deletion of copula be: before AdjPs (Ou mudder Ø crook, eh? ‘Your mother’s ill, isn’t she?’; AbE)
F deletion of copula be: before locatives (Khatib Ø very near my place; CollSgE)
These deletion features, like the majority of the top diagnostic Lc features in Table ., qualify as features that rather simplify the rule system of the relevant variety when judged against the system of (written) Standard English (StE). Simplifying features of a different sort, namely features facilitating second language acquisition by adults and thus known to recur in (adult) L learners’ interlanguage varieties (so-called L simple features; see Szmrecsanyi and Kortmann : ), figure prominently among the top diagnostic features of the nineteen L varieties in WAVE, listed in Table .. Interestingly, as the NeighborNet clustering based on the full feature set in Figure . shows, the L varieties of the Anglophone world fall into two broad clusters. Lunkenheimer (b: –) convincingly argues that the major factor responsible for the split between these two clusters is the degree of difference (or autonomy) from Standard English. On average, the varieties in Cluster I on the left attest far fewer nonstandard morphosyntactic features (mean value: .) than those in Cluster II on the right (mean value: .). Within Cluster I, the varieties in sub-cluster a appear to be
Table . Top diagnostic features of L varieties (AR ≥ %, AR difference ≥ %), sorted by AR difference
F insertion of it where StE favours zero (My old life I want to spend it in India; IndE); L exceptions: ChcE, JamE, CFE
F addition of to where StE has bare infinitive (She made me to go there; IndSAfE); L exceptions: MaltE, ChcE, JamE, FijiE, CollFijiE, CFE
F different count/mass noun distinctions resulting in use of plural for StE singular (I have done a lot of researches in this area; HKE); L exceptions: ChcE, CFE
F levelling between present perfect and simple past: present perfect for StE simple past (It has been established hundreds of years ago; GhE); L exceptions: ChcE, NigE, KenE, CFE
F comparative marking only with than (It might be beautiful than those big ones; BlSAfE); L exceptions: MaltE, ChcE, JamE, GhE, InSAfE, PakE, FijiE, CFE
even more conservative and oriented towards Standard English than those in subcluster b. From an areal perspective, it is remarkable that almost all East and West African L varieties are members of Cluster I, more exactly of its sub-cluster a, with Cameroon English (of all Cluster II varieties) being most strongly pulled towards this sub-cluster. By contrast, most of the features in the Cluster II varieties, and many more than those attested for the Cluster I varieties, can be explained by a higher degree of restructuring and in terms of processes characteristic for learner language. In terms of Schneider’s four-phase Dynamic Model of the development of Postcolonial Englishes (), one can say that all nineteen L varieties in WAVE show some degree of structural nativization and have thus reached phase three (nativization). The major difference between the two clusters is, however, that the varieties in Cluster I are oriented more strongly towards an exogenous (i.e. Standard English) norm and are therefore still closer to Schneider’s phase two (exonormative stabilization), whereas the Cluster II varieties are further advanced in the direction of endonormativity, either having already moved or currently moving into phase four (endonormative stabilization). The last variety type to be considered in this section is constituted by the seven pidgins and nineteen creoles in the WAVE data set. It is for this variety type that we find by far the largest set of diagnostic features of all variety types. For reasons of space, the per cent AR threshold for diagnostic features has therefore been raised to per cent. This yields the five features in Table .. A series of studies by Kortmann and Szmrecsanyi, based in part on the WAVE data, has shown that, in line with McWhorter (), English-based pidgins and creoles exhibit the highest degree of structural simplification of grammar compared with all
. NeighborNet clustering of L varieties in WAVE. Note: The varieties shown are as follows, listed counter-clockwise. Cluster I, sub-cluster a: KenE, NigE, UgE, TznE; sub-cluster b: FijiE, GhE, PakE, SLkE, ChcE; Cluster II, sub-cluster c: JamE, CollFijiE; other: MaltE, InSAfE, BlSAfE, IndE, HKE, MalE, CamE.
Table . The most diagnostic features of English-based pidgins and creoles (AR of P/Cs ≥ %, AR difference ≥ %), sorted by AR difference
F generalized third person singular pronoun: subject pronouns (That girl, it is living on Aitutaki now; PalmE); P/C exceptions: SanAC, TrinC, HawC, ButlE, Norf’k
F no gender distinction in third person singular (De pretty girl make Jack lay he head in him lap; BahC); P/C exceptions: TrinC, PalmE, HawC
F generalized third person singular pronoun: object pronouns (À sí-am ‘I saw him/her/it’; NigP); P/C exceptions: SanAC, TrinC, Bislama, VLibE, ButlE, TP
F deletion of copula be: before AdjPs (She happy; Bajan); P/C exceptions: TP, Saramaccan
F existentials with forms of get (Gat orlem fish in ar sorlwater. ‘There are lots of fish in the sea’; Norfolk Island/Pitcairn English); P/C exceptions: BrC, Gullah, EMarC, Saramaccan, Sranan
other variety types. That is, compared with L and L varieties of English, they show the highest proportion of both features which simplify their grammars vis-à-vis the standard grammar (notably regularization of irregular paradigms, levelling processes, loosening of syntactic rules) and features which are known to be characteristic of interlanguage varieties, especially of the (unmonitored) speech of adult language learners. These so-called L-simple features include avoidance of inflections, widespread copula absence, avoidance of agreement, the tendency to overgeneralize, etc. This finding, in turn, that within the Anglophone world these two types of structural simplification are most widely found in pidgins and creoles supports the claim (e.g., by Trudgill a,b) that simplification in the domain of grammar correlates with degree of contact. Pidgins and creoles are, of course, contact varieties par excellence.
. A :
.................................................................................................................................. This section will zoom in on the geographical signal in the WAVE dataset. More exactly, we will identify those morphosyntactic features which are most characteristic for the individual Anglophone world regions.9 As in the previous section, the attestation rate (AR) of a given feature serves as the crucial metric for identifying areal distinctiveness. More exactly, the following three thresholds were defined for a given feature to qualify as a strong local signal, possibly even as a feature diagnostic of a given Anglophone world region (see footnote on how in general the thresholds were determined): (i) the per cent mark, i.e. the feature in question must be attested in at least per cent of all the L/ L varieties, pidgins or creoles in the relevant world region; (ii) the per cent mark, i.e. in addition, for a feature to deserve mention in this section, its AR in only one world region must exceed the AR of this feature in the rest of the world (RoW) by at least per cent (calculus: AR region minus AR RoW); (iii) the per cent mark, i.e. features where the AR difference ‘region minus RoW’ reaches or exceeds per cent are considered diagnostic for a given world region. Only these diagnostic features are marked in bold throughout the tables in this section. On the whole, sixty-six out of the WAVE features (i.e. about per cent) qualify minimally as distinctive (crossing thresholds i and ii) or even diagnostic (also crossing threshold iii) of a given Anglophone world region. However, these features are very 9
See Kortmann () and Kortmann and Schröter () for much more differentiated accounts of areal patterns when exploring morphosyntactic variation across the Anglophone world. Table . is based on the Synopsis section of the latter article. Detailed accounts of major areal patterns for the grammar domains of pronouns, negation, and tense and aspect, all based on a somewhat smaller WAVE dataset, can be found in Wagner (), Anderwald (), and Lunkenheimer (a) respectively. For synopses of the morphosyntactic variation in the individual world regions, the reader may consult the regional profiles in Kortmann and Lunkenheimer (a: –).
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
unevenly distributed across the English-speaking world. The strongest geographical signals are found for America (i.e. in WAVE the US varieties joined by Newfoundland English), the Caribbean, and South and Southeast Asia. Almost per cent of the sixtysix areally most distinctive morphosyntactic features fall into one of these three Anglophone world regions, with the Caribbean clearly taking the lead (), followed by America () and, at some distance, Asia (). This ‘ranking order’ in terms of areality based on the absolute numbers is confirmed when quantified against the average number of attested morphosyntactic features for each of these three world regions: the twenty-five areally most distinctive Caribbean features translate into a proportion of . per cent (against the Caribbean feature average of .), for America the corresponding eighteen features translate into . per cent (against the American feature average of .), and the thirteen features most distinctive for Asia account for . per cent of the Asian feature average (.). Moreover, the Caribbean is most distinctively marked by features which are both diagnostic and of medium to high usage frequency within individual varieties (twenty out of the twenty-one exclusively Caribbean features). Remarkable for South and Southeast Asia is the fact that, out of its thirteen areally distinctive features, four are found in all eight varieties. This is by far the highest proportion of true areoversals among these areally diagnostic features for any Anglophone world region. The relevant four features are the following, three of them, not coincidentally (cf. Schröter and Kortmann ), relating to deletion processes (F, F, F): F
pro-drop with subjects
F pro-drop with subjects (Yeah, always Øi play TV games and hei didn’t care about her; HKE)
F zero where StE has definite article (That was Ø first time I did promise them; SgE)
F zero where StE has indefinite article (Can I get Ø better grade?; HKE)
F present perfect for StE simple past (I have learned to play piano a few years ago but now I forget; HKE)
I got me a new car. (UAAVE) He gun build de house. ‘He will build the house.’ (Bajan)
As indicated by the bold print, Table . includes all and only those morphosyntactic features which are truly diagnostic of individual Anglophone world regions, i.e. which cross the per cent threshold for AR difference between the region and the rest of the Anglophone world. Again it emerges that almost all of these areally diagnostic features
OUP CORRECTED PROOF – FINAL, 17/10/2019, SPi
Table . Diagnostic morphosyntactic features per Anglophone world region No.
Feature
Region Example from world region
F*
benefactive ‘personal dative’ construction
Am
They found them an apartment (ChcE)
F
completive/perfect have/be + done + past participle
Am
He is done gone (EAAVE)
F*
new quasi modals: core modal meanings
Am
He belongs to come here today ‘He ought to come . . . ’ (AppE)
F
affirmative anymore ‘nowadays’
Am
Anymore they have a hard time protecting things like that (AppE)
F*
‘negative inversion’
Am
Ain nobody ga worry wid you (Gullah)
F*
completive/perfect done
Am, Car
He done gone (Bajan)
F*
you as (modifying) possessive pronoun
Car
Tuck in you shirt (TrinC)
F*
second person pronoun forms other than you as (modifying) possessive pronoun
Car
Tek out unu buk! (SanAC)
Car
Mi go pik dem uhp (VinC)
other forms/phrases for copula be: before locatives
Car
Mi sisa de na skoro ‘My sister is at school’ (Sranan)
F*
serial verbs: come = ‘movement towards’
Car
Run come quick (TrinC)
F*
for-based complementizers
Car
I haad fi kraas di riba ‘It’s hard to cross the river’ (JamC)
Car
You have people that own big piece a land (BelC)
F** go-based future markers F*
F* existentials with forms of have F
levelling of the difference between present perfect and simple past: present perfect for StE simple past
As
Ben has return back the product yesterday (MalE)
F*
non-standard use of modals for politeness reasons
As
Must I give you some water? (HKE)
F*
more number distinctions in personal pronouns than simply singular versus plural
Aus, Pac
We two is going . . . (PalmE)
(AR difference region – rest of world ≥ 60%; * for medium-frequency features, ** for highfrequency/pervasive features)
are attested in America, Asia, and the Caribbean. Only the very last feature in the table (F ‘more number distinctions in personal pronouns than simply singular versus plural’) is diagnostic for both Australia and the Pacific. Contrasting sharply with the three Anglophone world regions which account for the bulk of areally diagnostic features, there are far fewer (between two and four) features which are highly distinctive, let alone diagnostic, of one of the other four world regions.
Table . Top distinctive features for the British Isles No.
Feature
Example
F
be sat/stood with progressive meaning
I was sat at the bus stop for ages. (North) He was stood on the corner. (EA)
F
was/were generalization
When you come home fae your honeymoon if you had one, you was ‘kirkit’. (ScE)
F
was – weren’t split
They wus interested, but I weren’t. (EA)
F
either order of pronominal objects in double object constructions
She’d teach us it. (SW) Give me it / Give it me. (North)
Table . Top distinctive features for Africa No.
Feature
Example
F
double determiners
This our country is terrible. (CamE)
F
no number distinction in demonstratives
Despite all this ‘beehive-like’ activities . . . (KenE)
F
come-based future/ingressive markers
He is coming to attend to you. ‘ . . . is about to . . . ’ (NigE)
F
conjunction-doubling: clause + conj. + conj. + clause
It was Busketi who played but yet still the guy performed wonderfully. (GhE)
Table . Top distinctive features for the Australia Pacific region No.
Feature
Example
F
distinct forms for inclusive/exclusive first person non-singular
afla (inclusive, i.e. ‘we, including you’) versus mifela (exclusive, i.e. ‘we, not including you’) (Norf’k/Pitcairn English)
F
more number distinctions in pronouns than simply singular versus plural
Hemii goe nawii. ‘Let the two of us go swim.’ (Norf’k/Pitcairn English)
F
postnominal phrases with bilong/blong/ long/blo to express possession
dog blong/blo maan ‘the man’s dog’ (AusC)
F
transitive verb suffix -em/-im/-um
Mi bin bai-im kaikai. ‘I bought-TR some food’ (TorSC)
F
no more/nomo as negative existential marker
Nomo kaukau in da haus. ‘There no food in the house.’ (HawC)
Among these is not a single feature that is attested in all varieties of the relevant world region. Moreover, the vast majority of the features in Tables .– are optional and used rather infrequently in the individual varieties. The most distinctive features for the British Isles and Africa are shown in Tables . and ., respectively. Striking for Australia and the Pacific is their high proportion of features (five out of six; see Table .) which are hardly or not at all found elsewhere in the Anglophone world. The two most important instances of these are F ‘distinct forms for inclusive/exclusive first person non-singular’ and, as the only truly diagnostic one (i.e. crossing the per cent AR difference threshold), F ‘more number distinctions in personal pronouns than simply singular versus plural’. Features such as these two are most likely explained by very strong substrate influence (with transfer of L features to English).
. The World System of Englishes
.................................................................................................................................. A global perspective on the currently observable variation across the grammars of varieties of English and English-based pidgins and creoles may help us in making predictions concerning the spontaneous spoken standard varieties of the future. More exactly, the present section will focus on the following question: How are we to interpret the most widespread features in the Anglophone world, i.e. angloversals (section .), on the one hand, and the different types of distinctive and diagnostic features identified in sections . and ., on the other hand, against Mair’s (b) suggestion of a World System of (Standard and Non-Standard) Englishes? Mair’s World System complements traditional World Englishes models in that it breaks ‘ . . . new ground . . . in shedding light on the unexpectedly wide reach of a small number of non-standard varieties of English, particularly in the post-colonial world’ (b: ). This endeavour is part of the World Englishes research strand interested in the ‘deterritorialization of vernaculars through globalization’ (b: ), especially via computer-mediated communication. As one aspect of this deterritorialization process, Mair considers the global spread of non-standard morphosyntactic features. This, in turn, makes this model interesting for the WAVE perspective on global morphosyntactic variation. Mair’s World System of Standard and Non-standard Englishes looks as follows (b: ; bold print as in the original): • hyper-central variety / hub of the World System of Englishes: Standard American English • super-central varieties: () standard: British English, Australian English, South African English, Nigerian English, Indian English, and a very small number of others () non-standard: AAVE, Jamaican Creole, popular London, and a very small number of others (+ domain-specific English as a Lingua Franca (ELF) uses: science, business, international law, etc.)
• central varieties:
  (1) standard: Irish English, Scottish (Standard) English, Jamaican English, Ghanaian English, Kenyan English, Sri Lankan English, Pakistani English, New Zealand English, and a small number of others
  (2) non-standard: Northern English urban koinés, US Southern, and a small number of others
• peripheral varieties:
  (1) standard: Maltese English, St. Kitts English, Cameroonian English, Papua New Guinea English, and others
  (2) non-standard: all traditional rurally based non-standard dialects, plus a large number of colonial varieties including pidgins and creoles.

Against the background of the previous sections, Mair is certainly right in considering the traditional L1 varieties and pidgins/creoles to be peripheral varieties, i.e. peripheral in the sense that they will play little to no role in contributing to the pool of morphosyntactic features of an emerging spontaneous spoken global standard of English. Consequently, this means that high-contact L1 varieties10 and, especially in Africa and Asia, indigenized L2 varieties contribute most to this kind of feature pool. Not surprisingly, many of the relevant L1c and L2 varieties in the WAVE data set are thus listed among the central and super-central (standard) varieties of Mair's model.

The main relevance of Mair's model in the context of the present chapter lies, however, in the central role he ascribes to the standard and non-standard US varieties as hyper-central (Standard American English) and super-central varieties (AAVE). Urban AAVE enjoys a particularly high degree of (sub-)cultural salience and prestige in hip-hop music and the social media around the world. According to Mair, super-central non-standard varieties such as Urban AAVE (and, for example, popular London in the UK) serve an important function as carriers or propagators of a globalized non-standard English, with some of the relevant morphosyntactic features increasingly finding their way into an emerging global spontaneous spoken standard.

What, in a more concrete way, can the WAVE data set contribute to predicting the US impact on the future shape of the grammar of spontaneous spoken global (non-)standard English? Part of the answer has already been given: we have identified the sets of:
• angloversals (attested in minimally per cent of the seventy-six WAVE varieties; Table .) and their immediate runners-up (attestation in minimally per cent; Table .),
• top diagnostic features of L1 varieties, in general (Table .), and of high-contact L1 varieties (which both Standard American English and AAVE qualify as), in particular (Table .), and
10
Recall that, according to Trudgill (a), high-contact L1 varieties include not only many non-standard varieties, but also the spoken standard varieties of, for example, British and American English.
• areally distinctive features of America (including those in Table .), with the L1c varieties Colloquial American English and Urban AAVE belonging to the ten WAVE varieties spoken in North America.

If we add up the features in all the sets in (a) to (c), the total is thirty-seven; of these, thirty-four are attested in Colloquial American English and as many as thirty-six in Urban AAVE, in the vast majority of cases with medium-to-high usage frequencies in both varieties. So if we want to characterize the backbone of the grammar of the super-central spoken non-standard, possibly even of an increasingly standardizing hyper-central spoken non-standard, then this set of some thirty morphosyntactic features already moves us much further forward. Additional candidates for the future global spoken non-standard can be gleaned from WAVE by identifying features of medium-to-high frequency that are attested in both Colloquial American English and Urban AAVE (a schematic illustration of this screening step follows at the end of this section). The following qualify as top candidates among these, especially since almost all of them are simplifying features (e.g., all the regularization features) and a couple have already attained a near-global reach (most prominently F 'quotative like'):
• F 'regularization of plural formation: extension of -s to StE irregular plurals' (e.g., sheeps, mices)
• F 'regularization of plural formation: phonological regularization' (e.g., wifes)
• F 'associative plural marked by postposed and them/them all/dem' (e.g., Marcia and them left already?)
• F 'regularized comparison strategies: extension of synthetic marking' (e.g., beautifulest)
• F 'regularized comparison strategies: extension of analytic marking' (e.g., the most happy instead of happiest)
• F 'inverted word order in indirect questions' (e.g., I wonder could they have done it today), and
• F 'like as a quotative particle' (e.g., I'm like 'that's right').
In sum, then, this section has shown how Mair's model and WAVE can fruitfully complement and enrich each other in making present-day morphosyntactic variation across the Anglophone world the basis for predicting the likely future shape of global spontaneous spoken standard English.
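To make the screening step concrete, the intersection of WAVE-style attestation ratings for the two US varieties can be sketched in a few lines of Python. The ratings below are invented for illustration (eWAVE records attestation with letters such as A 'pervasive or obligatory', B 'neither pervasive nor extremely rare', and C 'extremely rare'), and the variety abbreviations are ad hoc; this is a minimal sketch, not the procedure behind the chapter's own figures.

# Toy WAVE-style table: attestation ratings per feature and variety.
# The values are invented; A/B/C follow the eWAVE rating letters.
ratings = {
    "extension of -s to StE irregular plurals": {"CollAmE": "B", "UrbAAVE": "A"},
    "associative plural (and them / them all / dem)": {"CollAmE": "C", "UrbAAVE": "B"},
    "inverted word order in indirect questions": {"CollAmE": "B", "UrbAAVE": "B"},
    "like as a quotative particle": {"CollAmE": "A", "UrbAAVE": "A"},
}

MEDIUM_TO_HIGH = {"A", "B"}  # ratings treated here as 'medium-to-high usage frequency'

def shared_candidates(table, varieties):
    """Features rated A or B in every one of the given varieties."""
    return [feature for feature, by_variety in table.items()
            if all(by_variety.get(v) in MEDIUM_TO_HIGH for v in varieties)]

print(shared_candidates(ratings, ["CollAmE", "UrbAAVE"]))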
. Conclusion
This chapter has looked at global variation in the Anglophone world from the point of view of morphology and syntax using a typological approach. This approach is interested in determining patterns of variation and in the search for universals, which in the context of variation across English boils down to the search for angloversals,
varioversals, and areoversals. The central tool which was specifically designed for this purpose and which has served as the basis for this chapter is eWAVE (the electronic World Atlas of Varieties of English). This tool offers a uniform data set and secures a high degree of comparability despite the evident heterogeneity in the morphosyntax of the seventy-six varieties of English and English-based pidgins and creoles it covers. Moreover, this tool is dynamic in that it can be easily modified, corrected, or enlarged as a result of the systematic testing of individual features or feature sets against natural language corpora of English (see also Chapter , this volume), and by adding to the data set further morphosyntactic features or varieties of English. The survey given in this chapter is meant to offer orientation and a reference frame for anyone interested in investigating morphosyntactic variation across Englishes, at whatever level (feature, grammar domain, variety, world region) and degree of granularity. It should help avoid the notorious problem of not seeing the wood for the trees, of forgetting the big picture in view of all the admittedly fascinating insights into individual varieties at the ground level, i.e. particular language and communication (including usage) phenomena. The comparative approach in dialectology (especially in the study of dialect syntax) and in World Englishes research has seen an increasing number of followers. Future research on the grammars of World Englishes and varieties of English will no longer be restricted to simply identifying morphosyntactic features that are different from Standard English. It will become standard procedure to immediately check in how many other varieties of English a feature can be found as well, and what the nature of the relevant varieties, the relevant subsystems of grammar, and their usage conditions is. Another aim of this chapter is to demonstrate that mapping the currently observable variation on a global scale will help us in predicting likely future developments, especially concerning emerging world-regional standards, global international spoken English, and the future spoken standard. Such mappings are necessarily reductionist. At the same time they are extremely useful as a point of reference for research based on other types of data sets on morphosyntactic variation in the Anglophone world, independent of the research paradigms (e.g., functionalist, formalist, corpus-linguistic, contact-linguistic, dialectological, variationist, creolist) within which they have been collected.
Acknowledgements
Many thanks to Agnes Schneider and Verena Schröter for their support in preparing the statistics and analyses in sections . and ., and to Christoph Wolk, the WAVE chief statistician, for the two NeighborNet diagrams (or phenetic networks) included as Figures . and .. Many thanks, too, for the extremely helpful comments from the volume editors, Ray Hickey, and one anonymous reviewer.
The concept of 'genre' (from Latin genus, 'kind, people') has a long tradition in literary and linguistic studies. It captures the idea that speakers adapt their language, including their usage of grammar, according to the situation in which it is used, and that distinct form-based linguistic patterns converge in specific types of text. Genre, therefore, is inherently associated with morphosyntactic variation. In this chapter we will give a critical survey of research traditions and insights in the study of genre variation and will discuss case studies as examples. Special emphasis will be placed on grammatical constructions that are known to vary by the dichotomy of written versus spoken language, as well as on grammar in evolving text types and the digital media.
. Introduction
Any overview of the study of linguistic variation and genre needs to start with defining the concept of genre, a term which Swales (: ) describes as 'highly attractive' but also 'extremely slippery'. It is both attractive and slippery because it can be used to frame very distinct questions about patterns and categories in different domains of investigation, such as literary studies and the visual arts, in addition to linguistics. Even within linguistics, the term 'genre' is not used consistently. If there is any consensus at all, it is that the study of genre in linguistics is concerned with patterns of linguistic variation based on situational factors of communication, such as the communicative purpose of a text, its topic, and production circumstances (e.g., speaking versus writing). There is also agreement that situation-based language variation exists at the level of the system as well as at the level of the individual speaker: all speech communities have different genres and all speakers adapt their language according to the situation. The two main paradigms of exploration can broadly be characterized as the 'genre as social action' approach (Miller ), which focuses on the functions of genres in guiding speakers to understand and participate in the actions of a community, and the
'genre as a determinant of linguistic variation' approach, which emphasizes linguistic form and patterns arising from situational factors (Biber et al. ). The first line of inquiry draws on categories and questions from rhetorical studies and considers genres as 'not just forms', but rather as 'frames for social action' (Bazerman : ); the second line of inquiry is more closely related to the sociolinguistic study of language variation, but with a focus on variation by situation rather than by speaker-based categories, such as gender, age, or class. We will restrict ourselves to the second approach here (for a more comprehensive treatment, see Dorgeloh and Wanner ).

Further terminological complications arise from the overlap—to the degree of interchangeability—between the terms 'genre' and the more established term 'register'. Traditionally, the study of registers is concerned with linguistic varieties that describe patterns of situational variation. Register variation recognizes that '[p]eople speak differently depending on whether they are addressing someone older or younger, of the same or opposite sex, of the same or higher or lower status . . . ; whether they are speaking on a formal occasion or casually, whether they are participating in a religious ritual, a sports event, or a courtroom scene' (Ferguson : ). A genre analysis has a different focus: while it also examines which linguistic features are associated with which situational settings, it describes conventions, which may or may not align with linguistic features, and it generally considers the text as a whole. An example would be Swales's () work on research articles in English, which relates linguistic features, such as the use of past tense or inanimate subjects, to the textual structure of the genre, in particular the Introduction–Methods–Results–Discussion format. However, such a principled distinction between the study of registers and genres is not the norm. As Biber and Conrad (: ) point out, '[m]any studies simply adopt one of these terms and disregard the other'. For example, Biber et al. () stick with the term 'register' and do not bring up the term 'genre' at all, while Biber and Conrad () make efforts to distinguish between a 'register perspective' and a 'genre perspective' on functional linguistic variation. The former calls for a quantitative approach (in order to determine linguistic features that are pervasive and frequent), while the latter is not concerned with pervasiveness, but rather with what is typical. For example, the boundary marker 'Amen' is clearly a marker of the genre 'prayer', but it is not a pervasive feature of prayers.

For this overview we will stick with the term 'genre', fully aware that many of the studies we report use the term 'register'. This being a handbook on grammar, we will focus on grammatical structures and patterns that have emerged, from a variety of studies in this field, as salient in either a register or a genre. In section ., we introduce the reader to research traditions in the study of genre and grammatical variation, including approaches that consider genre a determinant of linguistic variation and approaches that look at how genres develop over time. Section . focuses on specific situational environments and the grammatical patterns associated with them, in particular patterns associated with spoken language (section ..) and written language (section ..), including linguistic features associated with digital media.
In section ., we shift our lens to major grammatical concepts discussed in the context of genre variation. Section .. presents studies on complexity, section .. is concerned with the syntactic representation of agentivity, and in section .. we
discuss word order phenomena, such as inversion and adverbial placement. In section ., we conclude by reflecting on the relationship between variation and grammar as a system.
. Research traditions
Two different perspectives are possible when doing research on the impact of genre on grammar. Either genres are a determinant, in the sense of 'predictor' (Biber ), of grammatical variation, or one or several genres constitute the object of linguistic investigation. While the former, variationist perspective means looking at proportional preferences of two or more grammatical variants, genres often differ in that within them certain grammatical forms are more or less pervasive (see Biber : ). The study of grammatical variation rests upon the structural availability of variants, i.e. 'formal alternatives which [ . . . ] are nearly equivalent in meaning' (Biber et al. : ). Many grammatical phenomena studied in this field concern individual grammatical categories, such as:
• the system of relative pronouns (wh-pronoun, that, or 'zero')
• complementizer deletion (that versus 'zero')
• the co-existence of the of-genitive and the s-genitive (the car of my sister versus my sister's car)
• analytic and synthetic comparison of adjectives (more fresh versus fresher)
• placement of verbal particles (give up something versus give something up).

The English genitive is one widely studied case of grammatical variation for which the role of genre is generally acknowledged. Rosenbach (), in a state-of-the-art article, describes the choice between noun phrases such as the company's legal status and the legal status of the company (Rosenbach : ), or women's roles (Grafmiller : ) as opposed to the roles of women, as an effect, among others, of spoken versus written English as well as of individual genres. Genre is described here as 'orthogonal' to linguistic factors (Rosenbach : ), which means that in each context internal factors, such as animacy and informational density, and genre are interdependent in a characteristic way. For example, the preference for the of-genitive in most genres is 'significantly attenuated in journalistic English' (Grafmiller : ): although speakers are in general 'less likely to use s-genitives in writing than in spoken conversation [ . . . a] potential genitive is about . times more likely to be an s-genitive in newspaper writing' (Grafmiller : ) than in other written texts.

There is also research on grammatical variation that compares the strength of the impact of individual genres. Hundt and Mair () coined the concept of 'agile' as opposed to 'uptight' genres, referring to the relative openness of a textual category to
internal variation and ongoing innovation. Several of these studies show that changes currently under way in English, such as the long-term development of non-finite clauses (see Leech et al. ), take place at varying speed in different genres. For example, the use of infinitival rather than V-ing complements with the verbs start or begin (the team started to break down versus the team started breaking down) had already become the norm in news writing by the end of the twentieth century, news writing being thus 'an agile genre quick to respond to trends in the language' (Mair : ). In a similar vein, Mondorf () finds genre effects in the replacement of reflexives by particles in resultative constructions (such as straighten oneself versus straighten up). Although the general pattern of variation is that there is 'a spread of the particle at the expense of the reflexive' (Mondorf : ), the reflexive nonetheless 'stands its ground' in the written and, in this respect, obviously less agile genres, as opposed to a more innovative trend in speech.

Turning now to the interaction of grammar and genre with individual genres as the object of investigation, this line of research ranges from early work on English styles (e.g., Crystal and Davy ) to multi-dimensional studies of register variation (Biber , ). What unites this kind of work is the aim to identify and compare genres through characteristic densities of occurrence of grammatical features. This permits the comparison of genres, either with respect to particular areas of interest, for instance, grammatical complexity or the use of pronouns, or with respect to groups of co-occurring features (the 'dimensions'), such as 'involved versus informational' or 'narrative versus non-narrative'. The dimension 'narrative concerns', for example, is marked by a pervasive use of past-tense verbs, third person pronouns, perfect-aspect verbs, public (or speech act) verbs, synthetic negation, as well as present-participial clauses (Biber ). Individual positive correlations between grammatical features and widely studied registers include the pervasiveness of nominal features (nominalizations, nouns as premodifiers, PPs as post-modifiers) in news language and academic prose, as in (), or the presence of 'nested' subordinate clauses in legal English (Hiltunen ), as in ():
() This patterning of behavior by households on other households takes time. (Academic textbook, taken from Biber and Gray : )
() A person who, when riding a cycle, not being a motor vehicle, on a road or other public place, is unfit to ride through drink or drugs, shall be guilty of an offence. (Road Traffic Act, taken from Hiltunen : )

There is also work on the grammar of individual registers in the study of English for Specific Purposes (Paltridge and Starfield ), for example, on the depersonalized grammar of 'disease language' in the field of medicine (Fleischman : ), which assigns to an ill person the role of grammatical subject of an agentive verb, while at the same time this person has the semantic role of patient or experiencer (She started chemotherapy; see Dorgeloh : ). Most grammatical features in professionalized varieties
are multi-functional, but in some cases they even become grammatical markers of these varieties. For instance, in ‘Aviation English’, we find many verbs in the imperative, and some of them, such as the verbal, imperative use of standby (meaning ‘wait and I will call you’), are highly unlikely outside this specialized context (Bieswanger : ). The analysis of grammar also constitutes a major research tool for text analysis in the study of literature. In poetry, grammar is considered a ‘tool for creating literary meaning’ (Jeffries : , see also Chapter in this volume), and for narrative prose grammatical constructions have been described as establishing perspective and point of view (e.g., Ehrlich ). For example, inversion and other word order patterns in English evoke the setting of a story world from an ‘eyewitness’ perspective (see section ..). Work in the tradition of historical pragmatics investigates genre changes and the way these induce changes in grammar usage. Medical research articles, for example, underwent a rhetorical shift, following which medical case material no longer constituted the centre of interest. This triggered significant changes in the grammar of the clause, such as the increase of non-animate subjects and passive constructions (Atkinson ). Similarly, the rise of experiment-based articles in science furthered nominalization and pre-modification in the noun phrase (Gotti : ; also Biber and Gray , ). The majority of these research traditions draw heavily on corpus methodology (discussed in Chapter , this volume). As methods of digital inquiry have developed, linguistic corpora have grown enormously, both in size and variety. What used to be a benchmark corpus at one million words (e.g., the BROWN corpus of American English, compiled in the s) is now considered small, and reference corpora are about one hundred times bigger, such as the BNC (British National Corpus, one hundred million words) or COCA (Corpus of Contemporary American English, million words; Davies –). Yet, for the analysis of grammar in individual genres one needs to look at smaller, more carefully balanced and classified corpora (Hundt and Leech ; see also Mukherjee ). Ultimately, much work in this tradition, although sometimes called into question for its lack of representativeness, is still of a qualitative nature.
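As a purely illustrative coda to this section, the variationist logic sketched above (proportional preferences between near-equivalent variants, compared across genres) can be shown with a toy calculation for the genitive alternation. The counts below are invented and are not the figures reported by Rosenbach or Grafmiller; the sketch only shows how a per-genre preference and an odds ratio are obtained once the interchangeable variants have been identified in a corpus.

# Invented counts of interchangeable genitives per genre (not real corpus data).
counts = {
    "conversation": {"s_genitive": 310, "of_genitive": 240},
    "news":         {"s_genitive": 420, "of_genitive": 580},
    "academic":     {"s_genitive": 150, "of_genitive": 850},
}

def s_share(c):
    """Proportion of s-genitives among all genitives in a genre."""
    return c["s_genitive"] / (c["s_genitive"] + c["of_genitive"])

def odds(c):
    """Odds of choosing the s-genitive over the of-genitive."""
    return c["s_genitive"] / c["of_genitive"]

baseline = odds(counts["academic"])  # reference genre for the odds ratio
for genre, c in counts.items():
    print(f"{genre:12s} s-genitive share {s_share(c):.2f}  "
          f"odds ratio vs academic {odds(c) / baseline:.1f}")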
. The spoken/written distinction
This section will highlight 'one of the most important situational parameters' (Biber and Conrad : ) for the linguistic description of genres: the distinction between spoken and written language (also referred to as 'channel' or 'mode' of communication). After outlining the theoretical underpinnings of this distinction, we will present key findings from studies of spoken language (section ..) and written language (section ..).

One of the most established parameters in the study of genre variation is the distinction between spoken and written language, to the extent that stereotypical writing—planned, edited, created for publication—is seen as having the 'opposite characteristics' of stereotypical speech (Biber : ). The labels 'written' or 'spoken'
language are usually linked not just to the channel of the discourse, but to whole clusters of situational—and linguistic—characteristics. Traditionally, spoken language is associated with spontaneous production and a high degree of interactivity between speaker and addressee, while written language, with a delay between production and reception, is often planned or edited. Participants in spoken discourse are typically in face-to-face contact and can therefore rely on paralinguistic cues, such as gestures and facial expressions, as well as intonation, to construct the meaning of an utterance (Tannen ). In written discourse, on the other hand, spelling, punctuation, and layout can be used to structure discourse units (Crystal ). However, these are just generalizations, and while they may be adequate to describe the situational differences between conversations and expository prose, the two genres used most frequently to represent speech and writing (Biber : ), there are many examples of texts that are spoken but carefully planned (such as political speeches), or written but produced under (self-imposed) time constraints and not addressed towards an unknown audience (personal letters, diary entries). One widely accepted attempt to disentangle different aspects of the spoken/written parameter is Koch and Oesterreicher’s () distinction between medially and conceptually oral or written language. Medially, the spoken/written distinction is truly a dichotomy: spoken language is conveyed through phonic code, written language is conveyed through graphic code (: ),1 and texts can be transposed from one code to another (for example, a speech, once delivered orally, may be printed). Conceptually, i.e. with regard to the communicative strategies that are employed, the two modes ‘written’ and ‘spoken’ should be regarded as idealized poles (or prototypes) with a continuum of realizations in between (: –). While this model can capture the idea that genres are more or less related to the idealized prototypes of private conversation and expository prose, it has been criticized for its narrow definition of ‘medial’ (Dürscheid ). Koch and Oesterreicher are only interested in the medium as a mode of representation (graphic versus phonic); they do not consider the medium as a channel of transmission. Therefore, their model cannot easily capture genre characteristics of computer-mediated communication, which is perceived as neither spoken nor written, but of its own kind (Crystal ; see also section ..). Writing from a bottom-up perspective, Biber (: ) argues that quantitative corpus data do not warrant an ‘absolute spoken/written distinction’. He finds that ‘variation among texts within speech and writing is often as great as the variation across the two modes’. His later work confirms that there are no absolute linguistic markers of (medially) spoken versus written language, and that, due to more uniform production circumstances, spoken genres show less variation than written genres: ‘[A]n author can create almost any kind of text in writing, and so written texts can be highly similar to spoken texts, or they can be dramatically different. In contrast, all spoken texts are
1
Obviously, their classification does not easily accommodate sign languages.
surprisingly similar linguistically, regardless of communicative purpose’ (Biber and Conrad : ). Despite these findings, the categorization as spoken or written language is perceived as genre-formative and is kept up in large benchmark corpora, such as the BNC (British National Corpus), COCA (Corpus of Contemporary American English), and LGSWE (Longman Grammar of Spoken and Written English; see Biber et al. ).
.. Grammatical features of spoken English
The most widely studied spoken language variety is conversation. Its features of grammar correlate with a situation in which speaker and addressee usually share the same context, they can directly seek information from each other, and there is little planning ahead. As a result, features of grammar pervasive in speech are first- and second-person pronouns, interrogatives and imperatives, relatively short clauses, and a high density of verbs (since there is typically one verb per clause, texts with shorter clauses also have proportionally more verbs than texts with long clauses). There are also grammatical characteristics associated particularly with conversation, such as verbless clauses (probably not your cup of tea mum), left-dislocation (Rik Maio, I can't stand him), prefaces (I mean), and tag questions (did we?), exemplified in () below, taken from BNC (Davies –):
()
SP: what made you watch Oscar?
SP: cos time was going on and I thought well just watch that, I knew you wanted it taped and I thought time were going on
SP: (unclear)
SP: Maria and (unclear)
SP: (unclear)
SP: we never finished watching that did we?
SP: (unclear)
SP: that was a good film that
SP: (unclear)
SP: I mean I can't stand Rik (pause) what's his surname Maio?
SP: Rik Maio
SP: Rik Maio, I can't stand him like, I enjoyed it, I thought it was a good film
SP: I haven't watched it yet still on the (unclear)
SP: probably not your cup of tea mum, it's daft humour, he's funny
SP: He's smashing the fridge (unclear) ah! the fridge! the dragon lady! the three headed dragon
SP: he's waiting for that (unclear)
SP: he's got twenty four (unclear) he has
SP: (unclear)
SP: I thought he was in fifties I did
(BNC: S_conv; emphasis added)
As can be seen in the extract, talking in highly interactive situations leads to a lot of ellipsis, mostly for reasons of economy. Verbless clauses are typical of speech, even though more specific forms of ellipsis, such as 'compressed' adverbial clauses (if necessary, when in trouble), also occur regularly in written English (Wiechmann and Kerz ). Subject ellipsis, i.e. a finite sentence with a non-overt subject, is also characteristic of informal spoken English, but, again, also occurs in written genres, for example, in diary writing (Weir ), notes or postcards (Haegeman and Ihsane ), as well as nowadays in many (conceptually oral) digital genres. Examples from personal blogs from the CORE Corpus of Online Registers of English (Davies –) are given in () below:
() a. Yesterday had a fairly slow day. Went for a walk up Mt Rogers. Then had Nat and Andrew over to play [ . . . ] (CORE: kazza.id.au)
b. Can't eat, can't rest. Went to see my doctor today who wrote me a letter to show them. (CORE: bbc.co.uk)
Another typical feature is sentence coordination, which in speech usually means a 'loose' continuation (Hughes : ), with and functioning as an 'initiator' of the following clause or turn (Biber et al. : ). A further construction affecting the left clause periphery is left-dislocation, which is far more frequent in conversation than in written discourse. It has plausibly been argued to originate in a turn sequence such as in () (taken from Geluykens : ).
A: now, the last paragraph
B: yes
A: I seem to remember it being different from what's printed
Other discourse-motivated constructions that typically correlate with conversational contexts are right-dislocated NPs (as in It's useless, that printer) as well as demonstrative wh-clefts (as in That's what I want). See Chapter in this volume for further discussion of dislocations and clefts as information packaging constructions.

To these features of grammar that are typical of, and more pervasive in, the spoken mode, Biber et al. (: –) add various features of a 'grammar of conversation'. These are, on the one hand, grammatical peculiarities due to 'dysfluency and error' (Biber et al. : ), such as silent or filled pauses, repetitions, reformulations and repairs, or blended and incomplete utterances. In the corpus sample of conversation investigated for the LGSWE, Biber et al. find that . per cent of all units are non-clausal. On the other hand, there are individual expressions or phrases added to clausal grammar for pragmatic or discourse reasons, as illustrated in () above. Depending on their position in the clause, one may classify them as prefaces (I thought well just watch that; I mean I can't stand Rik . . . ) or tags (I thought he was in fifties I did). Their role is predominantly interactive and pragmatic, which is why some of them are treated in the literature as discourse markers. It has also been argued that complement-taking predicates with a meaning of stance in the main clause (I thought well just watch
that, I knew you wanted it taped) originate as parentheticals in conversation (Thompson : ; for an alternative view see Newmeyer ). A grammatical symptom supporting this would be the omission of the complementizer that, which is also more common in speech than in writing (Biber et al. : ).

Varieties within speech also show genre-specific grammatical variation. In university office hours, for example, one finds twice as many conditional clauses as in everyday conversation (Biber and Conrad : ), while in doctor–patient encounters conditional usage differs across the speech of doctors and of patients. Not only are patients more concerned with predicting, and therefore use more conditionals overall; they also combine tense and modality in patterns significantly different from those used by their doctors (Dorgeloh ). There are thus many spoken sub-genres with a different and characteristic pattern of pervasive grammatical features.
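Claims like 'twice as many conditional clauses' presuppose frequencies normalized for sample size. The snippet below is a minimal sketch of that normalization step only; counting the word if is, of course, a very crude stand-in for identifying conditional clauses, and the two mini-texts are invented.

import re

def if_per_thousand_words(text):
    """Crude rate of 'if' tokens per 1,000 words (illustrative only)."""
    words = re.findall(r"\b\w+\b", text.lower())
    hits = [w for w in words if w == "if"]
    return 1000 * len(hits) / len(words) if words else 0.0

office_hours = "If you miss the deadline, email me. If not, just hand it in; ask if anything is unclear."
conversation = "We watched that film last night and then we went straight to bed."

print(if_per_thousand_words(office_hours), if_per_thousand_words(conversation))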
.. Grammatical features of written English
Among the situational characteristics that set written and spoken language apart, in addition to the mode/channel of expression, are the relationship between the participants, the production circumstances, the setting, and the purpose of the communication. Writer and addressee often do not know each other, they typically do not share the same time/place setting, and written language is often edited and associated with 'communicating information rather than developing a personal relationship' (Biber and Conrad : ). The following sentences (from COCA) are easily recognizable as samples of written English.
() a. For all its short-term growth, Shell still had to show investors that its long-term future was as bright as it once looked on paper. (New York Times, )
b. In addition to congruent audiovisual vowel perception, taking advantage of the lip-rounding feature of Dutch vowels, we investigated the perceptual fusion using incongruent audiovisual stimuli. (Journal of Speech, Language & Hearing Research, )
The data in () illustrate a number of pervasive grammatical features found to be characteristic of writing (Biber , Biber et al. , Biber and Conrad ):
• longer and more complex sentences than in spoken English (both sentences are complex in the sense that they include two subordinate clauses as well as sentence-initial adverbials)
• higher density of nouns (in relation to overall word count) and more morphologically complex nouns (investors, perception)
• more noun modification through N–N compounding (vowel perception), PP modifiers (of Dutch vowels), and prenominal adjectives (short-term growth, lip-rounding feature).
Since adjectives often occur as prenominal modifiers, it does not come as a surprise that written texts also have higher numbers of adjectives. Whereas in conversation many noun phrases are realized as pronouns and noun modifiers are rare, more complex noun phrases are the norm in writing (Biber et al. : ). In the written genres of academic prose and news writing, adjectives are the most common premodifier, followed by nouns (N–N compounds). As for post-modifiers, prepositional phrases are about three times as common as relative clauses (Biber et al. : ).

Unlike spoken language, written language is not acquired without instruction. Not only do children need to learn to spell words correctly (not an easy feat in English), they also learn early on how to write in different genres. The curriculum for second graders in the USA, for example, includes instruction in writing narratives ('use temporal words to signal order'), opinion pieces ('use linking words'), and explanatory texts ('introduce topic').2 Due to their more permanent nature, written texts are under a lot more scrutiny than spoken texts—they are the target of style manuals, self-styled language mavens, and grammar checkers built into word-processing software (Curzan ). Linguistic characteristics of particular genres therefore are not necessarily an unmediated reflection of the underlying situational characteristics; they may very well result from prescriptive ideas about certain genres or venues of publication.3

Some electronic corpora of English are only made up of data from written language. The Lancaster–Oslo/Bergen Corpus (LOB), compiled in the s, consists of ten written genres, of which academic prose, press reportage, and fiction have received the most attention. They correspond to the three broad categories of written English in Biber et al. (): academic prose, news writing, and fiction. News writing and academic prose are very similar with regard to production circumstances (both go through a process of editing), but they are very different with regard to audience (more specialized in the case of academic prose) and communicative purpose, resulting in preferences for different linguistic patterns. News writing has a tradition of separating factual reporting (the actual news section) from stating one's opinion (editorials, op-eds, and columns), while in the modern research article the factual is always intertwined with interpretation (Bazerman , Biber and Conrad : ). These differences result in a higher use of the past tense in news writing (reporting what happened) and of the present tense in academic writing (the interpretation part). Due to its narrative structure, news writing has more time adverbials (also see section ..), while academic prose, with its focus on argumentation and interpretation, shows more causal connectors. As a result of the need to describe complex concepts with precision, academic prose uses more nouns overall, more nominalizations, and
2 http://www.corestandards.org/ELA-Literacy/W//, last accessed March .
3 For example, the guidelines for submitting an article to the Journal of Moral Philosophy (Brill) state that contractions (haven't, can't) are to be avoided; https://brill.com/fileasset/downloads_products/Author_Instructions/JMP.pdf, last accessed March .
more noun modifiers, especially postnominal PPs and prenominal adjectives, than news writing. By contrast, more interactive grammatical features, such as pronouns and questions, which are typical of conversations, occur with lower frequency in both news writing and academic prose.

Fictional genres share some of the situational characteristics of academic prose and news writing: the text is carefully planned and edited, it is also generally written for a large unspecified audience, and there is no real interaction between author and reader. However, Biber and Conrad (: ) find that 'these external situational characteristics have almost no influence on the linguistic characteristics of a fictional text'. Rather, it is the fictional world that is constructed that determines what kind of language is used. For example, if the story is told from a first-person point of view, a high frequency of first-person pronouns is to be expected. Fiction writing can also include large stretches of dialogue, personal letters, or diary entries, in which case the sampled language is not so much a reflection of 'fiction' as a genre, but rather of the embedded text type, such as narration or description.

When situational circumstances change, genres change as well—and new constructions emerge that serve the changed communicative purpose better. The experimental research article today is considered 'one of the most highly conventionalized genres in English' (Biber and Conrad : ), with clearly defined sections (Abstract, Introduction, Methods, Results, Discussion, Conclusion; see also Swales ). Each of these sections is organized along conventionalized rhetorical moves, resulting in formulaic opening strings like This paper argues . . . (Dorgeloh and Wanner ). Other linguistic characteristics of the modern research article are a low frequency of first-person pronouns (even though, as Wanner () points out, style manuals advise using them); a high frequency of inanimate subjects, many of them the result of passivization or nominalization; and a high density of nouns, of noun modification, and of subordinate structures. However, initially (i.e. in the seventeenth century), scientific articles looked very different. They had the function of giving an account of a concrete event observed by the writer, which linguistically translated into a more narrative pattern, with past tense, first person, and temporal markers. Many early articles published in the Philosophical Transactions of the Royal Society in the seventeenth century were purely descriptive, framed as letters to the editor (with formulaic salutation form and closing), with no discussion or conclusion whatsoever. They are so different from modern research articles that it is unclear if one should still call them both the same genre. From a form-based perspective, an article written in the seventeenth century does not have much in common with an article that appears in Science today. From a functional perspective, however, both articles belong to the same genre, because they have the same function of integrating knowledge into the scientific community (Gotti ).

A more recent development in written language is the emergence of computer-mediated communication (CMC), a type of discourse that has brought about many new genres (e.g., e-mail, blogs, chatrooms, text messages, multimedial posts on social media platforms) with their own linguistic conventions. As different as these genres
may be with regard to audience, purpose, and production circumstances, there are some characteristic linguistic features across the board:
• a preference for constructions normally associated with spoken language, such as noun phrases without modifiers and 'a clausal style similar to conversation' (Biber and Conrad : ), as illustrated in section ..
• the use of non-standard forms, including simplified spellings (nite) and incomplete sentences (U busy?), which Herring () analyses as a deliberate choice to minimize typing effort
• the mimicking of the shared context of face-to-face conversation, for example through the use of inflectives (*shudder*; see Schlobinski ) and pictorial elements (emoji)
• the encoding of 'textual aboutness' (Kehoe and Gee ) through searchable topic identifiers, such as the symbol '#' (hashtag/hash), which have acquired the additional function of providing meta-commentary or expressing an attitude, similar to evaluative adverbials (e.g., '#madashell' expressing annoyance in a post about a delayed flight; see also Zappavigna ).
Hashtags have spread well beyond the context of digital media. They are used in print and TV marketing campaigns (e.g., #BeExtraordinary, a hashtag created by the cosmetic firm L'Oréal) as well as in spoken discourse, if often humorously.4 In this way, they are a prime example of linguistic innovation resulting from (medially) written language.
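Because hashtags are designed to be machine-searchable, the 'textual aboutness' function just described is easy to demonstrate programmatically. The pattern below is a deliberate simplification (platforms differ in which characters they allow after '#'), and the two posts are invented.

import re
from collections import Counter

HASHTAG = re.compile(r"#\w+")  # simplified: '#' followed by word characters

posts = [
    "Two hours on the tarmac and counting #madashell #delayed",
    "Trying the new mascara today #BeExtraordinary",
]

tags = Counter(tag.lower() for post in posts for tag in HASHTAG.findall(post))
print(tags.most_common())  # the posts' searchable topic identifiers, with counts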
. Key grammatical phenomena
This section will discuss three areas of grammatical variation that result from the differences between spoken and written genres. On the one hand, written language tends to be more integrated, which shapes grammatical complexity at various levels of sentence structure (section ..). On the other hand, written language tends to focus more on content, calling on a variety of external sources. In doing so it often suppresses references to the immediate context, including reference to the author as 'representing consciousness' (Chafe : –). We will discuss in section .. how this affects the grammatical category of voice and brings about alternative constructions. Finally, we will discuss word order phenomena that also vary by genre (section ..).
4 A video clip of American talk show host Jimmy Fallon and guest Justin Timberlake showing a parody of a conversation interspersed with comments introduced by the word hashtag has garnered over thirty million views: see https://www.youtube.com/watch?v=dzaMaouXA, last accessed March .
.. Grammatical complexity
Grammatical complexity in the sense of sentence complexity is a classic research issue of genre variation. It is an almost stereotypical view (Biber and Gray ) that writing shows 'longer and more complex clauses with embedded phrases and clauses'. The grammar of speech, by contrast, is typically thought of as tending towards simple clauses with little embedding and a 'high incidence of co-ordinated clauses' (Hughes : –). Coordination does indeed vary between speech and writing. Clause- (or turn-)initial coordinators, notably and, establish a non-constructional relationship, i.e. one which comes about through the mere sequencing of clauses. In written texts, which possess a material continuity, overt sentence coordination is much less common (Biber et al. : –) and it is generally associated with less complex and less mature styles (Crystal : ). The decline of additive coordination, by contrast, reflects a professionalization and 'de-narrativization' of genres over time (Dorgeloh ). Interestingly, however, research on subordination across genres shows that, counter to the stereotypical view, most types of dependent clause are 'considerably more common in speech than in writing' (Biber and Gray : ). For example, a short spoken utterance such as () is a sentence with two embedded clauses, while the much longer written sentence in () is technically just a simple main clause. (Biber and Gray : )
() I just don't know if that's what he wants.
() From the system perspective, these stages are marked by the appearance of new systemic mechanisms and corresponding levels of complexity. (Biber and Gray : )
This suggests that clausal embedding, depending on the type of embedding we look at, can well turn out to be more typical of speech than of writing. Only nominal, phrasal embedding is clearly a grammatical characteristic of the written mode, leading to a form of sentence complexity that has been described as ‘compressed’, rather than ‘elaborated’ (Biber and Gray : –). Complexity within the noun phrase is the locus of ‘one of the most dramatic areas of historical change in English over the past three centuries’ (Biber and Conrad : ). While conversations are characterized by a high density of one-word noun phrases, written genres depend much more on nouns in combination with modifiers to identify the referents of noun phrases, as discussed already in section ... The most prevalent patterns of modifying nouns in scholarly writing, for example (according to Atkinson ), are N–N compounding (system perspective), prenominal adjectives (new systemic mechanisms), postnominal PPs (levels of complexity), all from example () above, as well as (participial or full) relative clauses (an argument [made by Chomsky]). Similar patterns, however, have also been noted for some spoken subgenres: the analysis of different types of spoken monologue discourse, for example, ‘shows that if a text is “scripted” this entails that there will be fewer simple NPs’ (Aarts and Wallis : ).
Biber and Conrad (: –) further show that the development towards more complex noun phrases is a genre-specific phenomenon in the history of written English. For example, fiction and medical prose texts did not differ all that much with regard to noun modification in the eighteenth century. The most frequent modifier was the postnominal PP, followed by the prenominal adjective, N–N sequences, and relative clauses (which occur more frequently in fiction texts than in medical prose). Three hundred years later, the use of attributive adjectives and N–N sequences has remained stable in fiction, but has almost doubled for medical prose. The use of PPs has increased markedly for medical prose (from an already high level) and has decreased for fiction. Biber and Gray (: ) argue that such an increase in complexity can only happen in writing—writers take as much time as they want to wordsmith their texts and the final text is likely to have undergone many revisions.

A grammatical phenomenon related to noun phrase complexity is noun phrase ellipsis, i.e. the nominal use of adjectival modifiers. According to Payne and Huddleston (: ), a construction such as Lucie likes big dogs, but I prefer small occurs predominantly with modifiers denoting 'basic physical properties'. Recent work on this construction (Günther ) discusses other semantic types of headless NPs, such as tell a good lie from a bad. These occur both in speech and in writing, as the following examples from the BNC illustrate:
() That none of them had could be a bad sign or a good. (BNC: W_fict_prose)
() Hydrogen is positive. [ . . . ] Chlorine is a negative. [ . . . ] Sodium is a positive. (BNC: S_classroom)
Günther (: ) notes that different conditions for elliptical NPs are at work in speech and in writing. She finds that in speech adjectives referring to physical properties are in fact the dominant type, whereas in the written mode the construction is limited to adjectives belonging to a taxonomy or binary pair (such as good and bad; Günther : ). The explanation she provides is that under the more decontextualized conditions of written text, an evoked opposition or taxonomy makes a referent sufficiently salient to be retrievable, which is why a head noun is less needed. As example () from spoken discourse shows, such retrievability does not ultimately depend on physical mode alone.

Complexity also varies within individual genres. Pérez-Guerra and Martínez-Insua () compare phrasal complexity in two written genres: news and personal letters. Their findings—a higher degree of complexity in news writing than in letters—confirm an effect of the genre, but one that is less pronounced for subjects and adverbials than for objects (Pérez-Guerra and Martínez-Insua : ). This confirms that genres have different effects on different forms of complexity, varying, for instance, by argument position. In other measures of complexity, such as depth of embedding, the same genres nonetheless turn out to be more alike.
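The central contrast of this subsection, namely that a short spoken utterance may contain more clausal embedding than a much longer written sentence, can be made visible even with a deliberately crude sketch. The closed list of subordinating words below is an assumption standing in for real clause-level annotation; it is not a measure used in the studies cited above.

import re

SUBORDINATORS = {"if", "that", "because", "when", "although", "while", "who", "which"}

def complexity_proxies(text):
    """Words per sentence versus potential subordinators per sentence (crude)."""
    words = re.findall(r"[A-Za-z']+", text)
    sentences = max(1, len(re.findall(r"[.!?]", text)))
    subs = sum(w.lower().split("'")[0] in SUBORDINATORS for w in words)
    return len(words) / sentences, subs / sentences

spoken = "I just don't know if that's what he wants."
written = ("From the system perspective, these stages are marked by the appearance "
           "of new systemic mechanisms and corresponding levels of complexity.")

print(complexity_proxies(spoken))   # short, but with subordinators
print(complexity_proxies(written))  # long, but a simple main clause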
.. The representation of agentivity
Genre conventions often affect the way in which events are encoded in syntax. The unmarked case in English is that agents are realized as subjects and patients are realized as objects, but there are genres in which other concerns take precedence over the need to spell out most clearly who did what to whom. Scientific discourse in particular is a register in which the expression of agenthood is often minimized (Biber , Atkinson , Orasan , Wanner ), which leads to a high presence of constructions with no visible agent, such as the following, taken from the academic portion of COCA.
a. Neoliberal institutionalism attempts to use self-interested rational actor assumptions of neorealism to show that cooperation under anarchy is possible within the international system. b. This paper argues that a bipolar model with profit maximization at one end and charity on the other . . . is inadequate and particularly ill-equipped to address the problem of poverty. c. As in the original study . . . , infants were seated comfortably on their caregiver’s lap facing a -inch Dell computer screen. d. This study moved beyond previous studies on student progress monitoring by showing that information about progress is adding information that can be used in the instructional decision-making process beyond what is obtained in a single-time point assessment at the beginning of the school year.
The data in () illustrate the use of a number of constructions that either have subjects that are not animate agents or have no visible subject at all: • • • •
nominalizations (Neoliberal institutionalism attempts . . . ) textual subjects (This paper argues . . . ) passive voice (infants were seated comfortably) gerunds (by showing that information . . . is adding information . . . ).
The passive voice, in particular, has ‘a long and venerable tradition’ (Penrose and Katz : ) in scientific writing in putting more emphasis on facts than on those who uncover them. In much of scientific writing the agent of the sentence is identical with the writer (the writer being the person who makes and tests claims, collects and interprets data) and choosing the passive voice is a strategy to avoid the use of first person reference (It will be argued . . . versus I will argue . . . ). It does not come as a surprise, then, that the frequency of the passive (in terms of occurrences per one million words) is more than twice as high in the academic prose sub-corpus of LGSWE as in general fiction and is lowest in conversation (Biber et al. : ), even though conversation is the sub-register with the highest density of verbs. Additionally, as pointed out by Dorgeloh and Wanner () and Wanner (), even in sentences
with active voice the subject is often an inanimate entity in academic prose. This does not just hold for the methods and results sections of experiment-based papers in the sciences. In our study of abstracts from different disciplines, we found that for so-called ‘reporting’ verbs (suggest, conclude, propose, argue, reject, contradict, etc.), i.e. verbs for which one would normally expect a first-person subject, many writers resort to combining active voice with inanimate subjects. We observed a preference for textual subjects, such as this article or this paper, in the humanities and social sciences, illustrated in (a), and for data-based subjects, such as these numbers or these results, in the sciences, illustrated in (b). ()
() a. This article suggests that the new institutionalism contains ambiguous and contradictory notions of change. (American Journal of Economics and Sociology, )
b. These results contradict the notion that metal bioavailability in sediments is controlled by geochemical equilibration of metals . . . (Science, )
The patterns illustrated by the data in () are the result of a rhetorical change in scientific discourse that began in the first half of the twentieth century, especially in American English. Today, style guides such as the Publication Manual of the American Psychological Association (APA) strongly favour the use of the active voice—it is considered more ‘vigorous’ and more ‘transparent’ in recounting who did what to which effect. By contrast, the passive voice has become associated with the ‘negative practice of conscious deception by deliberately hiding the doer of the action’ (Baron : ).5 Against this background, it does not come as a surprise that the percentage of passive constructions in academic prose has declined considerably, especially in American English. Seoane () reports a decrease in be-passives from per cent (LOB corpus, ) of all transitive verb constructions to . per cent (FLOB corpus, ) in British English and from . per cent to . per cent in American English (BROWN and FROWN corpora, and , respectively). Potential reasons for this shift include a preference for a more involved style (Taavitsainen ), marked by the use of the first person, and a general trend towards the use of colloquial and traditionally oral forms, such as subject–auxiliary contraction, in written English (Mair and Leech ).
5 However, see Wanner () for an argument that even the short passive (a passive construction without a by-phrase) is not an agentless construction—rather, it is a construction with an ‘implicit’ agent.
A tendency towards inanimate subjects in verb constructions that used to take primarily agents as subjects has also been noted for the English progressive. Hundt (: ), for example, identifies the ‘weakening of the semantic constraint’ as ‘one of the conditioning factors’ for the increased usage of the progressive in the nineteenth century. Her corpus data reveal that early medical and scientific texts regularly use inanimate subjects with progressive aspect (an abscess was gathering, the water is constantly changing; see Hundt : ). She therefore concludes that the observed overall spread of the progressive in the English verbal system results from genre variation and the spread of genre conventions, rather than system-internal grammaticalization (Hundt : ).
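For readers who want to see how such corpus figures are derived, the following is a minimal sketch, not taken from any of the studies cited above, of the ‘per million words’ normalization together with a deliberately naive regular-expression heuristic for be-passives. Studies such as Biber et al. and Seoane work from tagged and parsed corpora, so the regex, the toy sentences, and the function names here are illustrative assumptions only.

import re

# A naive heuristic for be-passives: a form of BE followed by a word in -ed
# or one of a few irregular participles. Real studies use tagged corpora;
# this pattern, like the example sentences below, is an assumption for illustration.
PASSIVE_RE = re.compile(
    r"\b(?:am|is|are|was|were|be|been|being)\s+(\w+ed|made|given|shown|taken|found)\b",
    re.IGNORECASE,
)

def per_million(count, total_words):
    # Normalize a raw count to occurrences per one million words.
    return count / total_words * 1_000_000

def passive_rate(text):
    words = re.findall(r"\w+", text)
    hits = PASSIVE_RE.findall(text)
    return per_million(len(hits), len(words))

academic = "The samples were analysed after the solution was heated; results are shown below."
conversation = "I heated it up and then we looked at it, you know."
print(round(passive_rate(academic)))      # three matches in thirteen words
print(round(passive_rate(conversation)))  # no matches

On real corpora the same normalization makes counts from samples of very different sizes directly comparable, which is what licenses statements such as ‘more than twice as high in academic prose as in fiction’.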
.. Word order variation
Narratives occur both in speech and in writing and affect the usage of grammar in several systematic respects. Since storytelling involves the reconstruction of events and of a storyworld (Georgakopoulou and Goutsos ), one area of grammar relevant for this mode is constructions with a presentational focus (Rochemont and Culicover ), such as ‘existential’ (there-) constructions and inversion constructions where the subject swaps position with a canonically postverbal constituent such as a locative expression (see also Chapter , this volume). While the presentative meaning is one of mere existence in ‘existential’ presentationals (Drubig : –), as in (a), it is one of ‘appearance’ on the scene in the case of an inversion, as in (b) below. The two constructions differ in that there-insertion is preferred in sentences ‘which fail to convey visual impact’ (Breivik : ). ()
a. There were fields with vines outside the villages. (= Drubig : , example ())
b. Outside the villages were fields with vines. (= Drubig : , example ())
Due to this visual impact, inversion adds to the vividness of storytelling: presenting scenes and events from another location and time in non-canonical word order, which is structurally similar to the (underlying or fictive) experience, produces a ‘displaced immediacy’ (Chafe : –). In (b), but not in (a), the viewing of the fields with vines is represented as a distant but nonetheless immediate experience (Dorgeloh : –). This ‘immediate observer effect’ (Kreyer ) is more widely used in narration (and embedded description) because it helps to focus, not only on a different discourse world, but also on the ‘experiencing consciousness’ (Chafe : ). Accordingly, we find it more in narrative than in non-narrative texts (Dorgeloh ), though not only in fiction, but also, as () illustrates, in non-fiction texts when an immediate, visual experience is reproduced: ()
THE perfect Christmas present for republican tea drinkers – the Windsor In Flames commemorative mug, priced at .. On one side is a line drawing of the burning castle plus the date of the conflagration. On the other is a cartoon of the Queen [ . . . ]. (BNC: W_news; emphasis added)
There are other constructions with a characteristic usage in narrative discourse, in particular at the left clause periphery. Narratives typically possess a chronological structure, which is often supported by sentence-initial adverbial placement. These
adverbials can be strategy markers: they serve as ‘reference-points’ in the story, which ‘in number, size and/or information status signal the size of the boundary that they mark’ (Virtanen : –). For example, in () the three adverbials mark a relatively major shift in the storyline, compared to the more local shifts, combined with ‘narrative syntax’ (Labov ), in the oral narrative in (): ()
Then one day in winter when I was seventeen, things began to go wrong for me. (BNC: fict-prose; emphasis added)
() last night I couldn’t get to sleep. [ . . . ] I didn’t, I didn’t get to sleep till about two o’clock. And then I woke up again, and I come upstairs to look at the clock and it was half past three. And I went back down to bed and I got up again at five, went back there then the baby wake up at seven. (BNC: S_interview_oral history). The use of narration in both speech and writing also tells us that this is a discourse mode, embedded in many contexts and hence hosted by various genres (Georgakopoulou and Goutsos : ). The same therefore applies to the word order phenomena discussed here, which can also be put to further uses in many other genres.
. Conclusion
.................................................................................................................................. In this chapter, we have looked at how spoken and written genres affect the usage of grammar. While there are strong correlations between specific discourse situations and the occurrence (or absence) of certain linguistic constructions, most constructions are multi-functional—they may be pervasive in a given genre, but they also occur elsewhere. For example, spoken discourse favours the use of pronouns, but complex noun phrases are not completely out of place, while pronouns are also characteristic of specific written genres, such as diary entries. Therefore, most grammatical categories and constructions tend to be features, rather than markers, of a given genre. Ellipsis, for instance, is frequent in informal talk, but it also occurs in personal ads, diary entries, and online writing. It constitutes a grammatical phenomenon typical both of spoken and written genres; only the motivation for using it varies. It is in this sense that genres are ‘multi-dimensional constructs’ (Biber : –). Nonetheless genres tend to regularize and conventionalize grammar and thereby increase ‘the likelihood of successful, forceful communication’ (Bazerman : ; see also Dorgeloh and Wanner ). On a theoretical level, this touches upon the question of whether the distinction between knowledge of grammar (competence) and language use (performance) holds altogether. Formalist and functionalist linguists disagree on the question whether or not patterns of grammar usage affect the representation of grammatical knowledge (Schmid : ; also Newmeyer , Boye and Engberg-Pedersen ). Formalists concede the usefulness of frequency information for the study of ‘language variation, use, and change’, but doubt that one can come to
‘conclusions about the grammar of an individual from usage facts about communities’ (Newmeyer : ). By contrast, functionalist work in probabilistic syntax (Gahl and Garnsey ; Bresnan ) presents evidence that language users have a ‘collectively conventionalized knowledge of linguistic structures’ (Schmid : ). They tend to select phrases and sentences ‘that have accompanied past actions’ (Hopper : ), which brings about patterns of variation in the sense of both what is acceptable and preferred. For example, the passive construction and nominalizations were originally ‘adaptive features of literary grammar’ (Pawley and Syder : ). While the grammar of unplanned speech documents usage of grammatical features, such as dysfluency or ellipsis, that are not regularly present in most kinds of writing, more recent developments in writing show that new contexts push grammar usage beyond old boundaries and develop new modes of expression, which then feed back into the language system.
Acknowledgements
Support for this research was provided by the Office of the Vice Chancellor for Research and Graduate Education at the University of Wisconsin-Madison with funding from the Wisconsin Alumni Research Foundation.
......................................................................................................................
......................................................................................................................
. Introduction
..................................................................................................................................
In this chapter I will investigate grammatical variation in literary texts. As a full survey of this field would be a book in itself, I have instead chosen some topics that stylisticians have been concerned with at the interface between literature and grammar. These include the use in literary texts of non-standard forms of the language, either by writing in regional or social varieties or by the stretching of ‘rules’ (section .), and the direct use of grammar to produce literary meaning through the exploitation of its iconic potential (section .). Stylistics is now very much more than the application of linguistic description to the language of literature and has developed theories and models of its own. But at the core of the discipline remains text (both spoken and written) and the notion that text producers have choices as to how they put language into texts. Section . will introduce some of the background to stylistic discussion of literary texts. Literature is one creative use of language (others include political speeches and advertising) which provides an opportunity to test the models arising from different grammatical theories. This opportunity arises because literary texts often stretch the norms of everyday usage. Interpreting divergent grammar relies on understanding what is ‘normal’ and can thereby reflect the systematic aspects of grammar. The strategies readers use to understand divergent grammar need explaining just as much as the stages that learners go through in acquiring a language or the loss of grammatical ability in aphasia. As John Sinclair puts it:
no systematic apparatus can claim to describe a language if it does not embrace the literature also; and not as a freakish development, but as a natural specialization of categories which are required in other parts of the descriptive system. (Sinclair : )
. S:
.................................................................................................................................. In this section I will introduce some of the fundamental ideas and terminology used in stylistics (see Leech and Short and Jeffries and McIntyre for further reading). Section .. considers whether there is such a thing as ‘literary style’; section .. introduces fundamental concepts in stylistics; section .. summarizes the development of cognitive approaches to style; and section .. discusses the influence of corpus methods on stylistic analysis of literary grammar.
.. Is literary language different from everyday language? It is often popularly assumed that the language of literature has its own set of fundamentally different language features, and in some periods of history and some cultures today, there is indeed an expectation that the language of literature will be regarded as ‘elevated’ or special in some way. However, if there ever was an identifiably separate literary ‘register’ in English, it would be difficult to pin down to more than a handful of lexical items, most of which were regular words which fell out of fashion in everyday language and lingered on mostly in poetry (e.g., yonder and behold). In fact, although early stylistics, based on Russian formalism (Ehrlich ), spent a great deal of time and effort trying to define what was different about literary language, the discipline ultimately came to the view that if there is anything special about literary language, it is nevertheless drawing on exactly the same basic linguistic resources as the language in general. Thus, the idea that one might be able to list or catalogue the formal linguistic features in ‘literary’ as opposed to ‘non-literary’ language was eventually shown to be mistaken and recognition of the overlap between literary and other genres has now reached the point where stylistics tends to see ‘literariness’ as a point on a cline (Carter and Nash : ) rather than as an absolute. Features that might be seen as more literary (though not limited to literature) would include many of the traditional figures of speech and other literary tropes, such as metaphor, irony, metonymy, and so on, though in stylistics these would be defined more closely by their linguistic nature. Thus personification, for example, would now be shown as the result of a lexico-grammatical choice, such as the decision to combine an inanimate subject with a verb requiring an animate Actor: e.g., the moon yawned. Scholars of literary language conclude that what makes some literary works unique and/or aesthetically pleasing is the way in which they combine, frame, or contextualize the otherwise ordinary bits of language they are using. This may imply a relatively unusual concentration of one kind of feature (e.g., continuous forms of the verb such as traipsing, trudging, limping) or unusual juxtapositions of otherwise different styles of language (e.g. a mixture of formal and informal language or a mixture of different dialects).
Another feature often thought to be literary, though not unusual in other text-types, is parallelism of structure, or even exact repetition (see section ..). Though such features may indicate literary style, judgements about literary value are, in the end, not uniquely linguistic and are at least partly culturally determined as well as changeable (Cook ).
.. Stylistic concepts: foregrounding, deviation, and parallelism
Stylistics began in the early twentieth century, drawing on ideas of defamiliarization from Russian formalism, which considered that works of art were aiming to make the familiar seem new. Stylistics captured this interest from a linguistic perspective by developing the related concepts of foregrounding and deviation. Foregrounding refers to the process by which individual features of text stand out from their surroundings, while deviation refers to one of the means by which this foregrounding is achieved (Douthwaite , Leech : ). Two kinds of deviation, external and internal, are distinguished. External deviation refers to features that are different in some way to the norms of the language generally. Internal deviation refers to features that differ from the norms of the text in which they appear. Studies of the psychological reality of foregrounding have demonstrated that what scholars are noticing matches the reader’s experience. Van Peer (), for example, used experimental methods to ascertain which features readers were paying special attention to and found that these matched the features identified in stylistic analysis. Miall and Kuiken () found further support for the psychological reality of foregrounding and also produced evidence that the foregrounded features can provoke an emotional response. There are many potential candidates for prototypically literary features, but none of them would be found uniquely in literature and most derive their power, including their literary effect, by reference to the norms of everyday language or the norms of the text they appear in. We will see some further examples of this phenomenon in section . on iconicity in grammar, but a straightforward illustration of foregrounding is the use of parallelism in the structuring of literary (and other) texts, as seen in the opening stanza of the classic children’s poem by Alfred Noyes, ‘The Highwayman’ (: –):1
The wind was a torrent of darkness among the gusty trees,
The moon was a ghostly galleon tossed upon cloudy seas,
The road was a ribbon of moonlight over the purple moor,
And the highwayman came riding –
Riding – riding –
The highwayman came riding, up to the old inn-door.
1 With thanks to The Society of Authors as the Literary Representative of the Estate of Alfred Noyes for permission to use this extract.
There is nothing grammatically unusual about the structure of each of the first three lines of this stanza, which have a Subject–Verb–Complement–Adverbial pattern.2 It is rather the three repetitions of the same structure that both set the musical framework for the poem, and allow the emergence of the highwayman from the distance to be foregrounded, since he arrives at the point where the grammatical pattern set up in the first three lines is disrupted for the first time. There is also a change of pattern from descriptive sentences of the pattern ‘X was Y’, where the copular verb (be) sets up a metaphorical equivalence between the subject and the complement of the sentence. The new pattern from line four by contrast has a human Agent in subject position (the highwayman), followed by an active verb phrase (came riding). This change in pattern foregrounds the highwayman from the surrounding moorland-plus-sky scene, emphasizing that he is the only sentient—and active— being for miles around. This kind of patterning is not unique to literary genres. The form and function of parallelism in political speech-making, for example, is similar to that in literature, making the language memorable and creating both musicality and foregrounding. An example is the following passage from the famous wartime speech by UK prime minister Winston Churchill, calling the British people to fight:3 we shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender
Advertising also uses parallel structures and in everyday life we sometimes use parallelism to foreground something in the stories we tell each other. Parallelism (Leech : –) can occur at any level of structure, from phonological to syntactic, and can be internally deviant (i.e. the remainder of the text does not consist of parallel structures) as well as being externally deviant (i.e. we do not normally speak or write using continually repeated structures). Parallelism can also become backgrounded where it is ubiquitous and therefore normalized. All of these effects are seen in ‘The Highwayman’, where the stanza patterns are the same each time, so that although there is a disruption (internal deviation) at line four, this is itself a regularity causing the reader to expect the same pattern in each stanza after the foregrounding of the disruption in verse one. As this example of parallelism shows, foregrounding depends upon the background, which is also stylistically patterned. Anyone who has ever described an utterance as
2 Arguably, the Adverbials could instead be analysed as postmodifiers within the Complement noun phrases.
3 See https://www.winstonchurchill.org/resources/speeches/-the-finest-hour for the full text and audio of this and other speeches by Churchill. Last accessed March .
‘Pinteresque’, referring to the idiosyncratically sparse style of playwright Harold Pinter, is instinctively responding to these background features of the text. The rise of corpus stylistics (see section .. below) has enabled scholars to examine these less stark patterns of style empirically for the first time.
.. Text and context: author, narrator, character, and reader The task of describing the style of literary texts combines an understanding of the norms of the language whilst recognizing unusual and abnormal usage for literary effect. In addition to the increased accuracy of description enabled by linguistic analysis, stylistics has developed a range of theories and models to address questions of textual meaning and where it resides. These theories need to recognize the special communicative circumstances of literary works. Though readers may believe the author is directly addressing them, there is often a layering of participants in the discourse situation reflected in a distance between authors, readers, narrators, and characters invoked in reading a text. The discourse structure model, proposed by Leech and Short (: –), envisages a number of levels of participation, including implied author and implied reader, which are the idealized form of these participants as envisaged by each other. There may also be one or more layers of narration as, for example, in the Joseph Conrad novel Heart of Darkness. The opening of this novel is in the voice of a first person narrator telling a tale about himself and three friends on a yacht with the ‘Director of Companies’ as their captain and host. Very soon, this I-narrative gives way to another first person narrative, when one of the friends, Marlow, starts to tell a tale about his adventures in Africa. The remainder of the novel, with the exception of the last short paragraph, is in Marlow’s voice, though it is being channelled through the voice of the first (unnamed) narrator. Within Marlow’s story, there is also a great deal of quotation of others’ speech, mostly between other characters, but also often with Marlow himself as speaker or addressee, as in the following extract (Conrad : ): “‘His end,’ said I, with dull anger stirring in me, ‘was in every way worthy of his life.’ “‘And I was not with him,’ she murmured. My anger subsided before a feeling of infinite pity. “‘Everything that could be done—’ I mumbled.
Marlow is quoting (to his friends) his conversation with the fiancée of Kurtz, the subject of the novel, and the original narrator quotes him quoting this conversation. The original narrator also implies the existence of an interlocutor (his addressee). In addition, the author (Conrad) is telling the reader this whole narrative and there is therefore an implied (idealized) author and reader as well as the actual author and reader. Thus there are at least five levels at which some kind of communication is taking place.
Such complexity of structure clearly has knock-on effects in the grammar (for example, the patterning of tenses) of the text. Remarkably, perhaps, the average reader seems to find it relatively easy to navigate the different layers of narration, possibly by letting only the most relevant layer take precedence at any one time. Recent developments in stylistics have attempted to explain the reader’s negotiation of such complexity, drawing on cognitive linguistic and psychological theories to explain readers’ construction of textual meaning. Among these, Emmott’s () contextual frame theory, Werth’s () Text World Theory, and Lakoff and Johnson’s () conceptual metaphor theory have influenced our understanding of the reading process. Emmott, for example, proposes a framework where the reader constructs contextual frames as they read, with episodic and non-episodic information being attached to the frames as the text is processed. This information is kept in mind by the reader through two processes, which Emmott labels ‘binding’ and ‘priming’. The features of any scene in the text at any one point will be bound to that scene and may be primed (i.e. brought to the forefront of the reader’s attention) if the scene is the current focus in the text being read. Once the reader moves on to a new passage, the features of that scene become ‘unprimed’ but remain bound into the scene in the reader’s memory. Thus, the reader of Heart of Darkness will at some level be aware of the initial scene on the yacht where Marlow, the original narrator and three other men are sitting, throughout. However, as this scene is not referred to again for a very long period, it is unprimed for most of the main narrative and starts to fade from the reader’s attention. The return of this scene for one final paragraph is both a jolt to the reader’s memory as they recall the features that were bound to the scene early on, and also a renewal of the many narrative layers in the novel, which could perhaps produce some scepticism for the reader as to the accuracy of the story they have just been told. Another cognitive theory, deictic shift theory (McIntyre ), helps to explain how the reader perceives the unfolding action through linguistically determined and changing points-of-view. Each of these cognitive theories envisages how some aspect of the text’s meaning is likely to be produced in the reader’s mind on the basis of features of the text. The grammatical aspects considered by these theories tend to be at levels higher than the sentence, such as textual cohesion (Halliday and Hasan ), verb tense sequences, narration types (first, second, third person, etc.) and other textual features such as transitivity or modality patterning that produce particular effects in constructing text worlds (see Werth and Gavins ). As we shall see below, there are some aspects of grammatical variation in literature where the potential effect on the reader appears to be based on a combination of what we might consider their ‘normal’ experience of the language and the foregrounded and deviant structures they encounter. This extends the cognitive trend in stylistics to include grammatical structure itself and as a result, increasingly stylistics scholars are using cognitive grammar to analyse literary texts (Harrison et al. ; see also Taylor, this volume).
.. Corpus stylistics
The advent of powerful computer technology and software to process large corpora of linguistic data has impacted practically and theoretically on the field of stylistics. The processing of literary texts using corpus software can not only tell us something about the style of a particular author4 but also show how authors produce characterization through stylistic choices (e.g., Culpeper ). Corpus analysis reveals other broader patterns, not necessarily consciously noticed by readers, detailing the background style of texts and genres, against which internally deviant features are foregrounded. Important early work in this field includes Burrows () who took computational analysis of literature to a new level by demonstrating that the behaviour of certain word classes (e.g., the modal verbs) could be shown to differentiate Austen’s characters, her speech and narration and different parts of her oeuvre.
4 Note, however, that identifying typical features of an author’s style is not the same as establishing beyond doubt who actually wrote a text. See Kahan () for a sceptical history of stylometrics.
Generic tools available through most corpus linguistic software can be used to identify regular collocational patterns, which involve words that tend to occur together (e.g., boundless energy), and are largely semantic in nature; and also to identify n-grams (see Stubbs ), significantly frequent multiword sequences (e.g., it seemed to me), used to discover expressions typical of an author, character or genre. Corpus stylistics of this kind depends on an electronic version of the text in which word forms can be identified. One good example is Ho () who demonstrates the production and testing of hypotheses about the style of John Fowles. A step up from using word-form recognition software is to use a part-of-speech tagged corpus as in Mahlberg (), who, amongst other things, examines the repetition of lexical clusters in the work of Dickens. These consist of repeated sequences of words with lexical gaps as in ‘(with) his/her ___ prep det ____’ (where ‘prep’ and ‘det’ match words tagged as prepositions and determiners, respectively): examples of this pattern include ‘with his eyes on the ground’ or ‘his hand upon his shoulder’. Such clusters demonstrate a link with more lexically-based approaches to communication (Hoey ) and developments in grammatical theory such as construction grammar which emphasizes the importance of semi-prefabricated utterances (Croft and see also Hilpert, this volume, for more on constructional approaches to grammar). Whilst construction grammar attempts to describe the facility with which speakers produce apparently new structures in general language use, however, Mahlberg’s clusters are identified as having what she terms a ‘local textual function’ indicating important features of the plot or character. Progress towards a reliable automated word class tagging tool has been made (Garside , Fligelstone et al. , Fligelstone et al. , Garside and Smith ), but correctly identifying the word class of even high percentages (– per cent) of words is not the same as producing a full grammatical analysis. Very few attempts have been made to study literature with a fully-parsed corpus of data, though Moss () is
an exception in creating and analysing a parsed corpus of the work of Henry James. (See also Wallis, this volume for more on corpus approaches to grammatical research.) Automatic parsing (structural analysis) has a much higher error rate than automatic part-of-speech tagging, and the process of manual correction is labour-intensive. Consequently, there are only a small number of fully parsed and corrected corpora, such as ICE-GB (https://www.ucl.ac.uk/english-usage/projects/ice-gb/) based at the Survey of English Usage (University College London) (see Wallis, this volume). Though some attempts have been made to semi-automate the tagging of corpora for discourse presentation (see, for example, Mahlberg’s CliC software5), there remain many difficulties with identifying the different functions of some formal structures, such as modal verbs, whose function (e.g., epistemic or deontic) can only be discerned by a human analyst (see Ziegeler, this volume, for discussion of modality). The other huge advance in corpus methods is the chance to use very large corpora (e.g., the British National Corpus: www.natcorp.ox.ac.uk/) as a baseline against which smaller corpora or individual texts or features can be compared. Although there remain problems in defining what constitutes any particular norm, the growing number of both general and specific corpora has enabled researchers to establish what might be considered a background style.
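As an illustration of the generic tools mentioned above, the sketch below counts n-grams and searches for one POS-gapped cluster of the ‘his/her ___ prep det ___’ type. It is a minimal sketch rather than the software used in the studies cited: it assumes the NLTK toolkit with its standard tokenizer and tagger models installed, and the sample sentence, the function names, and the decision to let possessives count as determiners are illustrative assumptions.

from collections import Counter

import nltk  # assumes the NLTK tokenizer and perceptron tagger models are downloaded

def ngram_counts(tokens, n=4):
    # Count contiguous n-grams, e.g. sequences of the 'it seemed to me' type.
    return Counter(" ".join(gram) for gram in zip(*(tokens[i:] for i in range(n))))

def his_her_clusters(tagged):
    # Five-token spans matching 'his/her ___ prep det ___', such as
    # 'his eyes on the ground'; possessives (PRP$) are counted as determiners here.
    hits = []
    for i in range(len(tagged) - 4):
        window = tagged[i:i + 5]
        words = [w.lower() for w, _ in window]
        tags = [t for _, t in window]
        if words[0] in {"his", "her"} and tags[2] == "IN" and tags[3] in {"DT", "PRP$"}:
            hits.append(" ".join(w for w, _ in window))
    return hits

text = "He stood with his eyes on the ground, his hand upon his shoulder, and it seemed to me he slept."
tokens = nltk.word_tokenize(text)
tagged = nltk.pos_tag(tokens)
print(ngram_counts(tokens).most_common(3))
print(his_her_clusters(tagged))

Frequencies obtained in this way only become stylistically interpretable once they are compared with a reference corpus of the kind mentioned in the final paragraph above.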
. U -
.................................................................................................................................. Most recent developments in literary language have tended towards democratization. The use of social and regional dialect forms in literature has challenged the dominance of varieties connected with education, wealth and power (section ..), and many writers have challenged not only the supremacy of one variety of English, but also the dominance of the written over the spoken mode (section ..). In addition, there have been attempts by writers to represent thought rather than language itself, and this is seen in the relative ‘ungrammaticality’ of some literary work (section ..).
5 Started at the University of Nottingham, and now a collaboration with the University of Birmingham, this software was first developed to investigate the language of Dickens, but is now being developed for wider application to literary language. One of its features is the ability to locate direct speech in suspended quotations (where the reporting clause interrupts the quoted speech). See Mahlberg et al. () for more information.
.. Representing regional and social varieties in literature
Perhaps the most obvious question to ask about the grammar of literary works is whether they use a standard form of the language. Authors can be constrained by social
and cultural pressures to use a standard form but increasingly in literatures written in English there is a sense that anything goes. Literary fashions at times dictated which variety of a language was acceptable for literary works, but this kind of linguistic snobbery is no longer the norm and geographical, social, and ethnic varieties are increasingly in use in mainstream as well as niche forms of literature. Much contemporary literature attempts to replicate dialect forms in the written language, in the dialogue and sometimes also in the narration, particularly where the narrator is a character in the story. Examples of different dialect use include the novels of James Kelman (e.g., The Busconductor Hines and How Late it Was, How Late), which are written in a Glaswegian dialect; the No. Ladies’ Detective Agency novels by Alexander McCall Smith who represents Botswanan English in his characters’ speech; Roddy Doyle whose novels (e.g., Paddy Clarke Ha Ha Ha; The Commitments) conjure up Irish dialects of English and Cormac McCarthy (e.g., Blood Meridian) whose characters and narration reflect the language of North America’s Southwest. There are many others, of course, and the increasing informality of literary prose means that the boundary between dialect and informal language is increasingly invisible (see Hodson on the use of dialect in film and literature). Since the early twentieth century, writers have found themselves increasingly able to exercise choice of variety, yet sometimes these decisions can be politically charged. Writers who operated entirely in their native dialect in the more distant past were perhaps less conscious of making a decision than those using their dialect to contest oppressive regimes or to give voice to people with less access to prestigious forms of the language. The relationship between dialect literature and oppression, however, is not always straightforward. Whilst some writers (e.g., Irvine Welsh, who used code-switching between idiomatic Scots and Standard English in both the dialogue and narration of his novel Trainspotting) might wish to assert their right to write in a non-standard dialect (or a minority language), others have been advised to do so by the powerful elites of dominant culture: Howells suggests that in order to assure both critical acclaim and commercial success, the poet should dedicate himself to writing verses only in “black” dialect (Jarrett : ).
Here, we see Jarrett’s representation of the advice from a benevolent, if patronising, literary reviewer and publisher, William Howells, to the poet and fiction writer Paul Laurence Dunbar (–), who was ‘torn between, on the one hand, demonstrating his commitment to black political progress and, on the other, representing blacks as slaves and their language as dialect, which are what prominent literary critics and publishers expected of him and black writers in general’ (Jarrett : ). The representation of variation in language use has been a significant part of characterization at least since Chaucer’s time, when, as Phillips (: ) says, ‘Chaucer’s fictional language lessons run counter to much conventional wisdom about late medieval multilingual practice, as he celebrates enterprising poseurs instead of diligent scholars or eloquent courtiers’. Whilst Chaucer’s pilgrims were found code-switching repeatedly between English, French, and Latin, Shakespeare’s characters were more prone to
being caricatured by isolated and repeated dialect forms such as ‘look you’ for the Welsh characters (e.g., Fluellen in Henry V or Evans in Merry Wives of Windsor) or incorrect Early Modern English grammar, such as ‘this is lunatics’ (Crystal a: ).
.. Representation of spoken/informal language in literature There are many ways in which grammatical variation is used to invoke the spoken language in literary works, often to produce realism, but also to produce foregrounding by internally deviant uses of non-standard spoken forms contrasting with surrounding standard forms. The literary effects of this are wide-ranging, but include the kind of pathos we find in Yorkshire poet Tony Harrison’s poems about his (dead) parents, where the sounds and structures of his father’s voice, for example, are blended with the poet’s own educated standard English style. Thus, we find the opening line of ‘Long Distance II’ includes ‘my mother was already two years dead’ (Harrison : ), where a standard version would change was dead to had been dead. Apart from advantages for the scansion and rhyme-scheme, this hints at the humble origins of the poet whose life has taken such a different path to those of his parents. Harrison’s evocation of his father in ‘Long Distance I’ includes non-standard demonstratives preceding nouns such as ‘them sweets’ (instead of those sweets) which evoke the old man whose dialect the poet used to share. Much of Harrison’s early poetry relates his experience of becoming differentiated from his family background by his education and he mostly writes in Standard English where his ‘own’ voice is being represented. To find occasional hints of his earlier dialect in amongst the standard forms, then, directly references his bifurcated identity and is a kind of iconic use of dialect forms to reflect the meaning of the poem directly. As well as including regional dialect forms, writers have increasingly adopted informal modes of expression, even in novels. Some writers adopt an almost entirely consistent spoken dialect (e.g., Irvine Welsh in Trainspotting) but others represent spoken language only in the reported speech of novels or the script of plays. Modern grammatical description based on spoken corpora of language has developed ways of identifying patterns in this informal usage of speakers (Carter and McCarthy ) which show that it is also highly structured. Examples include utterances that are grammatically indeterminate (e.g., Wow); phrases that function communicatively in context without clause structure (e.g., Oh that); aborted or incomplete structures (e.g., A bit) and back-channelling (e.g., Mm or Yeah) where speakers support each other (examples from Carter ; see also Dorgeloh and Wanner, this volume). The direct quoting of spoken language is a staple of narrative fiction and there are also clear grammatical regularities in any indirect quotation, so that present tense verbs take on a past form when quoted indirectly, first person becomes third person and
any proximal deixis becomes distal. Thus I am coming home next Tuesday turns into He said he was going home the following Tuesday. These clear categories of direct and indirect speech, however, have been shown by stylistics to be more complex in practice, as there are other categories of speech presentation, including perhaps the most versatile and interesting category, ‘free indirect speech’ (Leech and Short : –). With free indirect speech, some of the lexis and grammatical form of the quoted speaker is included, although some changes in form (i.e. tense, person, and deixis) match those for indirect speech. Thus, we might see He was coming home next Tuesday as a kind of blend of the two examples above, presenting the reader with both a narrated version of the speech and also something of the flavour of the original utterance, with the tense and person changed, but the deixis left intact. More or fewer of the features of indirect speech may be included, resulting in an apparently more or less faithful version of the original utterance. However, context is crucial for understanding whose words are being represented, as the loss of quotation marks and reporting clause can make the utterance appear to belong to the narrator or implied author instead. Whilst not unique to literary texts, free indirect speech is particularly useful to fiction-writers for blending the point-of-view of their characters with that of the narrator and/or implied author.
.. Representing cognitive patterns through ‘ungrammaticality’ One consequence of increasing variation in literary language is the experimentation with non-dialectal ‘ungrammaticality’ which has featured in many movements including modernist writing (e.g., Ernest Hemingway, William Faulkner, Virginia Woolf ), L=A=N=G=U=A=G=E poetry (e.g., Leslie Scalapino, Stephen Rodefer, Bruce Andrews) and feminist poetry (e.g., Fleur Adcock, U. A. Fanthorpe, Carol Rumens). The twentieth century in particular was a time when structuralist views of how human language worked induced in some writers the conviction that the language they spoke and used on a daily basis was inappropriate to their needs as artists, as it was likely to reproduce and reinforce social attitudes and inequalities embedded in the community from which it arose. This view arose from popular versions of the ‘Whorfian hypothesis’ (or ‘Sapir–Whorf hypothesis’; see Aarts et al. ) which, in its strong form, claimed that human beings were limited to thinking along lines laid out by the language they spoke. Feminists responded strongly to this theory as helping to explain some aspects of their oppression in a patriarchal society (see, for example, Spender ). Creative responses to this hypothesis, now accepted in a much weaker form (that language influences thought rather than determining it), went beyond the more basic reaction against the (standard) language of a recognized oppressor, with the result that writers tried to reflect mental and emotional states directly through a fragmentary and ‘new’ form of the language.
Here is an example from ‘Droplets’ by Carol Rumens, who, like other contemporary poets, has responded to the freedom of challenging grammatical ‘correctness’ in her poems: Tiniest somersaulter through unroofed centuries, puzzling your hooped knees as you helter-skelter, dreamy, careless, Carol Rumens, ‘Droplets’ (Hex, Bloodaxe Books, , p. )6
6 With thanks to Carol Rumens for permission to use this extract.
Not only does Rumens play with the morphology here, creating nouns from verbs (‘somersaulter’) and verbs from nouns (‘helter-skelter’) as well as possibly nonce-derived forms such as ‘unroofed’, she also fails to resolve the syntax by not introducing any main clause verbs until so late that they seem to have no real connection to the long series of images conjured up by multiple noun phrases and subordinate clauses. Her explanation for this (private communication) is that it is an ‘apostrophe’, a poetic form in which the speaker in a poem addresses someone or something not present. This may have influenced the long sequence of address forms, but it does not alter the fact that the syntax, by everyday standards, is stretched to the limit. These literary responses to language as a kind of socio-cultural straightjacket originally resulted in writing that was deemed ‘too difficult’ by the reading public, though in some cases, such as the poetry of T. S. Eliot, critics celebrated the innovations. Over time, however, writers reverted to exploiting in less extreme ways some of the flexibilities built into the ‘system’ of the language, bending the rules without completely abandoning them and relying on more subtle forms of foregrounding to communicate their meaning. In addition, poetry readers became more accustomed to seeing the grammar stretched in new ways and were better able to process the results. In a number of cases, such examples of gentle, but often repetitive grammatical abnormality—or spoken forms—have been used to produce a particular kind of characterization to identify the neuro-atypicality of characters and indicate their unusual ways of seeing the world. This technique is labelled ‘mind-style’ in stylistics (Fowler ) and, unlike the modernist and other writers expressing their own angst in relation to the world, mind-style is a use of ungrammaticality to express—and to try to understand—the world view of people we may otherwise not comprehend. There are a few famous and well-studied novels that feature the language of a character whose cognitive processes differ from those of the majority population. These include, for example, characters with learning difficulties such as Benjy in Faulkner’s The Sound and the Fury and characters who appear to lack the normal cognitive abilities of Homo sapiens such as the Neanderthal, Lok, in Golding’s The Inheritors (Leech and Short ). Other studies have focussed on characters showing
symptoms of high-functioning autism such as Christopher in Haddon’s The Curious Incident of the Dog in the Night-time (Semino ). In the case of Benjy and Lok, from totally different worlds, but sharing a lack of understanding about causation, we find, for example, that transitive verbs are used without their objects, since the connections between the action and its effect are missing (Leech and Short : – and – respectively). Here, for example, Benjy is describing a game of golf: ‘Through the fence, between the curling flower spaces, I could see them hitting’. Possibly because of the distance (he is looking through the fence) or because he doesn’t understand the purpose of golf, Benjy just describes the actions of the golfers and leaves the normally transitive verb without its goal. The reader has to work out what is going on from this and similar sentences. In the case of Christopher, the linguistic correlates of his cognitive style include his lack of use of pro-forms, resulting in repetitive parallel structures (e.g., ‘I do not like proper novels. In proper novels people say . . . ’) and his over-use of coordinated structures rather than subordinated structures (e.g., ‘I said that I wanted to write about something real and I knew people who had died but I did not know any people who had been killed, except Edward’s father from school, Mr Paulson, and that was a gliding accident, not murder, and I didn’t really know him’) (Semino ). In all these cases, the grammatical oddness provides clues as to the character’s cognitive state, rather than an attempt to literally reproduce the likely language of such a character. Similarly, where a short-term rather than a chronic cognitive impairment is depicted by textual means, the apparent breakdown of language may symbolically represent the mental distress of the character, rather than directly representing their language use. An example is the William Golding novel Pincher Martin, whose protagonist is drowning and where the stylistic choices reflect his mental state as his body parts appear to take on a life of their own: ‘His hand let the knife go’. A fuller analysis of this phenomenon in the novel can be found in Simpson (: –) who shows that as Pincher Martin gets closer to death, the role of Actor in relation to material action clauses is increasingly taken by either body parts or by inanimate entities such as ‘the lumps of hard water’ or ‘sea water’. Pincher Martin stops having agency and is reduced to an observer of his own demise.
. I , ,
.................................................................................................................................. The discussion of grammatical variation in literature in this section will compare literary style with general or statistical norms of grammatical behaviour in English. In considering grammatical aspects of literary style, there are interesting effects that poets can achieve by the subtlest of manipulations of the ‘normal’ grammatical resources of English. The effects examined below do, of course, occur in prose as well as poetry and in non-literary as well as literary texts, but they seem peculiarly concentrated in
poems as some of them are likely to have detrimental effects on comprehension in the more functional genres, which, unlike poetry, depend on conformity to norms of structure. Poetic style depends a great deal on the local effects of individual stylistic choices made by the poet, which are therefore more often foregrounded effects, rather than cumulative background effects of a pattern of choices. However, some of these choices also build up over a text to produce a particular topographical effect which may provide the norm against which a later internally deviant feature could be contrasted. In this section I will use the semiotic concept of ‘iconicity’ to refer to the kind of direct meaning-making that emanates from certain grammatical choices made by poets. Whilst not as directly mimetic as the sound of a word (e.g., miaouw) representing the sound it refers to, this form of iconicity can be said to be mimetic insofar as it exploits the inevitable temporal processing of language (Jeffries b). It is no surprise, then, to find that the speeding up and delaying of time in structural ways forms a large part of the iconicity to be found amongst grammatical choices in poetry.
.. Nominal versus verbal grammar and literary effect For convenience, we might argue that language use is largely made up of labelling/naming (Jeffries a) and of the representation of processes and states. Roughly speaking, these divisions of meaning match nominal and verbal structures. The remainder of the language is largely subsumed into these divisions (e.g., adjectives often occur within noun phrases; many adverbs are attached to verbs) and what is not accounted for in this way—mostly adverbials and a few adjectival complements—forms a tiny part of the whole. Whereas grammatical description is mainly concerned with the structures present in linguistic data, stylisticians have an additional perspective based on their awareness that producers of text repeatedly make choices between different options. One of the most frequent decisions that writers make, albeit subconsciously, is whether to put information into the nominal or the verbal (or sometimes the adverbial) parts of the structure. It may seem surprising that such choices are available on a regular basis, but if we consider the option to nominalize processes, the existence of denominal verbs and the presence of processes within the modification of noun phrases, there is less of a clear division between nominal and verbal information than might seem to be the case at first sight. So, for example, the choice of a denominal verb in the opening of Sansom’s poem ‘Potatoes’ focuses more on the whole action than would the more standard me, putting them in a bucket: The soil turns neatly as he brings them to light; And me, bucketing them. Peter Sansom, ‘Potatoes’ (Point of Sale, Carcanet, , p. )7
7 Extract used by permission of Carcanet Press Limited; © Carcanet Press Limited, Manchester, UK.
This kind of inventiveness, using recognizable derivational processes, is standard fare for poets, but it demonstrates an important feature of grammatical variation in literature. In order to be able to understand creative uses of language, the reader must be able to refer to some systematic core of the language. In this case, we may be subconsciously aware of all sorts of container-to-action zero derivations of this kind in the standard vocabulary (as in bottle the wine or box the eggs) which give us the pattern for the usage in this line. Note that we are unlikely to link it to an existing verbal meaning as in the rain was bucketing down unless it fits the context. As with morphology, in syntax there are many examples where the decision to put information either nominally or verbally has a literary effect. One of the reasons that Newlyn’s poem ‘Comfortable box’, about her childhood in Yorkshire, has the impact of a photograph capturing a fleeting moment is because it is almost entirely made up of noun phrases, lacking a main-clause tensed verb, so there is no link to time, present or past: Nothing so cheerfully compact as the full fat cardboard box delivered Fridays, proudly stacked by our man from Groocock’s. Lucy Newlyn, ‘Comfortable box’ (Ginnel, Carcanet, , p. )8
One could argue that there is an ellipted ‘dummy’ clause here (There is nothing so cheerfully compact . . . ) but the effect would surely be different if one were to add the ‘missing’ elements, as the present tense seems at odds with what is clearly a memory while the past tense (There was nothing so cheerfully compact . . . ) lacks the vividness of the original. The choice made by the poet is not to follow the usual clause structure, thus avoiding what is normally expected in main clauses: that it will be located in some kind of time frame, rather than a photo frame. Noun phrases with no predicate are a common grammatical occurrence in casual conversation, as for example in the answer to a question (Who stole the bike?—The man next door). However, these are normally interpreted in relation to some prior language, in this case the question. By contrast, the use of noun phrases with no further clause structure and no prior language is a staple of poetic language and common too in other literary forms. The reason for this is the variability of effect that can be achieved by naming things and people, with as much description as the author desires, but without fixing them into a time-limited, verbally mediated relationship with the context.
8 Extract used by permission of Carcanet Press Limited; © Carcanet Press Limited, Manchester, UK.
.. Verbal delay and other clausal variation
If the lack of a main-clause tensed verb can produce in the reader the effect of timelessness that the poem is trying to communicate, this is just one of the ways in
which syntax can be iconic. Jeffries (b) argues that certain syntactic adjustments which diverge from the norms of English could be said to have a more or less directly iconic effect on the reader: Through the juxtaposition of subordinate clauses and main clauses, the poet may cause the reader not just to perceive but to actually experience some of the same feelings of frustration and resignation that are being described. (Jeffries b: )
In this case, the Fanthorpe poem ‘The Unprofessionals’ was under scrutiny. In this poem the feelings of someone being visited after something terrible (a bereavement?) had happened are reflected in the syntax as well as the semantics of the poem. Here the main-clause tensed verb together with its subject (‘They come’) arrives only after three lines of adverbial delay beginning ‘When the worst thing happens . . . ’. This has the effect of a structural hiatus representing directly the feeling of disorientation after the ‘worst thing’ and before the relief of friends and neighbours starts to arrive with ‘They come’. The remainder of the poem lacks further main-clause verbs (except a repeat of ‘they come’). The effect is to somewhat undermine the effect of relief since there is a very long list of non-finite verbs (holding, talking, sitting, doing etc.) which imply busyness, possibly without purpose or result. This effect is confirmed by informal discussions with readers of the poem who often vary between seeing the ‘unprofessionals’ as good people (i.e. informal relief from the situation) or as bad people (i.e. busybodies who don’t know when to leave). There are very many similar examples, and not just in poetry, but here is a simple one from a poem by Helen Dunmore: The queue’s essentially docile surges get us very slowly somewhere. Helen Dunmore, ‘The Queue’s Essentially’ (The Malarkey, Bloodaxe Books, , p. )
As argued elsewhere (e.g., Jeffries b: ), the English reader is keen to arrive at a main-clause verb in a sentence. One of the effects of the long noun phrases discussed in the last section is that the reader will be wondering when the verb is coming. In cases like the Newlyn extract discussed there, it never does and the reader has to learn to live outside time. In this short example from Dunmore, the verb does eventually arrive, but is still quite late after a relatively long subject (up to and including ‘surges’). The topic of the stanza, which observes how queues work, slowly but surely, is reflected in this initial wait (for the verb). Once the verbal element is arrived at, it is then followed immediately not by the obligatory locative complement (‘somewhere’) which the verb requires, but by an optional, and therefore potentially frustrating, adverbial of manner whose own structure includes a structurally unnecessary intensifier (‘very’) which also adds to the impression of waiting too long for syntactic closure.
There are a number of ways of making the reader (of English at least) wait for the main clause verb of the sentence, including the use of a long subject, as we saw above, and also the inclusion of a long optional adverbial or a string of adverbials as we see in the opening of Davidson’s poem ‘Margaret in the Garden’: With your new strappy sandals and summer top and sat in the one garden chair against the fence, with the light breeze lifting the hem of your skirt and the corners of your Saturday newspaper, you remind me of one of those domestic landscapes from between the wars, now so laden with foreboding that we feel worried when we look at them. Jonathan Davidson, ‘Margaret in the Garden’ (Early Train, smith|doorstop, , p. )9
The structure of this sentence is marked, because there is almost an excess of detail in the first four lines which act both informationally and grammatically as the preface to the main proposition (‘you remind me . . . ’). This produces an effect like the opening frames of a film where the shot pans out from the detail of the clothing to show the garden chair, the fence, and then the wind’s minor effects on an otherwise static scene. There are a number of potential effects of waiting for the subject and main verb to arrive. Like the noun phrases with no clause structure, the scene remains somehow timeless, until the verb is arrived at. In this particular case, the verb does not break the spell since it describes a relatively static process (‘remind’), which is then linked to a visual image (‘domestic landscape’), confirming the snapshot or filmic effect of the first four lines. As we have seen in this section, the effect of playing with what grammarians might call ‘information structure’ (see Kaltenböck, this volume) can be to produce direct experiences for readers responding to abnormal or exaggerated structural effects.
9 Extract used with permission of The Poetry Business, Sheffield, UK.

Conclusion

This chapter has attempted to survey the wide range of what can be said about grammatical variation in literary texts, and to illustrate in a little more detail some of the developments in stylistics that show how grammatical concerns are taken up in relation to literary texts. Literature has been shown to reflect everything from the reality of spoken variation, through geographical or social dialects, to the individual characteristics of fictional characters or the author’s attempt to convey something of the mental state of the narrator or characters in their work. In addition, studies of the language of
literature have tried to demonstrate how it may directly induce certain cognitive states in the reader through the manipulation of the norms of grammatical structure. Though stylistics has often been portrayed as a kind of applied version of linguistics proper, there are many ways in which what it shows about textual meaning is central to the endeavours of linguistics in understanding and describing human language in all its forms (Jeffries ). Having developed a range of theories and models of textual meaning, many of which focus on the reader’s perspective, stylistics has some questions to ask about grammatical structure and meaning which may be testing for theories of grammar.

I have tried to show in this chapter that although there have been some specific literary registers in the past, and these may still exist in some languages, for much of literature the resources on which it draws are identical to those for any other language use. The literary and aesthetic issues that arise are a kind of secondary effect derived from the normal effects of the same linguistic (including grammatical) features. These secondary effects arise in various ways, such as:

• using non-standard varieties of language in unexpected contexts, genres, or manners
• using ungrammaticality for particular literary effect (e.g., to represent the breakdown of characters)
• deviating from grammatical norms (e.g., in the length of the subject) for iconic effect on the reader
• interpreting grammar in the light of the genre of the text.

Grammatically deviant language can occur in a range of text types and genres, including advertising, and unlike the grammatical deviance of, for example, learners or those with speech and language pathologies, this deviation is deliberate (though not always self-conscious) and used for particular meaningful effect. As readers, we rely on our knowledge of some kind of core grammatical system in order to interpret the new structures, and these same techniques are used to interpret unusual or accidental language encountered in non-literary situations. The next stage in understanding the grammar of literary works may well be to expand the range of grammatical and other theories that are used to explore literary meaning, in order to explain the interpretation of more or less deviant language. This has already begun in cognitive stylistics (see, for example, Stockwell ), but there is more to be done.

At the beginning of this chapter, we saw Sinclair’s claim that linguistic description needed to ‘embrace’ literary usage as a natural part of human language, and I have tried to demonstrate in the discussion above how literary style both depends on, and differs from, ordinary everyday language. If we are to explain how readers decode and interpret literary texts, we either have to posit a huge range of different abilities which are tuned to respond to texts which look a bit like ‘normal’ language but are different in kind, or we have to assume that the process of decoding and interpreting literary language a) is based on exactly the same skills and abilities as understanding
everyday language, and b) relies on this ‘normal’ experience to interpret those aspects of literary texts that diverge from the norm. Of course, I am arguing for the latter, and this brings with it a significant conclusion: that language users possess some kind of mentally stored core grammar on which they rely for understanding both regular and deviant texts. This is not to argue that these internal grammars of speakers exist from birth, nor that they are immutable. One possible view of grammatical understanding would be to accept that we develop internal grammars through experience, and that, while these grammars become relatively stable over time without ever being completely fixed, they remain stable enough to serve as a baseline against which to judge and interpret new experiences, whether those are produced accidentally (as in the grammar of learners or impaired speakers) or deliberately (as in literary texts).
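As a final, purely illustrative aside (not a claim about how readers actually compute anything), the idea of a stable baseline can be pictured as a simple comparison between a text’s pre-verbal delays and the norm for some reference sample of ordinary prose. All of the figures below are invented, and the z-score is just one convenient way of expressing distance from a norm.

import statistics

# Hypothetical per-sentence delay counts (tokens before the main verb), e.g. as
# produced by the earlier illustrative sketches, for two invented samples.
baseline_delays = [2, 3, 1, 4, 2, 3, 2, 5, 3, 2]  # 'ordinary prose' sample (invented)
poem_delays = [9, 7, 11]                          # poem sample (invented)

mean = statistics.mean(baseline_delays)
sd = statistics.stdev(baseline_delays)

# Distance of each poem sentence from the prose norm, in standard deviations.
z_scores = [(d - mean) / sd for d in poem_delays]
print([round(z, 1) for z in z_scores])

Sentences that sit several standard deviations above such a norm are, on this operationalization, exactly those a reader is likely to experience as making them wait, which is one modest way of cashing out the notion of a baseline grammar against which deviation is perceived.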
References
Aarts, Bas (). ‘English binominal noun phrases.’ Transactions of the Philological Society (): –. Aarts, Bas (). ‘Corpus linguistics, Chomsky and Fuzzy Tree Fragments’, in Christian Mair and Marianne Hundt (eds), Corpus Linguistics and Linguistic Theory. Amsterdam: Rodopi, –. Aarts, Bas (). ‘Subordination’, in Keith Brown (ed), The Encyclopedia of Language and Linguistics, nd edn. (vol. ). Oxford: Elsevier, –. Aarts, Bas (a). Syntactic Gradience: The Nature of Grammatical Indeterminacy. Oxford: Oxford University Press. Aarts, Bas (b). ‘In defence of distributional analysis, pace Croft.’ Studies in Language. : –. Aarts, Bas (). Oxford Modern English Grammar. Oxford: Oxford University Press. Aarts, Bas (). ‘The subjunctive conundrum in English.’ Folia Linguistica (): –. Aarts, Bas (). English Syntax and Argumentation, th edn. Basingstoke: Palgrave Macmillan. Aarts, Bas, and Liliane Haegeman (). ‘English word classes and phrases’, in Bas Aarts and April MacMahon (eds), The Handbook of English Linguistics. Oxford: Blackwell, –. Aarts, Bas, and Sean Wallis (). ‘Noun phrase simplicity in spoken English’, in Ludmila Veselovská and Markéta Janebová (eds), Complex Visibles Out There (Proceedings of the Olomouc Linguistics Colloquium : Language Use and Linguistic Structure). Olomouc: Palacký University, –. Aarts, Bas, Sylvia Chalker, and Edmund Weiner (). The Oxford Dictionary of English Grammar, nd edn. Oxford: Oxford University Press. Aarts, Bas, Joanne Close, and Sean Wallis (). ‘Choices over time: Methodological issues in current change’, in Bas Aarts, Joanne Close, Geoffrey Leech, and Sean Wallis (eds), The Verb Phrase in English. Cambridge: Cambridge University Press, –. Aarts, Bas, David Denison, Evelien Keizer, and Gergana Popova (eds) (). Fuzzy Grammar: A Reader. Oxford: Oxford University Press. Aarts, Flor, and Jan Aarts (). English Syntactic Structures. New York/Leiden: Prentice Hall and Martinus Nijhoff (First published in by Pergamon Press). Abney, Steven P. (). The English Noun Phrase in its Sentential Aspect. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Aboh, Enoch (). ‘Topic and Focus within D’, in Linguistics in the Netherlands . Amsterdam: John Benjamins, –. Abraham, Werner (). ‘The aspectual source of the epistemic-root distinction of modal verbs’, in Winfried Boeder, Christoph Schroeder, Karl Heinz Wagner, and Wolfgang Wildgen (eds), Sprache in Raum und Zeit: In Memoriam Johannes Becher [Band : Beiträge zur Empirischen Sprachwissenschaft]. Tübingen: G. Narr, –. Abraham, Werner, and Elisabeth Leiss (eds) (). Modality-Aspect Interfaces: Implications and Typological Solutions. Amsterdam: John Benjamins.
Achard, Michel (). ‘The syntax of French raising verbs’, in Alan Cienki, Barbara J. Luka, and Michael B. Smith (eds), Conceptual and Discourse Factors in Linguistic Structure. Stanford: CSLI Publications, –. Achard, Michel (). ‘Complementation’, in Dirk Geeraerts and Hubert Cuyckens (eds), The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, –. Ackema, Peter, and Maaike Schoorlemmer (). ‘The middle construction and the syntaxsemantics interface.’ Lingua, : –. Ackerman, Farrell, Gregory T. Stump, and Gert Webelhuth (). ‘Lexicalism, periphrasis, and implicative morphology’, in Robert D. Borsley and Kersti Börjars (eds), Nontransformational Syntax: Formal and Explicit Models of Grammar. Oxford: Wiley Blackwell, –. Acquaviva, Paolo, and Phoevos Panagiotidis (). ‘Lexical decomposition meets conceptual atomism.’ Lingue e Linguaggio : –. Acuña-Fariña, Juan Carlos (). The Puzzle of Apposition: On So-called Appositive Structures in English. Universidade de Santiago de Compostela. Acuña-Fariña, Juan Carlos (). ‘Aspects of the grammar of close apposition and the structure of the noun phrase.’ English Language and Linguistics (): –. Ades, Anthony E., and Mark Steedman (). ‘On the order of words.’ Linguistics and Philosophy (): –. Adger, David (). Core Syntax: A Minimalist Approach. Oxford: Oxford University Press. Adger, David (). A Syntax of Substance. Cambridge, MA: MIT Press. Åfarli, Tor A. (). ‘Do verbs have argument structure?’, in Eric Reuland, Tanmoy Bhattacharya, and Giorgos Spathas (eds), Argument Structure. Amsterdam: John Benjamins, –. Ágel, Vilmos (). Valenztheorie. Tübingen: Narr. Ágel, Vilmos, Ludwig M. Eichinger, Hans-Werner Eroms, Peter Hellwig, Hans Jürgen Heringer, and Henning Lobin (eds) (). Dependenz und Valenz: Ein Internationales Handbuch Der Zeitgenössischen Forschung. Dependency and Valency: An International Handbook of Contemporary Research. . Halbband/Volume . Berlin/New York: De Gruyter. Ágel, Vilmos, Ludwig M. Eichinger, Hans-Werner Eroms, Peter Hellwig, Hans Jürgen Heringer, and Henning Lobin (eds) (). Dependenz und Valenz: Ein Internationales Handbuch der Zeitgenössischen Forschung. Dependency and Valency: An International Handbook of Contemporary Research. . Halbband/Volume . Berlin/New York: De Gruyter. Aikhenvald, Alexandra Y. (). Imperatives and Commands. Oxford: Oxford University Press. Aikhenvald, Alexandra Y., and R. M. W. Dixon (eds) (). Serial Verb Constructions: A Cross-Linguistic Typology. Oxford: Oxford University Press. Ainsworth-Darnell, Kim, Harvey G. Shulman, and Julie E. Boland (). ‘Dissociating brain responses to syntactic and semantic anomalies: Evidence from event-related potentials.’ Journal of Memory and Language : –. Aitchison, Jean (). Language Change: Progress or Decay? Cambridge: Cambridge University Press. Ajdukiewicz, Kazimierz (). ‘Die syntaktische Konnexität.’ Studia Philosophica : –. Translated as Ajdukiewicz (). Ajdukiewicz, Kazimierz (). ‘Syntactic connexion’, in McCall Storrs (ed), Polish Logic, –. Oxford: Clarendon Press, –. Translation of Ajdukiewicz ().
Akmajian, Adrian, and Adrienne Lehrer (). ‘NP-like quantifiers and the problem of determining the head of an NP.’ Linguistic Analysis (): –. Alexiadou, Artemis (a). ‘Nominalizations: A probe into the architecture of grammar. Part I: the nominalization puzzle.’ Language and Linguistics Compass, : –. Alexiadou, Artemis (b). ‘Nominalizations: A probe into the architecture of grammar. Part II: the aspectual properties of nominalizations, and the lexicon vs. syntax debate.’ Language and Linguistics Compass, : –. Alexiadou, Artemis (). ‘Syntax and the lexicon’, in Tibor Kiss and Artemis Alexiadou (eds), Syntax – Theory and Analysis: An International Handbook, Handbücher zur Sprachund Kommunikationswissenschaft. Berlin: Mouton de Gruyter Verlag, –. Alexiadou, Artemis, Elena Anagnostopoulou, and Florian Schäfer (). External Arguments in Transitivity Alternations: A Layering Approach. Oxford: Oxford University Press. Alexiadou, Artemis, Liliane Haegeman, and Melita Stavrou (). Noun Phrase in the Generative Perspective. Berlin: Mouton de Gruyter. Alexiadou, Artemis, Monika Rathert, and Arnim von Stechow (eds) (). Perfect Explorations. Berlin/New York: Mouton de Gruyter. Allan, Keith (). Natural Language Semantics. Oxford: Blackwell. Allan, Keith, and Kasia M. Jaszczolt (eds) (). The Cambridge Handbook of Pragmatics. Cambridge: Cambridge University Press. Allen, Margaret Rees (). Morphological Investigations. Ph.D. thesis. Storrs, CT: University of Connecticut. Allerton, D. J. (). ‘The notion of “givenness” and its relation to presupposition and to theme.’ Lingua : –. Allerton, D. J. (). Valency and the English Verb. London/New York: Academic Press. Allott, Stephen (). Lindley Murray –. York: Sessions Book Trust. Allwood, Jens, Lars-Gunnar Andersson, and Östen Dahl (). Logic in Linguistics. Cambridge: Cambridge University Press. Al-Mutairi, Fahad Rashed (). The Minimalist Program: The Nature and Plausibility of Chomsky’s Biolinguistics. Cambridge: Cambridge University Press. Aloni, Maria, and Paul Dekker (eds) (). The Cambridge Handbook of Formal Semantics. Cambridge: Cambridge University Press. Alston, R. C. (). A Bibliography of the English Language from the Invention of Printing to the Year . Vol. . English Grammars Written in English. Leeds: E. J. Arnold. Alston, William (). Illocutionary Acts and Sentence Meaning. Ithaca: Cornell University Press. Anderson, John. M. (). ‘Structural analogy and universal grammar.’ Lingua, : –. Anderson, Stephen R. (). A-Morphous Morphology. Cambridge: Cambridge University Press. Anderson, Stephen R. (). ‘The morpheme: Its nature and use’, in Matthew Baerman (ed), The Oxford Handbook of Inflection. Oxford: Oxford University Press, –. Anderwald, Lieselotte (). Negation in Non-Standard British English: Gaps, Regularizations and Asymmetries. London: Routledge. Anderwald, Lieselotte (). ‘The varieties of English spoken in the Southeast of England: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Anderwald, Lieselotte (). ‘Negation in Varieties of English’, in Raymond Hickey (ed), Areal Features of the Anglophone World. Berlin/Boston: Mouton De Gruyter, –.
Anderwald, Lieselotte, and Bernd Kortmann (). ‘Typological methods in dialectology’, in Manfred Krug and Julia Schlüter (eds), Research Methods in Language Variation and Change. Cambridge: Cambridge University Press, –. Andreou, Marios, and Angela Ralli (). ‘Revisiting exocentricity in compounding’, in Ferenc Kiefer, Mária Ladányi, and Péter Siptár (eds), Current Issues in Morphological Theory: (Ir)regularity, Analogy and Frequency. Amsterdam/Philadelphia: Benjamins, –. Andrews, Avery D. (). ‘Lexical structure’, in Frederick J. Newmeyer (ed), Linguistics: The Cambridge Survey. Vol. . Cambridge: Cambridge University Press, –. Andrews, Avery D. (). ‘Propositional glue and the architecture of LFG.’ Linguistics and Philosophy (): –. Anonymous (). The Art of Teaching in Sport. London: John Marshall. Anthony, Laurence (). ‘AntConc: Design and development of a freeware corpus analysis toolkit for the technical writing classroom’, in: Proceedings of IPCC , –. Anttila, Arto, and Young-mee Yu Cho (). ‘Variation and change in optimality theory.’ Lingua, , –. Arbini, Ronald (). ‘Tag-questions and tag-imperatives in English.’ Journal of Linguistics : –. Ariel, Mira (). ‘Discourse, grammar, discourse.’ Discourse Studies (): –. Arnauld, Antoine, and Claude Lancelot (). Grammaire Générale et Raisonnée. Paris: Pierre le Petit. Arnold, Doug, and Andrew Spencer (). ‘A constructional analysis for the skeptical’, in Stefan Müller (ed), Proceedings of the HPSG Conference. Stanford, CA: CSLI Publications, –. Arnold, Jennifer E., Anthony Losongco, Thomas Wasow, and Ryan Ginstrom (). ‘Heaviness vs. newness: The effects of structural complexity and discourse status on constituent ordering.’ Language : –. Aronoff, Mark (). Word Formation in Generative Grammar. Cambridge, MA: MIT Press. Aronoff, Mark (). Morphology by Itself: Stems and Inflectional Classes. Cambridge, MA: MIT Press. Aronoff, Mark, and Mark Lindsay (). ‘Productivity, blocking, and lexicalization’, in Rochelle Lieber and Pavol Štekauer (eds), The Oxford Handbook of Derivational Morphology. Oxford: Oxford University Press, –. Asudeh, Ash (). Resumption as Resource Management. Ph.D. thesis. Stanford, CA: Stanford University. Asudeh, Ash (a). ‘Control and semantic resource sensitivity.’ Journal of Linguistics : –. Asudeh, Ash (b). ‘Relational nouns, pronouns, and resumption.’ Linguistics and Philosophy (): –. Asudeh, Ash (). ‘Direct compositionality and the architecture of LFG’, in Miriam Butt, Mary Dalrymple, and Tracy Holloway King (eds), Intelligent Linguistic Architectures: Variations on Themes by Ronald M. Kaplan. Stanford, CA: CSLI Publications, –. Asudeh, Ash (). The Logic of Pronominal Resumption. Oxford: Oxford University Press. Asudeh, Ash, and Gianluca Giorgolo (). ‘Flexible composition for optional and derived arguments’, in Miriam Butt and Tracy Holloway King (eds), Proceedings of the LFG Conference. Stanford, CA: CSLI Publications, –. Asudeh, Ash, and Ida Toivonen (). ‘With lexical integrity.’ Theoretical Linguistics (–): –.
Asudeh, Ash, Mary Dalrymple, and Ida Toivonen (). ‘Constructions with lexical integrity.’ Journal of Language Modelling (): –. Asudeh, Ash, Gianluca Giorgolo, and Ida Toivonen (). ‘Meaning and valency’, in Miriam Butt and Tracy Holloway King (eds), Proceedings of the LFG Conference. Stanford, CA: CSLI Publications. Athanasiadou, Angeliki, Costas Canakis, and Bert Cornillie (eds) (). Subjectification: Various Paths to Subjectivity. Berlin: Mouton De Gruyter. Atkinson, Dwight (). ‘The evolution of medical research writing from to : The case of the Edinburgh Medical Journal.’ Applied Linguistics : –. Atkinson, Dwight (). Scientific Discourse in Sociohistorical Context. London: Lawrence Erlbaum. Atlas, Jay David (). ‘Presupposition’, in Horn and Ward (), –. Austin, J. L. (). How to do Things with Words. Oxford: Clarendon Press. Austin, J. L. (). How to Do Things with Words. nd edn. Oxford: Oxford University Press. Bach, Emmon (). ‘An extension of classical transformational grammar’, in Problems of Linguistic Metatheory, –. Michigan State University. Proceedings of the Conference. Bach, Kent (). ‘Conversational impliciture.’ Mind and Language (): –. Bach, Kent (). ‘Quantification, qualification and context: A reply to Stanley and Szabó.’ Mind and Language (–): –. Bach, Kent (). ‘Impliciture vs. explicature: What’s the difference?’, in Belén Soria and Esther Romero (eds), Explicit Communication: Robyn Carston’s Pragmatics. Basingstoke: Palgrave Macmillan, –. Bach, Kent, and Robert M. Harnish (). Linguistic Communication and Speech Acts. Cambridge, MA: MIT Press. Baerman, Matthew, Dunstan Brown, and Greville G. Corbett (). The Syntax-Morphology Interface: A Study of Syncretism. Cambridge: Cambridge University Press. Baeskow, Heike (). ‘Semantic restrictions on word-formation: The English sufffix -ee’, in Peter O. Müller, Ingeborg Ohnheiser, Susan Olsen and Franz Rainer (eds), Word-Formation: An International Handbook of the Languages of the World, vol. . Berlin: Mouton De Gruyter, –. Baker, Mark C. (). Lexical Categories. Cambridge: Cambridge University Press. Baker, Paul, and Erez Levon (). ‘Picking the right cherries? A comparison of corpus-based and qualitative analyses of news articles about masculinity.’ Discourse and Communication (): –. Baker, Robert G., and Philip T. Smith (). ‘A psycholinguistic study of English stress assignment rules.’ Language and Speech, : –. Barber, Charles (). The English Language: A Historical Introduction. Cambridge: Cambridge University Press. Bard, Ellen Gurman, Dan Robertson, and Antonella Sorace (). ‘Magnitude estimation of linguistic acceptability.’ Language : –. Bar-Hillel, Yehoshua (). ‘A quasi-arithmetical notation for syntactic description.’ Language (): –. Barker, Chris (). ‘The dynamics of vagueness.’ Linguistics and Philosophy (): –. Barker, Chris (). ‘Episodic -ee in English: A thematic role constraint on a new word formation.’ Language, : –.
Barker, Chris, and Pauline Jacobson (eds) (). Direct Compositionality. Oxford: Oxford University Press. Baron, Dennis (). Declining Grammar. Urbana, IL: National Council of Teachers of English. Barsky, Robert F. (). Chomsky: A Life of Dissent. Cambridge, MA: MIT Press. Bartels, Christine (). The Intonation of English Statements and Questions: A Compositional Interpretation. New York: Garland. Barth-Weingarten, Dagmar, and Elizabeth Couper-Kuhlen (). ‘On the development of final though: A case of grammaticalization?’, in Ilse Wischer and Gabriele Diewald (eds), New Reflections on Grammaticalization. Amsterdam/Philadelphia: John Benjamins, –. Bary, Corien, and Dag Haug (). ‘Temporal anaphora across and inside sentences: The function of participles.’ Semantics and Pragmatics (): –. Bastiaansen, Marcel, Ali Mazaheri, and Ole Jensen (). ‘Beyond ERPs: Oscillatory neuronal dynamics’, in Emily S. Kappenman and Steven J. Luck (eds), The Oxford Handbook of Event-Related Potential Components. Oxford: Oxford University Press, –. Bates, Elizabeth, and Brian MacWhinney (). ‘Competition, variation, and language learning’, in Brian MacWhinney (ed), Mechanisms of Language Acquisition. Hillsdale NJ: Erlbaum, –. Bauer, Laurie (). The Grammar of Nominal Compounding. Odense: Odense University Press. Bauer, Laurie (). English Word-Formation. Cambridge: Cambridge University Press. Bauer, Laurie (). ‘Be-heading the word.’ Journal of Linguistics, : –. Bauer, Laurie (). ‘Derivational paradigms’, in Booij, Geert, and van Marle, Jaap (eds), Yearbook of Morphology . Dordrecht: Kluwer Academic Publishers, –. Bauer, Laurie (). ‘Is there a class of neoclassical compounds and is it productive?’ Linguistics, : –. Bauer, Laurie (). ‘Word’, in Geert Booij, Christian Lehmann, and Joachim Mugdan (eds), Morphology: An International Handbook of Inflection and Word-Formation. Berlin/New York: De Gruyter, –. Bauer, Laurie (a). Morphological Productivity. Cambridge: Cambridge University Press. Bauer, Laurie (b). ‘Compounds’, in Martin Haspelmath, Ekkehard König, Wulf Oesterreicher, and Wolfgang Raible (eds), Language Universals and Language Typology. Berlin/ New York: De Gruyter, –. Bauer, Laurie (). Introducing Linguistic Morphology. nd edition. Edinburgh: Edinburgh University Press. Bauer, Laurie (a). ‘Classical morphemics: Assumptions, extensions and alternatives’, in Andrew Hippisley and Gregory Stump (eds), The Cambridge Handbook of Morphology. Cambridge: Cambridge University Press, –. Bauer, Laurie (b). ‘Re-evaluating exocentricity in word-formation’, in Daniel Siddiqi and Heidi Harley (eds), Morphological Metatheory. Amsterdam/Philadelphia: Benjamins, –. Bauer, Laurie, and Rodney Huddleston (). ‘Lexical word-formation’, in Rodney Huddleston and Geoffrey K. Pullum (eds), The Cambridge Grammar of the English Language. Cambridge: Cambridge University Press, –. Bauer, Laurie, and Antoinette Renouf (). ‘A corpus-based study of compounding in English.’ Journal of English Linguistics : –. Bauer, Laurie, Rochelle Lieber, and Ingo Plag (). The Oxford Reference Guide to English Morphology. Oxford: Oxford University Press.
Baumgärtner, Klaus (). ‘Konstituenz und Dependenz’, in Hugo Steger (ed), Vorschläge für eine Strukturale Grammatik des Deutschen. Darmstadt: Wissenschaftliche Buchgesellschaft, . Bazerman, Charles (). Shaping Written Knowledge: The Genre and Activity of the Experimental Article in Science. Madison: University of Wisconsin Press. Bazerman, Charles (). ‘The life of genre, the life in the classroom’, in Wendy Bishop and Hans Ostrom (eds), Genre and Writing: Issues, Arguments, Alternatives. Portsmouth, NH: Boynton/Cook-Heinemann, –. Beal, Joan (). ‘The grammar of Tyneside and Northumbrian English’, in James Milroy and Lesley Milroy (eds), Real English: The Grammar of English Dialects in the British Isles. London: Longman, –. Beal, Joan (). ‘English dialects in the North of England: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar W. Schneider, and Clive Upton (eds) (), –. Beal, Joan (). ‘ years of prescriptivism (and counting)’. In Ingrid Tieken-Boon van Ostade and Wim van der Wurff (eds) () Current Issues in Late Modern English. Berlin: Peter Lang, –. Beard, Robert (). ‘On the separation of derivation from affixation: Toward a lexeme/ morpheme based morphology’. Quaderni di semantica, : –. Beard, Robert (). Lexeme-Morpheme Base Morphology. Stony Brook, NY: SUNY Press. Beaugrande, Robert-Alain de, and Wolfgang Ulrich Dressler (). Introduction to Text Linguistics. London: Longman. Beaver, David I. (). ‘Presupposition’, in van Benthem and ter Meulen (), –. Beaver, David I., and Bart Geurts (). ‘Presupposition’, in Edward N. Zalta (ed), The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, CSLI, Stanford University, Winter edn. Bell, Melanie J. (). ‘The English noun-noun construct: A morphological and syntactic object’, in Angela Ralli, Geert Booij, Sergio Scalise, and Athanasios Karasimos (eds), Morphology and the Architecture of Grammar, –. https://geertbooij.files.wordpress. com///mmm_proceedings.pdf (last accessed April ). Bell, Melanie J., and Ingo Plag (). ‘Informativeness is a determinant of compound stress in English.’ Journal of Linguistics : –. Bemis, Douglas K., and Liina Pylkkänen (). ‘Simple composition: A magnetoencephalography investigation into the comprehension of minimal linguistic phrases.’ Journal of Neuroscience : –. Benczes, Réka (). Creative Compounding in English: The Semantics of Metaphorical and Metonymical Noun-Noun Combinations. Amsterdam/Philadephia: Benjamins. Bender, Emily M., and Ivan A. Sag (). ‘Incorporating contracted auxiliaries in English’. In Ronnie Cann, Claire Grover and Philip H. Miller (eds), Grammatical Interfaces in HPSG, number in Studies in Constraint-Based Lexicalism, –. Stanford: CSLI Publications. Ben-Shachar, Michal, Dafna Palti, and Yosef Grodzinsky (). ‘Neural correlates of syntactic movement: Converging evidence from two fMRI experiments.’ Neuroimage : –. Ben-Shachar, Michal, Talma Hendler, Itamar Kahn, Dafna Ben-Bashat, and Yosef Grodzinsky (). ‘The neural reality of syntactic transformations: Evidence from functional magnetic resonance imaging.’ Psychological Science : –. Berg, Thomas (). ‘The position of adjectives on the noun–verb continuum.’ English Language and Linguistics, , –.
Berko, Jean (). ‘The child’s learning of English morphology.’ Word : –. Bermúdez-Otero, Ricardo (). Stratal Optimality Theory. Oxford: Oxford University Press. Bermúdez-Otero, Ricardo, and Kersti Borjars (). ‘Markedness in phonology and in syntax: The problem of grounding.’ Lingua, : –. Bermúdez-Otero, Ricardo, and Patrick Honeybone (). ‘Phonology and syntax: A shifting relationship.’ Lingua, : –. Berwick, Robert C., Paul Pietroski, Beracah Yankama, and Noam Chomsky (). ‘Poverty of the stimulus revisited.’ Cognitive Science : –. Bianchi, Valentina (). ‘The raising analysis of relative clauses: A reply to Borsley.’ Linguistic Inquiry : –. Biber, Douglas (). Variation across Speech and Writing. Cambridge: Cambridge University Press. Biber, Douglas (). Dimensions of Register Variation: A Cross-Linguistic Study. Cambridge: Cambridge University Press. Biber, Douglas (). ‘Historical patterns for the grammatical marking of stance: A crossregister comparison.’ Journal of Historical Pragmatics (): –. Biber, Douglas (a). ‘Stance in spoken and written university registers.’ Journal of English for Academic Purposes, (): –. Biber, Douglas (b). University Language: A Corpus-Based Study of Spoken and Written Registers. Amsterdam: John Benjamins. Biber, Douglas (). ‘Register as a predictor of linguistic variation.’ Corpus Linguistics and Linguistic Theory : –. Biber, Douglas, and Susan Conrad (). Register, Genre, and Style. Cambridge: Cambridge University Press. Biber, Douglas, and Bethany Gray (). ‘Challenging stereotypes about academic writing: Complexity, elaboration, explicitness.’ Journal of English for Academic Purposes : –. Biber, Douglas, and Bethany Gray (). ‘Grammar emerging in the noun phrase: The influence of written language use.’ English Language and Linguistics : –. Biber, Douglas, and Bethany Gray (). Grammatical Complexity in Academic English: Linguistic Change in Writing. Cambridge: Cambridge University Press. Biber, Douglas, and Randy Reppen (eds) (). The Cambridge Handbook of English Corpus Linguistics. Cambridge: Cambridge University Press. Biber, Douglas, Stig Johansson, Geoffrey Leech, Susan Conrad, and Edward Finegan (). Longman Grammar of Spoken and Written English. Harlow: Longman. Bieswanger, Markus (). ‘Aviation English: Two distinct specialized registers?, in Christoph Schubert and Christina Sanchez-Stockhammer (eds), Variational Text Linguistics: Revisiting Register in English. (Topics in English Linguistics .) Berlin: Mouton De Gruyter, –. Billig, Michael (). ‘The language of critical discourse analysis: the case of nominalization.’ Discourse and Society, (): –. Binnick, Robert (). Time and the Verb: A Guide to Tense and Aspect. Oxford: Oxford University Press. Binnick, Robert (). ‘The markers of habitual aspect in English.’ Journal of English Linguistics : –. Binnick, Robert (). ‘Used to and habitual aspect in English.’ Style, : –. Binnick, Robert (ed) (). The Oxford Handbook of Tense and Aspect. Oxford: Oxford University Press.
Birner, Betty J. (). ‘Information status and word order: An analysis of English inversion.’ Language (): –. Birner, Betty J. (). ‘Form and function in English by-phrase passives.’ Chicago Linguistic Society : –. Birner, Betty J. (). ‘Inferential relations and noncanonical word order’, in Betty J. Birner and Gregory Ward (eds), Drawing the Boundaries of Meaning: Neo-Gricean Studies in Pragmatics and Semantics in Honor of Laurence R. Horn. Amsterdam/Philadelphia: John Benjamins, –. Birner, Betty J. (). Introduction to Pragmatics. Malden, MA.: Wiley-Blackwell. Birner, Betty J., and Gregory Ward (). Information Status and Noncanonical Word Order in English. Amsterdam/Philadelphia: John Benjamins. Birner, Betty J., and Gregory Ward (). ‘Information structure’, in Bas Aarts and April McMahon (eds), The Handbook of English Linguistics. Malden, MA: Blackwell, –. Blevins, James P. (). Word and Paradigm Morphology. Oxford: Oxford University Press. Blevins, James P., and Ivan A. Sag (). ‘Phrase structure grammar’, in Marcel den Dikken (ed), The Cambridge Handbook of Generative Syntax. Cambridge: Cambridge University Press, –. Bloch, B. (). ‘English verb inflection.’ Language, : –. Bloomfield, Leonard (). Language. New York: Holt. Blumenthal-Dramé, Alice (). Entrenchment in Usage-Based Theories. Berlin: De Gruyter. BNC = The British National Corpus. Distributed by Oxford University Computing Services on behalf of the BNC Consortium. http://www.natcorp.ox.ac.uk/ (last accessed June ). Also available through other interfaces, e.g. Davies (–). Boas, Hans C. (). A Constructional Approach to Resultatives. Stanford: CSLI Publications. Boas, Hans C. (). ‘Zum Abstraktionsgrad von Resultativkonstruktionen’, in Stefan Engelberg, Anke Holler, and Kristel Proost (eds), Sprachliches Wissen zwischen Lexikon und Grammatik. Berlin/New York: Mouton de Gruyter, –. Boas, Hans C. (). ‘Cognitive construction grammar’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Boas, Hans C. (). ‘Lexical and phrasal approaches to argument structure: Two sides of the same coin.’ Theoretical Linguistics (–): –. Bobaljik, Jonathan David (). Universals in Comparative Morphology: Suppletion, Superlatives, and the Structure of Words. Cambridge, MA: MIT Press. Bobaljik, Jonathan David (). ‘Distributed Morphology’, MS accessed at: http://bobaljik. uconn.edu/papers/DM_ORE.pdf on April, . Bodomo, Adams (). ‘Joan Bresnan’, in Philipp Strazny (ed), Encyclopedia of Linguistics. New York: Fitzroy Dearborn, –. Boeckx, Cedric (). Linguistic Minimalism. Oxford: Oxford University Press. Boeckx, Cedric, Norbert Hornstein, and Jairo Nunes (). Control as Movement. Cambridge: Cambridge University Press. Boehm, Barry (). ‘A Spiral Model of Software Development and Enhancement’. ACM SIGSOFT Software Engineering Notes :, –. Boersma, Paul, and. D. Weenink (–). Praat: Doing phonetics by computer (Version ..) [http://www.praat.org]. Bolinger, Dwight (). ‘Accent is predictable (if you’re a mind-reader).’ Language, : –. Bolinger, Dwight (). Meaning and Form. London: Longman.
Bolinger, Dwight (a). ‘Intonation across languages’, in Joseph H. Greenberg (ed), Universals of Human Language. Volume : Phonology. Stanford: Stanford University Press. Bolinger, Dwight (b). ‘A semantic view of syntax: Some verbs that govern infinitives’, in Mohammad Ali Jazayery, Edgar C. Polomé, and Werner Winter (eds), Linguistic and Literary Studies in Honor of Archibald A. Hill. The Hague: Mouton, –. Bonami, Olivier, and Gilles Boyé (). ‘Supplétion et classes flexionnelles dans la conjugaison du français.’ Langages, : –. Bonami, Olivier, and Gregory Stump (). ‘Paradigm function morphology’, in Andrew Hippisley and Gregory Stump (eds), The Cambridge Handbook of Morphology. Cambridge: Cambridge University Press, –. Booij, Geert (). The Grammar of Words. nd edition. Oxford: Oxford University Press. Booij, Geert (). Construction Morphology. Oxford: Oxford University Press. Booij, Geert (). ‘Morpheme structure constraints’, in Marc Van Oostendorp, Colin J. Ewen, Elizabeth Hume and Keren Rice (eds), The Blackwell Companion to Phonology. Oxford: Blackwell. Booij, Geert (). ‘Morphology in construction grammar’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. New York: Oxford University Press, –. Borer, Hagit (). ‘The projection of arguments’, in Elena Benedicto and Jeffrey Runner (eds), Functional Projections, vol. of University of Massachusetts Occasional Papers in Linguistics. Amherst, MA: GLSA Publications, –. Borer, Hagit (a). In Name Only, vol. of Structuring Sense. Oxford: Oxford University Press. Borer, Hagit (b). The Normal Course of Events, vol. of Structuring Sense. Oxford: Oxford University Press. Borer, Hagit (). Taking Form, vol. of Structuring Sense. Oxford: Oxford University Press. Borge, Steffen (). ‘Questions’, in Marina Sbisà and Ken Turner (eds), Pragmatics of Speech Actions. Berlin: Mouton de Gruyter, –. Borik, Olga (). Aspect and Reference Time. Oxford: Oxford University Press. Bornkessel, Ina, and Matthias Schlesewsky (). ‘The extended argument dependency model: A neurocognitive approach to sentence comprehension across languages.’ Psychological Review : –. Bornkessel, Ina, Stefan Zysset, Angela D. Friederici, D. Yves von Cramon, and Matthias Schlesewsky (). ‘Who did what to whom? The neural basis of argument hierarchies during language comprehension.’ Neuroimage : –. Bornkessel-Schlesewsky, Ina, and Matthias Schlesewsky (). ‘An alternative perspective on “semantic P” effects in language comprehension.’ Brain Research Reviews : –. Bornkessel-Schlesewsky, Ina, Matthias Schlesewsky, and D. Yves von Cramon (). ‘Word order and Broca’s region: Evidence for a supra-syntactic perspective.’ Brain and Language : –. Borsley, Robert D. (). ‘Relative clauses and the theory of phrase structure.’ Linguistic Inquiry : –. Borsley, Robert D. (). ‘Against ConjP.’ Lingua : –. Borsley, Robert D. (). ‘Constructions, functional heads and comparative correlatives’, in Olivier Bonami and Patricia Cabredo Hofherr (eds), Empirical Issues in Syntax and Semantics , –. Borsley, Robert D. (). ‘Don’t move!’ IBERIA : –.
Bošković, Željko (). ‘Last resort with move and agree in derivations and representations’, in Cedric Boeckx (ed), The Oxford Handbook of Linguistic Minimalism. Oxford: Oxford University Press, –. Bouma, Gosse, Robert Malouf, and Ivan A. Sag (). ‘Satisfying constraints on extraction and adjunction.’ Natural Language and Linguistic Theory : –. Bowerman, Sean (). ‘White South African English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Bowers, John (). ‘The syntax of predication.’ Linguistic Inquiry : –. Bowie, Jill, and Bas Aarts (). ‘Clause fragments in English dialogue’, in María José LópezCouso, Belén Méndez-Naya, Paloma Núñez-Pertejo, and Ignacio M. Palacios-Martínez (eds), Corpus Linguistics on the Move: Exploring and Understanding English Through Corpora (Language and Computers: Studies in Digital Linguistics, ). Leiden/Boston: Brill Rodopi, –. Bowie, Jill, and Sean Wallis (). ‘The to-infinitival perfect: A study of decline’, in Valentin Werner, Elena Seoane, and Cristina Suárez-Gómez (eds), Re-assessing the Present Perfect. Topics in English Linguistics (TiEL) . Berlin: De Gruyter, –. Bowie, Jill, Sean Wallis, and Bas Aarts (). ‘Contemporary change in modal usage in spoken British English: Mapping the impact of “genre”’, in Juana I. Marín-Arrese, Marta Carretero, Jorge Arús Hita, and Johan van der Auwera (eds), English Modality: Core, Periphery and Evidentiality. Berlin/New York: De Gruyter, –. Boyé, Gilles, and Gauvain Schalchli (). ‘The status of paradigms’, in Andrew Hippisley and Gregory Stump (eds), The Cambridge Handbook of Morphology. Cambridge: Cambridge University Press, –. Boye, Kasper, and Elisabeth Engberg-Pedersen (eds) (). Language Usage and Language Structure. Berlin: Mouton De Gruyter. Boye, Kasper, and Peter Harder (). ‘(Inter)subjectification in a functional theory of grammaticalization.’ Acta Linguistica Hafniensia, : –. Brazil, David (). A Grammar of Speech. Oxford: Oxford University Press. Breban, Tine, and Caroline Gentens (). ‘Multiple pathways: New views on pathways and mechanisms of grammaticalization in the English noun phrase.’ Functions of Language (): –. Breivik, Leiv Egil (). ‘On the interpretation of existential there.’ Language (): –. Brekle, Herbert E. (). Generative Satzsemantik und Transformationelle Syntax im System der Englischen Nominalkomposition. Munich: Fink. Brems, Lieselotte (). Layering of Size and Type Noun Constructions in English. Berlin: Mouton de Gruyter. Brems, Lieselotte, and Kristin Davidse (). ‘The grammaticalisation of nominal type noun constructions with kind/sort of : Chronology and paths of change.’ English Studies : –. Brennan, Jonathan, and Liina Pylkkänen (). ‘MEG evidence for incremental sentence composition in the anterior temporal lobe.’ Cognitive Science (S): –. Brennan, Jonathan, Yuval Nir, Uri Hasson, Rafael Malach, David J. Heeger, and Liina Pylkkänen (). ‘Syntactic structure building in the anterior temporal lobe during natural story listening.’ Brain and Language : –. Brentari, Diane (). ‘Sign language phonology’, in John A. Goldsmith, Jason Riggle, and Alan C. L. Yu (eds), The Handbook of Phonological Theory, nd edn. Oxford: Wiley-Blackwell, –.
Bresnan, Joan (). ‘Nonarguments for raising.’ Linguistic Inquiry : –. Bresnan, Joan (). ‘A realistic transformational grammar’, in Morris Halle, Joan Bresnan, and George A. Miller (eds), Linguistic Theory and Psychological Reality. Cambridge, MA: MIT Press, –. Bresnan, Joan (ed) (). The Mental Representation of Grammatical Relations. Cambridge, MA: MIT Press. Bresnan, Joan (). Lexical-Functional Syntax. Oxford: Blackwell. Bresnan, Joan (). ‘Is syntactic knowledge probabilistic? Experiments with the English dative alternation.’ Roots: Linguistics in Search of Its Evidential Base (Studies in Generative Grammar ). Berlin: Mouton de Gruyter, –. Bresnan, Joan, and J. Grimshaw (). ‘The syntax of free relatives in English.’ Linguistic Inquiry : –. Bresnan, Joan, and Jennifer Hay (). ‘Gradient grammar: An effect of animacy on the syntax of give in New Zealand and American English.’ Lingua (): –. Bresnan, Joan, and Ronald M. Kaplan (). ‘Introduction: Grammars as mental representations of language’, in Joan Bresnan (ed), The Mental Representation of Grammatical Relations. Cambridge, MA: MIT Press, xvii–lii. Bresnan, Joan W., Ash Asudeh, Ida Toivonen, and Stephen Wechsler (). Lexical-Functional Syntax. Second edition. Malden, MA: Wiley-Blackwell. Bresnan, Joan W., Anna Cueni, Tatiana Nikitina, and R. Harald Baayen (). ‘Predicting the dative alternation’, in Gerlof Bourne, Irene Kraemer, and Joost Zwarts (eds), Cognitive Foundations of Interpretation. Amsterdam: Royal Netherlands Academy of Science, –. Brezina, Vaclav, and Miriam Meyerhoff (). ‘Significant or random?: A critical review of sociolinguistic generalisations based on large corpora.’ International Journal of Corpus Linguistics (): –. Brinton, Laurel J. (). Grammaticalization and Discourse Functions. Berlin/New York: Mouton de Gruyter. Brisard, Frank (). ‘Introduction: The epistemic basis of deixis and reference’, in Frank Brisard (ed), Grounding: The Epistemic Footing of Deixis and Reference. Berlin/New York: Mouton de Gruyter, xi–xxxiv. Briscoe, Ted (). ‘Robust Parsing’, in Ronald Cole, Joseph Mariani, Hans Uszkoreit, Annie Zaenen, and Victor Zue (eds), Survey of the State of the Art in Human Language Technology. Cambridge: Cambridge University Press, –. Broccias, Cristiano (). ‘The syntax-lexicon continuum’, in Terttu Nevalainen and Elizabeth Closs Traugott (eds), The Oxford Handbook of the History of English. Oxford: Oxford University Press, –. Broccias, Cristiano, and Willem B. Hollmann (). ‘Do we need summary and sequential scanning in (Cognitive) grammar?’ Cognitive Linguistics : –. Bromberger, Sylvain, and Morris Halle (). ‘Why phonology is different.’ Linguistic Inquiry, : –. Brown, Dunstan, and Andrew Hippisley (). Network Morphology: A Defaults-based Theory of Word Structure. Cambridge: Cambridge University Press. Brown, Gillian, and George Yule (). Discourse Analysis. Cambridge: Cambridge University Press. Brown, Goold (). The Grammar of English Grammars. New York: Samuel S. and William Wood.
Brown, Roger (). ‘Linguistic determinism and the part of speech.’ The Journal of Abnormal and Social Psychology : . Bruening, Benjamin (). ‘The lexicalist hypothesis: Both wrong and superfluous.’ Language (): –. Bullokar, William (). William Bullokar’s Pamphlet for Grammar. London: Edmund Bollifant. Büring, Daniel (). The Meaning of Topic and Focus: The th Street Bridge Accent. London: Routledge. Büring, Daniel (). Binding Theory. Cambridge: Cambridge University Press. Büring, Daniel (). ‘(Contrastive) Topic’, in Caroline Féry and Shin Ishihara (eds), The Oxford Handbook of Information Structure. Oxford: Oxford University Press. Burridge, Kate (). ‘Synopsis: Morphological and syntactic variation in the Pacific and Australasia’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Burridge, Kate (). ‘Cos – a new discourse marker for Australian English?’ Australian Journal of Linguistics (): –. Burrows, John (). Computation into Criticism: A Study of Jane Austen’s Novels and an Experiment in Method. Oxford: Clarendon Press. Burton-Roberts, Noel (). ‘Nominal apposition.’ Foundations of Language : –. Burton-Roberts, Noel, and Geoffrey Poole (). ‘Syntax vs. phonology: A representational approach to stylistic fronting and verb-second in Icelandic.’ Lingua, : –. Butler, Charles (). English Grammar. Oxford: Turner. Butler, Christopher S. (). Structure and Function: A Guide to Three Major StructuralFunctional Theories. volumes. Amsterdam/Philadelphia: John Benjamins. Butler, Christopher S., and Francisco Gonzálvez-García (). Exploring FunctionalCognitive Space. Amsterdam/Philadelphia: John Benjamins. Bybee, Joan L. (). Morphology: A Study of the Relation between Meaning and Form. Amsterdam/Philadelphia: John Benjamins. Bybee, Joan L. (). ‘The diachronic dimension in explanation’, in John A. Hawkins (ed), Explaining Language Universals. Oxford: Blackwell, –. Bybee, Joan L. (). ‘A Functionalist approach to grammar and its evolution.’ Evolution of Communication (): –. Bybee, Joan L. (). Phonology and Language Use. Cambridge: Cambridge University Press. Bybee, Joan L. (). ‘Mechanisms of change in grammaticization: The role of frequency’, in Brian D. Joseph, and Richard D. Janda (eds), The Handbook of Historical Linguistics. Malden, MA: Blackwell. Bybee, Joan L. (). ‘From usage to grammar: The mind’s response to repetition.’ Language (): –. Bybee, Joan L. (). Frequency of Use and the Organization of Language. Oxford: Oxford University Press. Bybee, Joan L. (). Language, Usage and Cognition. Cambridge: Cambridge University Press. Bybee, Joan L. (). ‘Usage-based theory and exemplar representations of constructions’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford/New York: Oxford University Press, –. Bybee, Joan L., and Suzanne Fleischman (eds) (). Modality in Grammar and Discourse. Amsterdam/Philadelphia: John Benjamins.
Bybee, Joan L., and Paul J. Hopper (). ‘Introduction to frequency and the emergence of linguistic structure’, in Joan Bybee L. and Paul J. Hopper (eds), Frequency and the Emergence of Linguistic Structure. Amsterdam/Philadelphia: John Benjamins, –. Bybee, Joan L., and Paul J. Hopper (eds) (). Frequency and the Emergence of Linguistic Structure. Amsterdam/Philadelphia: John Benjamins. Bybee, Joan L., and William Pagliuca (). ‘Cross-linguistic comparison and the development of grammatical meaning’, in Jacek Fisiak (ed), Historical Semantics and Historical Word-Formation. Berlin: Mouton de Gruyter, –. Bybee, Joan L., and William Pagliuca (). ‘The evolution of future meaning’, in Anna Giacalone Ramat, Onofrio Carruba, and Giuliano Bernini (eds), Papers from the Seventh International Conference on Historical Linguistics. Amsterdam/Philadelphia: John Benjamins, –. Bybee, Joan L., Revere D. Perkins, and William Pagliuca (). ‘Back to the future’, in Elizabeth C. Traugott and Bernd Heine (eds), Approaches to Grammaticalization, Vol. . Amsterdam/Philadelphia: John Benjamins, –. Bybee, Joan L., Revere D. Perkins, and William Pagliuca (). The Evolution of Grammar: Tense, Aspect and Modality in the Languages of the World. Chicago: University of Chicago Press. Campbell, Lyle (). ‘Areal linguistics’, in Keith Brown (ed), Encyclopedia of Language and Linguistics, nd edn, Vol. I, –. Oxford: Elsevier. Cann, Ronnie, Ruth Kempson, and Lutz Marten (). The Dynamics of Language: An Introduction (Syntax and Semantics, ). Amsterdam: Elsevier. Caplan, David, Nathaniel Alpert, and Gloria Waters (). ‘PET studies of syntactic processing with auditory sentence presentation.’ Neuroimage : –. Cappelle, Bert, and Ilse Depraetere (). ‘Short-circuited interpretations of modal verb constructions: Some evidence from The Simpsons.’ Constructions and Frames, (): –. Cardinaletti, Anna (). ‘Toward a cartography of subject positions’, in Luigi Rizzi (ed), The Structure of CP and IP: The Cartography of Syntactic Structures Volume . Oxford: Oxford University Press, –. Carnap, Rudolf (). Meaning and Necessity: A Study in Semantics and Modal Logic. Chicago, IL: University of Chicago Press. Carnie, Andrew (). Syntax: A Generative Introduction, rd edn. Oxford: Wiley-Blackwell. Carpenter, Bob (). Type-Logical Semantics. Cambridge, MA: MIT Press. Carr, Philip, and Patrick Honeybone (). ‘English phonology and linguistic theory: An introduction to issues, and to “Issues in English Phonology”.’ Language Sciences, : –. Carstairs-McCarthy, Andrew (). An Introduction to English Morphology. Edinburgh: Edinburgh University Press. Carston, Robyn (). ‘Implicature, Explicature, and Truth-Theoretic Semantics’, in Ruth M. Kempson (ed), Mental Representations: The Interface between Language and Reality. Cambridge: Cambridge University Press, –. Carston, Robyn (). ‘Explicature and Semantics’, in UCL Working Papers in Linguistics. London: University College London, –. Carston, Robyn (). Thoughts and Utterances: The Pragmatics of Explicit Communication. Oxford: Blackwell. Carston, Robyn (). ‘Explicature and Semantics’, in Davis and Gillon (), –. Revision of Carston ().
Carter, Ron (). ‘The grammar of talk: Spoken English, grammar and the classroom’, in Qualifications and Currriculum Authority (ed), New Perspectives on English in the Classroom. London: Qualifications and Curriculum Authority, –. Carter, Ronald, and Michael McCarthy (). Cambridge Grammar of English: A Comprehensive Guide. Cambridge: Cambridge University Press. Carter, Ronald, and Walter Nash (). Seeing Through Language: A Guide to Styles of English Writing. Oxford: Blackwell. Cassidy, Kimberley Wright, and Michael H. Kelly (). ‘Phonological information for grammatical category assignments.’ Journal of Memory and Language, : –. Cassidy, Kimberley Wright, and Michael H. Kelly (). ‘Children’s use of phonology to infer grammatical class in vocabulary learning.’ Psychonomic Bulletin and Review : –. Cattell, Ray (). ‘Negative transportation and tag questions.’ Language : –. Cauthen, John V. (). Chasing the Wind. Exlibris (ebook), . Chafe, Wallace L. (). ‘Givenness, contrastiveness, definiteness, subjects, topics, and point of view’, in Charles N. Li (ed), Subject and Topic. New York: Academic Press, –. Chafe, Wallace L. (). ‘Cognitive constraints on information flow’, in Russell S. Tomlin (ed), Coherence and Grounding in Discourse. Amsterdam/Philadelphia: John Benjamins, –. Chafe, Wallace L. (). ‘Grammatical subjects in speaking and writing.’ Text : –. Chafe, Wallace L. (). Discourse, Consciousness, and Time: The Flow and Displacement of Conscious Experience in Speaking and Writing. Chicago: University of Chicago Press. Chafe, Wallace L. (). ‘Searching for meaning in language: A memoir.’ Historiographia Linguistica : –. Chapin, Paul (). Review of Integration of Transformational Theories of English Syntax. Language : –. Chapman, Siobhan (). Pragmatics. Basingstoke: Palgrave Macmillan. Chaves, Russell S. (). ‘Linearization-based word-part ellipsis.’ Linguistics and Philosophy : –. Chemla, Emmanuel (). ‘Universal implicatures and free choice effects: Experimental data.’ Semantics and Pragmatics (): –. Chemla, Emmanuel, and Benjamin Spector (). ‘Experimental evidence for embedded scalar implicatures.’ Journal of Semantics (): –. Chen, Danqi, and Christopher D. Manning (). ‘A fast and accurate dependency parser using neural networks’, in Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, Qatar: ACL, –. Available at: https://cs. stanford.edu/~danqi/papers/emnlp.pdf. Chen, Evan, W. Caroline West, Gloria Waters, and David Caplan (). ‘Determinants of BOLD signal correlates of processing object-extracted relative clauses.’ Cortex : –. Chen, Rong (). English Inversion: A Ground-Before-Figure Construction. Berlin: Mouton de Gruyter. Cheshire, Jenny, Viv Edwards, and Pamela Whittle (). ‘Non-standard English and dialect levelling’, in James Milroy and Lesley Milroy (eds), Real English: The Grammar of English Dialects in the British Isles. London/New York: Longman, –. Cheshire, Jenny, Paul Kerswill, Sue Fox, and Eivind Torgersen (). ‘Contact, the feature pool and the speech community: The emergence of Multicultural London English.’ Journal of Sociolinguistics, : –.
Chierchia, Gennaro (). ‘Scalar implicatures, polarity phenomena, and the syntax/ pragmatics interface’, in Adriana Belletti (ed), Structures and Beyond, vol. of The Cartography of Syntactic Structures. Oxford: Oxford University Press, –. Chierchia, Gennaro (). Logic in Grammar: Polarity, Free Choice, and Intervention. Oxford: Oxford University Press. Chierchia, Gennaro, and Sally McConnell-Ginet (). Meaning and Grammar: An Introduction to Semantics, nd edn. Cambridge, MA: MIT Press. Chierchia, Gennaro, Danny Fox, and Benjamin Spector (). ‘The grammatical view of scalar implicatures and the relationship between semantics and pragmatics’, in Claudia Maienborn, Klaus von Heusinger, and Paul Portner (eds) (), –. Chilton, Paul (). Analysing Political Discourse: Theory and Practice. London: Routledge. Chilton, Paul (). ‘Missing links in mainstream CDA: Modules, blends and the critical instinct’, in Ruth Wodak and Paul Chilton (eds), A New Agenda in (Critical) Discourse Analysis: Theory, Methodology and Interdisciplinarity. Amsterdam/Philadelphia: John Benjamins, –. Chomsky, Noam (/). The Logical Structure of Linguistic Theory. Chicago: University of Chicago Press. Chomsky, Noam (). Syntactic Structures. The Hague: Mouton. Chomsky, Noam (a). Current Issues in Linguistic Theory. The Hague: Mouton. Chomsky, Noam (b). ‘Formal discussion’, in Ursula Bellugi and Roger Brown (eds), The Acquisition of Language. Yellow Springs, OH: Antioch Press, –. Chomsky, Noam (). Aspects of the Theory of Syntax. Cambridge, MA: MIT Press. Chomsky, Noam (). ‘Remarks on nominalization’, in Roderick A. Jacobs and Peter S. Rosenbaum (eds), Readings in English Transformational Grammar. Waltham, MA: Ginn, –. Chomsky, Noam (). ‘Deep structure, surface structure, and semantic interpretation’, in Danny Steinberg and Leon Jakobovits (eds), Semantics: An Interdisciplinary Reader in Philosophy, Linguistics, and Psychology. Cambridge: Cambridge University Press, –. Chomsky, Noam (). ‘Conditions on transformations’, in Stephen R. Anderson and Paul Kiparsky (eds), A Festschrift for Morris Halle, New York: Holt, Rinehart and Winston, –. Chomsky, Noam (). The Amherst lectures. Unpublished lecture notes distributed by Documents Linguistiques. Paris: University of Paris VII. Chomsky, Noam (). ‘On wh-movement’, in Peter Culicover, Thomas Wasow, and Adrian Akmajian (eds), Formal Syntax. New York: Academic Press, –. Chomsky, Noam (). Lectures on Government and Binding. Dordrecht: Foris Publications. Chomsky, Noam (). The Generative Enterprise: A Discussion with Riny Huybregts and Henk van Riemsdijk. Dordrecht: Foris Publications. Chomsky, Noam (a). Barriers. Cambridge, MA: MIT Press. Chomsky, Noam (b). Knowledge of Language: Its Nature, Origin, and Use. New York: Praeger. Chomsky, Noam (). ‘A minimalist program for linguistic theory’, in Kenneth Hale and Samuel Keyser (eds), The View from Building : Essays in Linguistics in Honor of Sylvain Bromberger. Cambridge, MA: MIT Press, –. Chomsky, Noam (). The Minimalist Program. Cambridge, MA: MIT Press. Chomsky, Noam (). On Nature and Language. Cambridge: Cambridge University Press. Chomsky, Noam (). ‘Beyond explanatory adequacy’, in Adriana Belletti (ed), Structures and Beyond: The Cartography of Syntactic Structures Volume . Oxford: Oxford University Press, –.
Chomsky, Noam (). ‘Problems of projection.’ Lingua : –. Chomsky, Noam (). ‘Problems of projection: Extensions’, in Elisa Di Domenico, Cornelia Hamann, and Simona Matteini (eds), Structures, Strategies and Beyond: Studies in Honour of Adriana Belletti. Amsterdam/Philadelphia: John Benjamins, –. Chomsky, Noam, and Morris Halle (). The Sound Pattern of English. New York: Harper and Row. Christian, Donna (). ‘The personal dative in Appalachian speech’, in Peter Trudgill and Jack K. Chambers (eds), Dialects of English: Studies in Grammatical Variation. London: Longman, –. Chung, Sandra (). ‘Syntactic identity in sluicing: How much and why.’ Linguistic Inquiry : –. Church, Kenneth (). ‘Empirical estimates of adaptation: The chance of two Noriega’s is closer to p/ than p2.’ Proceedings of COLING , –. Cinque, Guglielmo (). ‘A null theory of phrase and compound stress.’ Linguistic Inquiry, : –. Cinque, Guglielmo (). ‘Evidence for partial N-movement in the Romance DP’, in Guglielmo Cinque, Jan Koster, Jean-Yves Pollock, Luigi Rizzi, and Raffaella Zanuttini (eds), Paths Towards Universal Grammar: Studies in Honor of Richard Kayne. Washington, DC: Georgetown University Press, –. Cinque, Gugliemo, and Luigi Rizzi (). ‘The cartography of syntactic structures.’ Studies in Linguistics, University of Siena CISCL Working Papers, : –. Clark, Herbert H., and Susan E. Haviland (). ‘Comprehension and the given-new contract’, in Roy O. Freedle (ed), Discourse Production and Comprehension. Norwood, NJ: Ablex, –. Clarke, Sandra (). ‘Newfoundland English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Close, Joanne, and Bas Aarts (). ‘Current change in the modal system of English. A case study of must, have to and have got to’, in Ursula Lenker, Judith Huber, and Robert Mailhammer (eds), The History of English Verbal and Nominal Constructions. Volume of English Historical Linguistics : Selected papers from the fifteenth International Conference on English Historical Linguistics (ICEHL ), Munich, – August . Amsterdam/Philadelphia: John Benjamins, –. Coates, Jennifer (). The Semantics of Modal Auxiliaries. London: Croon Helm. Coates, Jennifer (). ‘The expression of root and epistemic possibility in English’, in Bybee and Fleischman (eds) (), –. COCA = Mark Davies (–) The Corpus of Contemporary American English (COCA): million words, –present. Available online at https://www.english-corpora.org/coca/ (last accessed May ). Coene, Martina, and Yves D’hulst (). ‘Theoretical background’, in Martina Coene and Yves D’hulst (eds), From NP to DP. Vol. : The syntax and semantics of noun phrases. Amsterdam/Philadelphia: John Benjamins, –. Coffin, Caroline, Theresa Lillis and Kieran O’Halloran (). Applied Linguistics Methods: A Reader: Systemic Functional Linguistics, Critical Discourse Analysis and Ethnography. London/Milton Keynes: Routledge and The Open University. Cohen, Mike X. (). Analyzing Neural Time Series Data. Cambridge, MA: MIT Press. Collins, Chris (). ‘A smuggling approach to the passive in English.’ Syntax : –. Collins, Peter (). Cleft and Pseudo-Cleft Constructions in English. London/New York: Routledge.
Collins, Peter (). ‘Extraposition in English.’ Functions of Language (): –. Collins, Peter (). ‘The indirect object construction in English: An informational approach.’ Linguistics (): –. Collins, Peter (). Modals and Quasi-modals in English. Amsterdam: Rodopi. Collins, Peter, and Pam Peters (). ‘Australian English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds), (), –. Collins COBUILD English Grammar (). Editor-in-chief John Sinclair. London/Glasgow: Collins. Collins COBUILD English Language Dictionary (). Editor-in-chief John Sinclair. London/ Glasgow: Collins. Comrie, Bernard (). Aspect. Cambridge: Cambridge University Press. Comrie, Bernard (). Tense. Cambridge: Cambridge University Press. Comrie, Bernard (). ‘On identifying future tenses’, in Werner Abraham and Theo Janssen (eds), Tempus-Aspekt-Modus. Tübingen: Max Niemeyer Verlag, –. Condoravdi, Cleo, and Stefan Kaufmann (eds) (). Special issue on Modality and Temporality. Journal of Semantics . . Conrad, Joseph (). Heart of Darkness. New York: Random House. Cook, Guy (). Discourse and Literature: The Interplay of Form and Mind. Oxford: Oxford University Press. Cook, Walter A. (). Introduction to Tagmemic Analysis. New York: Holt Rinehart and Winston. Copestake, Ann, Dan Flickinger, Carl Pollard, and Ivan A. Sag (). ‘Minimal recursion semantics: An introduction.’ Research on Language and Computation (): –. Corbett, Greville G. (). Number. Cambridge: Cambridge University Press. Corbett, Greville G. (). Agreement. Cambridge: Cambridge University Press. Corbett, Greville G. (). ‘Canonical derivational morphology.’ Word Structure, : –. Corver, Norbert (). ‘Predicate movement in pseudopartitives constructions’, in Artemis Alexiadou and Chris Wilder (eds), Possessors, Predicates and Movement in the Determiner Phrase. Amsterdam/Philadelphia: John Benjamins, –. Coseriu, Eugene (). ‘Inhaltliche Wortbildungslehre (am Beispiel des Typs Coupe-papier)’, in Herbert E. Brekle and Dieter Kastovsky (eds), Perspektiven der Wortbildungsforschung. Bonn: Grundmann, –. Coulson, Seana, and Todd Oakley (). ‘Blending and coded meaning: Literal and figurative meaning in cognitive semantics.’ Journal of Pragmatics : –. Coulson, Seana, Jonathan W. King, and Marta Kutas (). ‘Expect the unexpected: Event-related brain response to morphosyntactic violations.’ Language and Cognitive Processes : –. Couper-Kuhlen, Elizabeth (). ‘Grammaticalization and conversation’, in Narrog and Heine (), –. Couper-Kuhlen, Elizabeth, and Tsuyoshi Ono (). ‘Incrementing in conversation: A comparison of practices in English, German and Japanese.’ Pragmatics (): –. Couper-Kuhlen, Elizabeth, and Margret Selting (). Interactional Linguistics: Studying Language in Social Interaction. Cambridge: Cambridge University Press. Cowart, Wayne (). Experimental Syntax: Applying Objective Methods to Sentence Judgments. Thousand Oaks, CA: SAGE. Craenenbroeck, Jeroen van, and Jason Merchant (). ‘Ellipsis phenomena’, in Marcel den Dikken (ed), The Cambridge Handbook of Generative Syntax. Cambridge: Cambridge University Press, –.
Craenenbroeck, Jeroen van, and Tanja Temmerman (eds) (). The Oxford Handbook of Ellipsis. Oxford: Oxford University Press. Crain, Stephen, and Janet Dean Fodor (). ‘How can grammars help parsers?’, in David R. Dowty, Lauri Karttunen, and Arnold M. Zwicky (eds), Natural Language Parsing: Psycholinguistic, Computational, and Theoretical Perspectives. Cambridge: Cambridge University Press, –. Cristofaro, Sonia (). Subordination. Oxford: Oxford University Press. Crnič, Luka, Emmanuel Chemla, and Danny Fox (). ‘Scalar implicatures of embedded disjunction.’ Natural Language Semantics (): –. Croft, William (). ‘A conceptual framework for grammatical categories (or: a taxonomy of propositional acts).’ Journal of Semantics : –. Croft, William (). Syntactic Categories and Grammatical Relations: The Cognitive Organization of Information. Chicago: University of Chicago Press. Croft, William (). ‘Parts of speech as typological universals and as language particular categories’, in Petra Maria Vogel and Bernard Comrie (eds), Approaches to the Typology of Word Classes. Berlin: Mouton de Gruyter, –. Croft, William (). Radical Construction Grammar: Syntactic Theory in Typological Perspective. Oxford: Oxford University Press. Croft, William (). Typology and Universals, nd edn. Cambridge: Cambridge University Press. Croft, William (). ‘Lexical rules vs. constructions: A false dichotomy’, in Hubert Cuyckens, Thomas Berg, René Dirven, and Klaus-Uwe Panther (eds), Motivation in Language: Studies in Honor of Günter Radden. Amsterdam/Philadelphia: John Benjamins, –. Croft, William (). Typology and Universals. Cambridge: Cambridge University Press. Croft, William (a). ‘Construction grammar’, in Hubert Cuyckens and Dirk Geeraerts (eds), The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, –. Croft, William (b). ‘The origins of grammar in the verbalization of experience.’ Cognitive Linguistics : –. Croft, William (). Typology and Universals, nd edn. Cambridge: Cambridge University Press. Croft, William (). Morphosyntax. MS. Albuquerque: University of New Mexico. Croft, William (). ‘Grammar: Functional approaches’, in James D. Wright (ed), International Encyclopedia of the Social and Behavioral Sciences, nd edn, vol. . Oxford: Elsevier, –. Croft, William, and D. Alan Cruse (). Cognitive Linguistics. Cambridge: Cambridge University Press. Cruse, D. A. (). Lexical Semantics. Cambridge: Cambridge University Press. Cruttenden, Alan (). Language in Infancy and Childhood. Manchester: Manchester University Press. Cruttenden, Alan (). Intonation. Cambridge: Cambridge University Press. Cruttenden, Alan (). ‘The de-accenting of given information: a cognitive universal?’, in Bernini, Giuliano and Schwartz, Marcia L. (eds), Pragmatic Organization of Discourse in the Languages of Europe. Berlin: Mouton de Gruyter, –. Crystal, David (). The Cambridge Encyclopedia of the English Language. Cambridge: Cambridge University Press. Crystal, David (). A Dictionary of Linguistics and Phonetics, th edn. Oxford: Blackwell. Crystal, David (). Language and the Internet. Cambridge: Cambridge University Press.
Crystal, David (a). Think on My Words: Exploring Shakespeare’s Language. Cambridge: Cambridge University Press. Crystal, David (b). A Dictionary of Linguistics and Phonetics, th edn. Oxford: Blackwell. Crystal, David (). Just a Phrase I’m Going Through. London: Routledge. Crystal, David, and Derek Davy (). Investigating English Style. London: Longman. Culicover, Peter W. (). Syntactic Nuts: Hard Cases, Syntactic Theory and Language Acquisition. Oxford: Oxford University Press. Culicover, Peter W. (). Natural Language Syntax. Oxford: Oxford University Press. Culicover, Peter W. and Ray S. Jackendoff (). ‘The view from the periphery: The English comparative correlative.’ Linguistic Inquiry : –. Culicover, Peter W. and Ray Jackendoff (). Simpler Syntax. Oxford: Oxford University Press. Culicover, Peter W., and Michael S. Rochemont (). ‘Stress and focus in English.’ Language : –. Culpeper, Jonathan (). ‘Keyness: Words, parts-of-speech and semantic categories in the character-talk of Shakespeare’s Romeo and Juliet.’ International Journal of Corpus Linguistics (): –. Culpeper, Jonathan, and Merja Kytö (). Early Modern English Dialogues. Spoken Interaction as Writing. Cambridge: Cambridge University Press. Cumming, Susanna, Tsuyoshi Ono, and Ritva Lauri (). ‘Discourse, grammar and interaction’, in Van Dijk, Teun A. (ed), Discourse Studies: A Multidisciplinary Introduction, vol. , nd edn. London: SAGE, –. Curme, George O. (). English Grammar: The Principles and Practice of English Grammar Applied to Present-Day Usage. College Outline Series. New York: Barnes and Noble. Published online by HathiTrust Digital Library: http://hdl.handle.net//mdp.. Curzan, Anne (). Fixing English: Prescriptivism and Language History. Cambridge: Cambridge University Press. Cutler, Anne, and Dennis Norris (). ‘The role of strong syllables in segmentation for lexical access’. Journal of Experimental Psychology: Human Perception and Performance, : –. Dąbrowska, Ewa (). ‘Questions with long-distance dependencies: A usage-based perspective.’ Cognitive Linguistics (): –. Dąbrowska, Ewa, and Dagmar Divjak (eds) (). The Handbook of Cognitive Linguistics. Berlin/New York: Mouton de Gruyter. Dahl, Östen (). ‘On generics’, in Edward L. Keenan (ed), Formal Semantics of Natural Language. Cambridge: Cambridge University Press, –. Dahl, Östen (). ‘Some arguments for higher nodes in syntax: A reply to Hudson’s “Constituency and dependency”.’ Linguistics : –. Dahl, Östen (). Tense and Aspect Systems. Oxford: Blackwell. Dahl, Östen (ed) (). Tense and Aspect in the Languages of Europe. (Empirical approaches to language typology –.) Berlin/New York: Mouton de Gruyter. Dahl, Östen, and Viveka Velupillai (). ‘The perfect’, in Matthew Dryer and Martin Haspelmath (eds), The World Atlas of Language Structures Online. Munich: Max Planck Digital Library. Available online at http://wals.info/chapter/. Dalrymple, Mary (ed) (). Semantics and Syntax in Lexical Functional Grammar: The Resource Logic Approach. Cambridge, MA: MIT Press. Dalrymple, Mary (). Lexical Functional Grammar. San Diego, CA: Academic Press.
Dalrymple, Mary (). ‘Morphology in the LFG architecture’, in Miriam Butt and Tracy Holloway King (eds), Proceedings of the LFG Conference. Stanford: CSLI Publications (http://csli-publications.stanford.edu/). Dalrymple, Mary, Ronald M. Kaplan, and Tracy Holloway King (). ‘Linguistic generalizations over descriptions’, in Miriam Butt and Tracy Holloway King (eds), Proceedings of the LFG Conference. Stanford, CA: CSLI Publications, –. Dalrymple, Mary, John Lamping, and Vijay Saraswat (). ‘LFG Semantics via Constraints’, in Proceedings of the Sixth Meeting of the European ACL, –. European Chapter of the Association for Computational Linguistics, University of Utrecht. Dalrymple, Mary, Ronald M. Kaplan, John T. Maxwell III, and Annie Zaenen (eds) (). Formal Issues in Lexical-Functional Grammar. Stanford, CA: CSLI Publications. Daneš, Frantisek (). ‘A three-level approach to syntax’, Travaux linguistiques de Prague : –. Davidse, Kristin (). ‘A constructional approach to clefts.’ Linguistics (): –. Davidse, Kristin, Lieven Vandelanotte, and Hubert Cuyckens (eds) (). Subjectification, Intersubjectification and Grammaticalization. Berlin/New York: Mouton de Gruyter. Davidsen-Nielsen, Niels (). Tense and Mood in English: A Comparison with Danish. Berlin/New York: Mouton de Gruyter. Davidson, Jonathan (). Early Train. Sheffield, Yorks.: smith|doorstop. Davies, Eirlys (). The English Imperative. Beckenham: Croom Helm. Davies, Mark (–). British National Corpus (from Oxford University Press). Available online at https://www.english-corpora.org/bnc/ (last accessed May ). Davies, Mark (–). The Corpus of Contemporary American English (COCA): Million Words, –Present. Available online at https://www.english-corpora.org/coca/ (last accessed May ). Davies, Mark (–). Corpus of Online Registers of English (CORE). Available online at https://www.english-corpora.org/core/ (last accessed May ). Davis, S. M., and M. H. Kelly (). ‘Knowledge of the English noun–verb stress difference by native and nonnative speakers.’ Journal of Memory and Language, : –. Davis, Steven (ed) (). Pragmatics: A Reader. Oxford: Oxford University Press. Davis, Steven, and Brendan S. Gillon (eds) (). Semantics: A Reader. Oxford: Oxford University Press. Davydova, Julia (). ‘Detecting historical continuity in a linguistically diverse urban area: The present perfect in modern Singapore English’, in Joana Duarte and Ingrid Gogolin (eds), Linguistic Superdiversity in Urban Areas. Amsterdam/Philadelphia: John Benjamins. Davydova, Julia, Michaela Hilbert, Lukas Pietsch, and Peter Siemund (). ‘Comparing varieties of English: Problems and perspectives’, in Peter Siemund (ed), Linguistic Universals and Language Variation. Berlin/New York: Mouton de Gruyter, –. Déchaine, Rose-Marie (). Predicates across Categories. Ph.D. thesis. Amherst, MA: University of Massachusetts. Déchaine, Rose-Marie, and Martina Wiltschko (). ‘On pro-nouns and other “pronouns”’, in Martine Coene and Yves D’hulst (eds), From NP to DP, Vol : The Syntax and Semantics of Noun Phrases. Amsterdam/Philadelphia: John Benjamins, –. Declerck, Renaat (). Studies on Copular Sentences, Clefts and Pseudo-clefts. Leuven: Leuven University Press. Declerck, Renaat (). Tense in English. Its Structure and Use in Discourse. London: Routledge.
Declerck, Renaat, and Susan Reed (). Conditionals: A Comprehensive Empirical Analysis. (Topics in English Linguistics .) Berlin/New York: Mouton de Gruyter. Declerck, Renaat, in collaboration with Susan Reed, and Bert Cappelle (). The Grammar of the English Verb Phrase. Volume . The Grammar of the English Tense System. (Topics in English Linguistics, –.) Berlin/New York: Mouton de Gruyter. Degand, Liesbeth, and Jacqueline Evers-Vermeul (). ‘Grammaticalization or pragmaticalization of discourse markers? More than a terminological issue.’ Journal of Historical Pragmatics (): –. Degand, Liesbeth, and Anne-Marie Simon-Vandenbergen (eds) (). ‘Grammaticalization, pragmaticalization, and (inter)subjectification: Methodological issues in the study of discourse markers.’ Special issue of Linguistics, : . Dehé, Nicole (). Parentheticals in Spoken English: The Syntax–Prosody Relation (Studies in English Language.) Cambridge: Cambridge University Press. Dehé, Nicole, and Yordanka Kavalova (eds) (). Parentheticals. Amsterdam/Philadelphia: John Benjamins, –. Delahunty, Gerald P. (). Topics in the Syntax and Semantics of English Cleft-Sentences. Bloomington, IN: Indiana University Linguistics Club. Delais-Roussarie, Elisabeth, Brechtje Post, Mathieu Avanzi, Carolin Buthke, Albert Di Cristo, Ingo Feldhausen, Sun-Ah Jun, Philippe Martin, T. Meisenburg, Annie Rialland Raféu Sichel-Bazin and Hi-Yon Yoo (). ‘Intonational Phonology of French: Developing a ToBI system for French’, in Pilar Prieto and Sonia Frota (eds), Intonation in Romance. Oxford: Oxford University Press. de Marneffe, Marie-Catherine, and Christopher D. Manning (). ‘The Stanford typed dependencies representation’, in Proceedings of the Workshop on Cross-Framework and Cross-Domain Parser Evaluation, –. http://nlp.stanford.edu/~manning/papers/ dependencies-coling.pdf. Den Dikken, Marcel (). Relators and Linkers: The Syntax of Predication, Predicate Inversion, and Copulas. Cambridge, MA: MIT Press. Denison, David (). ‘Auxiliary + impersonal in Old English.’ Folia Linguistica Historica : –. Denison, David (). English Historical Syntax: Verbal Constructions. London: Longman. Denison, David, and Alison Cort (). ‘Better as a verb’, in Kristin Davidse, Lieven Vandelanotte, and Hubert Cuyckens (eds), Subjectification, Intersubjectification and Grammaticalization. Berlin/New York: Mouton de Gruyter, –. Depraetere, Ilse (). ‘On the necessity of distinguishing (un)boundedness and (a)telicity.’ Linguistics and Philosophy : –. Depraetere, Ilse, and Chad Langford (). Advanced English Grammar: A Linguistic Approach. London/New York: Bloomsbury Publishing. Depraetere, Ilse, and Susan Reed (). ‘Towards a more explicit taxonomy of root possibility in English.’ English Language and Linguistics (): –. Depraetere Ilse, and An Verhulst (). ‘Source of modality: A reassessment.’ English Language and Linguistics (): –. De Smet, Hendrik (). ‘Constrained confusion: The gerund/participle distinction in Late Modern English’, in Marianne Hundt (ed) (), Late Modern English Syntax. Cambridge: Cambridge University Press, –.
Diessel, Holger (). ‘Construction grammar and first language acquisition’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Dik, Simon C. (). The Theory of Functional Grammar: The Structure of the Clause. Dordrecht: Foris Publications. Dik, Simon C. (). ‘Functional grammar’, in Flip G. Droste, and John E. Joseph (eds), Linguistic Theory and Grammatical Description. Amsterdam/Philadelphia: John Benjamins, –. Dik, Simon C. (). The Theory of Functional Grammar. volumes. Berlin/New York: Mouton de Gruyter. Dikker, Suzanne, Hugh Rabagliati, and Liina Pylkkänen (). ‘Sensitivity to syntax in visual cortex.’ Cognition : –. Dineen, David A. (ed). Presupposition, vol. of Syntax and Semantics. New York: Academic Press, –. Di Sciullo, A.-M., and Williams E. (). On the Definition of Word. Cambridge, MA: MIT Press. Divjak, Dagmar (). ‘Frequency and entrenchment’, in Ewa Dąbrowska and Dagmar Divjak (eds), The Handbook of Cognitive Linguistics. Berlin/New York: Mouton de Gruyter, –. Dixon, R. M. W., and Alexandra Y. Aikhenvald (). ‘Word: a typological framework’, in Dixon R. M. W. and Alexandra Y. Aikhenvald (eds), Word: A Cross-Linguistic Typology. Cambridge: Cambridge University Press, –. Dixon, R. M. W., and Alexandra Y. Aikhenvald (eds) (). Complementation: A CrossLinguistic Typology. Oxford: Oxford University Press. Don, Jan (). Morphological Theory and the Morphology of English. Edinburgh: Edinburgh University Press. Don, Jan, and Marian Erkelens (). ‘Possible phonological cues in categorial acquisition: Evidence from adult categorization.’ Studies in Language, : –. Dons, Ute (). Descriptive Adequacy of Early Modern English Grammars. Berlin/New York: Mouton de Gruyter. Dorgeloh, Heidrun (). Inversion in Modern English: Form and Function. Amsterdam/ Philadelphia: John Benjamins. Dorgeloh, Heidrun (). ‘Conjunction in sentence and discourse: Sentence initial and and Discourse Structure.’ Journal of Pragmatics : –. Dorgeloh, Heidrun (). ‘Inversion in descriptive and narrative discourse: A text-typological account following functional principles.’ Cahiers de Recherche : –. Dorgeloh, Heidrun (). ‘“If it didn’t work the first time, we can try it again: Conditionals as a grounding device in a genre of illness discourse.’ Communication and Medicine : –. Dorgeloh, Heidrun (). ‘The interrelationship of register and genre in medical discourse’, in Christoph Schubert and Christina Sanchez-Stockhammer (eds), Variational Text Linguistics: Revisiting Register in English. (Topics in English Linguistics .) Berlin/New York: Mouton de Gruyter, –. Dorgeloh, Heidrun, and Anja Wanner (). ‘Too abstract for agents? The syntax and semantics of agentivity in abstracts of English research articles’, in Holden Härtl and Heike Tappe (eds), Mediating between Concepts and Language-Processing Structures. (Trends in Linguistics Studies and Monographs .) Berlin/New York: Mouton de Gruyter, –.
Dorgeloh, Heidrun, and Anja Wanner (). ‘Formulaic argumentation in scientific discourse’, in Roberta Corrigan, Edith A. Moravcsik, Hamid Ouli, and Kathleen M. Wheatley (eds), Formulaic Language. Volume : Acquisition, Loss, Psychological Reality, and Functional Explanations. (Typological Studies in Language .) Amsterdam/Philadelphia: John Benjamins, –. Dorgeloh, Heidrun, and Anja Wanner (eds) (). Syntactic Variation and Genre. (Topics in English Linguistics .) Berlin/New York: Mouton de Gruyter. Douthwaite, John (). Towards a Linguistic Theory of Foregrounding. Alessandria: Edizioni dell’Orso. Downing, Angela (). ‘An alternative approach to theme: A systemic-functional perspective.’ Word (): –. Downing, Angela (). English Grammar: A University Course, rd edn. London/New York: Routledge. Dowty, David R. (). Word Meaning and Montague Grammar: The Semantics of Verbs and Times in Generative Semantics and in Montague’s PTQ. Dordrecht: D. Reidel. Dowty, David R. (). ‘The effects of aspectual class on the temporal structure of discourse: Semantics or pragmatics?’ Linguistics and Philosophy : –. Dowty, David R. (). ‘Thematic proto-roles and argument structure.’ Language (): –. Dowty, David R. (). ‘The dual analysis of adjuncts and complements in categorial grammar’, in Ewald Lang, Claudia Maienborn, and Cathrine Fabricius-Hansen (eds), Modifying Adjuncts. Berlin/New York: Mouton de Gruyter, –. Dowty, David R., Robert E. Wall, and Stanley Peters (). Introduction to Montague Semantics. Dordrecht: Kluwer. Dreschler, Gretha Anthonia (). Passives and the Loss of Verb Second: A Study of Syntactic and Information-Structural Factors. Ph.D. thesis. Nijmegen: Radboud University. Dronkers, Nina F., David P. Wilkins, Robert D. Van Valin Jr., Brenda B. Redfern, and Jeri J. Jaeger (). ‘Lesion analysis of the brain areas involved in language comprehension.’ Cognition : –. Drubig, Hans Bernhard (). ‘On the discourse function of subject verb inversion.’ Studies in Descriptive Linguistics : –. Dryer, Matthew S. (). ‘Focus, pragmatic presupposition, and activated propositions.’ Journal of Pragmatics : –. Dryer, Matthew S. (). ‘Are grammatical relations universal?’, in Joan L. Bybee, John Haiman, and Sandra A. Thompson (eds), Essays on Language Function and Language Type. Amsterdam/Philadelphia: John Benjamins, –. Dryer, Matthew S. (). ‘Order of subject, object, and verb’, in Martin Haspelmath, Matthew S. Dryer, David Gil, and Bernard Comrie (eds), The World Atlas of Language Structures. Oxford: Oxford University Press, –. Du Bois, John W. (). ‘The discourse basis of ergativity.’ Language (): –. Du Bois, John W. (). ‘Discourse and grammar’, in Michael Tomasello (ed), The New Psychology of Language: Cognitive and Functional Approaches to Language Structure, vol. . Mahwah, NJ: Lawrence Erlbaum Associates, –. Du Bois, John W. (). ‘Towards a dialogic syntax.’ Cognitive Linguistics : –. Duffley, Patrick J. (). The English Infinitive. London: Longman.
Duffley, Patrick J. (). The English Gerund-participle: A Comparison with the Infinitive. Frankfurt: Peter Lang. Duffley, Patrick J. (). Reclaiming Control as a Semantic and Pragmatic Phenomenon. Amsterdam/Philadelphia: John Benjamins. Dunmore, Helen (). The Malarkey. Newcastle: Bloodaxe Durieux, Gert, and Steven Gillis (). ‘Predicting grammatical classes from phonological cues: An empirical test’, in J. Weissenborn and B. Höhle (eds), Approaches to Bootstrapping: Phonological, Lexical, Syntactic, and Neurophysiological Aspects of Early Language Acquisition. Amsterdam/Philadelphia: John Benjamins. Dürscheid, Christa (). Medienkommunikation im Kontinuum von Mündlichkeit und Schriftlichkeit: Theoretische und Empirische Probleme. Frankfurt: Peter Lang. Edelman, Shimon, and Morton H. Christiansen (). ‘How seriously should we take Minimalist syntax?’ Trends in Cognitive Sciences : –. Edwards, Viv (). ‘The grammar of southern British English’, in James Milroy and Lesley Milroy (eds), Real English: The Grammar of English Dialects in the British Isles. London: Longman, –. Egan, Thomas (). Non-finite Complementation: A Usage-based Study of Infinitive and -ing Clauses in English. Amsterdam: Rodopi. Ehrlich, Susan (). Point of View: A Linguistic Analysis of Literary Style. London: Routledge. Ehrlich, Victor () []. Russian Formalism: History and Doctrine. The Hague: Mouton. Einstein, Albert (). Ideas and Opinions. New York: Crown Publishers. Elbourne, Paul D. (). Situations and Individuals. Cambridge, MA: MIT Press. Elbourne, Paul D. (). Meaning: A Slim Guide to Semantics. Oxford: Oxford University Press. Elbourne, Paul D. (). ‘Incomplete descriptions and indistinguishable participants.’ Natural Language Semantics (): –. Ellis, Nick C. (). ‘Construction grammar and second language acquisition’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Elsness, Johan (). The Perfect and the Preterite in Contemporary and Earlier English. Berlin/New York: Mouton de Gruyter. Elsness, Johan (). ‘The present perfect and the preterite’, in Günter Rohdenburg and Julia Schlüter (eds), One Language, Two Grammars? Differences between British and American English. Cambridge: Cambridge University Press, –. Embick, David (). The Morpheme: A Theoretical Introduction. Berlin/New York: Mouton de Gruyter. Embick, David, and Rolf Noyer (). ‘Distributed morphology and the syntax–morphology interface’, in Gillian Ramchand and Charles Reiss (eds), The Oxford Handbook of Linguistic Interfaces. Oxford: Oxford University Press, –. Emmott, Catherine (). Narrative Comprehension: A Discourse Perspective. Oxford: Oxford University Press. Emonds, Joseph E. (). ‘Evidence that indirect object movement is a structure-preserving rule.’ Foundations of Language : –. Emonds, Joseph (). A Transformational Approach to English Syntax. New York: Academic Press.
Emonds, Joseph (). A Unified Theory of Syntactic Categories. Dordrecht: Foris Publications. Emons, Rudolf (). Valenzen Englischer Prädikatsverben. Tübingen: Niemeyer. Emons, Rudolf (). Valenzgrammatik für das Englische. Tübingen: Niemeyer. Engel, Ulrich (). ‘Bemerkungen zur Dependenzgrammatik.’ Sprache der Gegenwart. Jahrbuch des Instituts für deutsche Sprache, –. Engel, Ulrich (). Syntax der Deutschen Gegenwartssprache. Berlin: Schmidt. Engel, Ulrich (). ‘Dependenz ohne Konstituenz: Zur Dogmenbildung in der Linguistik.’ Neuphilologische Mitteilungen (): –. Engel, Ulrich, and Helmut Schumacher (). Kleines Valenzlexikon deutscher Verben. Tübingen: Narr. Engelberg, Stefan, Svenja König, Kristel Proost, and Edeltraut Winkler (). ‘Argumentstrukturmuster als Konstruktionen? Identität – Verwandtschaft – Idiosynkrasien’, in Stefan Engelberg, Anke Holler, and Kristel Proost (eds), Sprachliches Wissen Zwischen Lexikon und Grammatik. Berlin/New York: Mouton de Gruyter. Englebretson, Robert (). ‘Genre and grammar: Predicative and attributive adjectives in spoken English’, in Matthew L. Juge and Jeri L. Moxley (eds), Proceedings of the Twenty-Third Annual Meeting of the Berkeley Linguistics Society: General Session and Parasession on Pragmatics and Grammatical Structure. Berkeley, CA: Berkeley Linguistics Society. Epstein, Samuel David, Erich M. Groat, Ruriko Kawashima, and Hisatsugu Kitahara (). A Derivational Approach to Syntactic Relations. Oxford: Oxford University Press. Erteschik-Shir, Nomi (). ‘Discourse constraints on dative movement’, in Talmy Givón (ed), Syntax and Semantics. New York: Academic Press, –. Erteschik-Shir, Nomi (). Information Structure: The Syntax-Discourse Interface. Oxford: Oxford University Press. Evans, Nicholas (). ‘Insubordination and its uses’, in Irina Nikolaeva (ed), Finiteness: Theoretical and Empirical Foundations. Oxford: Oxford University Press, –. Evans, Nicholas, and Stephen Levinson (). ‘The myth of language universals: Language diversity and its importance for cognitive science.’ Behavioral and Brain Sciences (): –. Evans, Vyvyan (). ‘From the spatial to the non-spatial: The “state” lexical concepts of in, on and at’, in Vyvyan Evans and Paul Chilton (eds), Language, Cognition and Space. London: Equinox, –. Evans, Vyvyan, and Melanie Green (). Cognitive Linguistics: An Introduction. Edinburgh: Edinburgh University Press. Fabb, Nigel (). ‘English suffixation is constrained only by selectional restrictions.’ Natural Language and Linguistic Theory : –. Fairclough, Norman (). Discourse and Social Change. Cambridge: Polity Press. Faltz, Leonard M. (). Reflexivization: A Study in Universal Syntax. New York: Garland. Farmer, Thomas A., Morten H. Christiansen, and Padraic Monaghan (). ‘Phonological typicality influences on-line sentence comprehension.’ PNAS : –. Fauconnier, Gilles (). Mappings in Thought and Language. Cambridge: Cambridge University Press. Fauconnier, Gilles, and Mark Turner (). The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities. New York: Basic Books. Fawcett, Robin P. (). ‘The English personal pronouns: An exercise in linguistic theory’, in James D. Benson, Michael J. Cummings, and William S. Greaves (eds), Linguistics in a Systemic Perspective. Amsterdam/Philadelphia: John Benjamins, –.
Fawcett, Robin P. (). A Theory of Syntax for Systemic Functional Linguistics. Amsterdam/ Philadelphia: John Benjamins. Fawcett, Robin P. (). Invitation to Systemic Functional Linguistics through the Cardiff Grammar: An Extension and Simplification of Halliday’s Systemic Functional Grammar, rd edn. London/Oakville: Equinox. Fawcett, Robin P. (). The Functional Syntax Handbook: Analyzing English at the Level of Form. London: Equinox. Featherston, Sam (). ‘The Decathlon Model of empirical syntax’, in Marga Reis and Stephan Kepser (eds), Linguistic Evidence: Empirical, Theoretical, and Computational Perspectives. Berlin/New York: Mouton de Gruyter, –. Featherston, Sam (). ‘Data in generative grammar: The stick and the carrot.’ Theoretical Linguistics : – Fellbaum, Christiane (ed) (). WordNet: An Electronic Lexical Database. Cambridge, MA: MIT Press. Fens-de Zeeuw, Lyda (). Lindley Murray (–), Quaker and Grammarian. Utrecht: LOT. Ferguson, Charles A. (). ‘Dialect, register, and genre: Working assumptions about conventionalization’, in Douglas Biber and Edward Finegan (eds), Sociolinguistic Perspectives on Register. New York: Oxford University Press, –. Ferreira, Fernanda (). ‘Psycholinguistics, formal grammars, and cognitive science.’ The Linguistic Review : –. Ferris, D. Connor (). The Meaning of Syntax: A Study in the Adjectives of English. London: Longman. Féry, Caroline, and Vieri Samek-Lodovici (). ‘Focus projection and prosodic prominence in nested foci.’ Language, : –. Fiebach, Christian J., Matthias Schlesewsky, and Angela D. Friederici (). ‘Separating syntactic memory costs and syntactic integration costs during parsing: The processing of German wh-questions.’ Journal of Memory and Language : –. Fiengo, Robert (). Asking Questions: Using Meaningful Structures to Imply Ignorance. Oxford: Oxford University Press. Fiengo, Robert, and Robert May (). Indices and Identity. Cambridge, MA: MIT Press. Fillmore, Charles J. (). ‘The case for case’, in Emmon Bach and Robert T. Harms (eds), Universals in Linguistic Theory. New York: Holt, Rinehart and Winston, –. Fillmore, Charles J. (). ‘Frame semantics and the nature of language.’ Annals of the New York Academy of Sciences (): –. Fillmore, Charles J. (). ‘The mechanisms of “construction grammar”’, in Shelley Axmaker, Annie Jassier, and Helen Singmaster (eds), General Session and Parasession on Grammaticalization. Proceedings of the Fourteenth Annual Meeting. Berkeley, CA: Berkeley Linguistics Society, –. Fillmore, Charles J. (). ‘Inversion and constructional inheritance’, in Gert Webelhuth, Jean-Pierre Koenig, and Andreas Kathol (eds), Lexical and Constructional Aspects of Linguistic Explanation (Studies in Constraint-Based Lexicalism). Stanford: CSLI Publications, –. Fillmore, Charles J. (). ‘Frames, constructions and FrameNet’, in Thomas Herbst, HansJörg Schmid, and Susen Faulhaber (eds), Constructions – Collocations – Patterns. Berlin/ Boston: Mouton de Gruyter, –. Fillmore, Charles J., Paul Kay, and Mary Catherine O’Connor (). ‘Regularity and idiomaticity in grammatical constructions: The case of let alone.’ Language (): –.
Filppula, Markku (). A Grammar of Irish English: Language in Hibernian Style. London: Routledge. Filppula, Markku (). ‘Irish English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Findlay, Jamie Yates (). The Prepositional Passive: A Lexical Functional Account. MA thesis. Oxford: University of Oxford. Findlay, Jamie Yates (). ‘Mapping Theory Without Argument Structure.’ Journal of Language Modelling (): –. Firbas, Jan (). Functional Sentence Perspective in Written and Spoken Communication. Cambridge: Cambridge University Press. Firth, John Rupert (). Papers in Linguistics –. London: Oxford University Press. Fischer, Kerstin, and Anatol Stefanowitsch (eds) (). Konstruktionsgrammatik: Von der Anwendung zur Theorie. Tübingen: Stauffenburg. Fischer, Olga (). ‘The development of the modals in English: Radical versus gradual changes’, in David Hart (ed), English Modality in Context: Diachronic Perspectives. Bern: Lang, –. Fischer, Olga (). Morphosyntactic Change: Functional and Formal Perspectives. Oxford: Oxford University Press. Fischer, Olga, and Wim van der Wurff (). ‘Syntax’, in Richard Hogg and David Denison (eds), A History of the English Language. Cambridge: Cambridge University Press, –. Fitting, Melvin (). ‘Intensional logic’, in Edward N. Zalta (ed), The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford: CSLI, Summer edn. Fleischman, Suzanne (). The Future in Thought and Language. Cambridge: Cambridge University Press. Fleischman, Suzanne (). ‘Language and medicine’, in Deborah Schiffrin, Deborah Tannen, and Heidi E. Hamilton (eds), The Handbook of Discourse Analysis. Malden, MA: Blackwell, –. Fligelstone, Steven, Mike Pacey, and Paul Rayson (). ‘How to generalize the task of annotation’, in Roger Garside, Geoff Leech, and Anthony McEnery (eds), Corpus Annotation: Linguistic Information from Computer Text Corpora. London: Longman, –. Fligelstone, Steven, Paul Rayson, and Nicholas Smith (). ‘Template analysis: Bridging the gap between grammar and the lexicon’, in Jenny Thomas and Mick Short (eds), Using Corpora for Language Research. London: Longman, –. Fodor, Jerry A. (). The Language of Thought. Cambridge, MA: Harvard University Press. Fodor, Jerry A. (). The Modularity of Mind. Cambridge, MA: MIT Press. Foley, William A., and Robert D. Van Valin Jr. (). Functional Syntax and Universal Grammar. Cambridge: Cambridge University Press. Folli, Raffaella, and Heidi Harley (). ‘Causation, obligation, and argument structure: On the nature of little v.’ Linguistic Inquiry : –. Foolen, Ad (). Review of Sweetser (). Lingua : –. Ford, Cecilia E. (). Grammar in Interaction: Adverbial Clauses in American English Conversations. Cambridge: Cambridge University Press. Fowler, Roger (). Linguistics and the Novel. London: Methuen. Fowler, Roger (). Language in the News. London: Routledge. Fowler, Roger, Bob Hodge, Gunther Kress, and Tony Trew (). Language and Control. London: Routledge. Fox, Barbara A. (). ‘Principles shaping grammatical practices: An exploration.’ Discourse Studies (): –.
Francis, Gill (). ‘A corpus-driven approach to grammar’, in Mona Baker, Gill Francis, and Elena Tognini-Bonelli (eds), Text and Technology. Amsterdam/Philadelphia: John Benjamins, –. Francis, W. Nelson (). ‘Otto Jespersen as grammarian’, in Arne Juul and Hans F. Nielsen (eds), Otto Jespersen: Facets of His Life and Work. Amsterdam/Philadelphia: John Benjamins, –. Franke, Michael (). ‘Quantity implicatures, exhaustive interpretation, and rational conversation.’ Semantics and Pragmatics (): –. Frawley, William (). Linguistic Semantics. Hillsdale: Lawrence Erlbaum. Frawley, William (ed) (). The Expression of Modality. Berlin/New York: Mouton de Gruyter. Frazier, Lyn, and Giovanni B. Flores d’Arcais (). ‘Filler driven parsing: A study of gap filling in Dutch.’ Journal of Memory and Language : –. Freed, Alice F. (). The Semantics of English Aspectual Complementation. Dordrecht: D. Reidel. Frege, Gottlob (). ‘Über Sinn und Bedeutung.’ Zeitschrift für Philosophie und Philosophische Kritik : –. Translated as Frege (). Frege, Gottlob (–/). ‘Thought/Der Gedanke’: The Frege Reader. Oxford: Blackwell, –. Frege, Gottlob (). ‘On sense and reference’, in Peter T. Geach and Max Black (eds), Translations from the Philosophical Writings of Gottlob Frege. Oxford: Blackwell, –. Translation of Frege (). Freidin, Robert (). Generative Grammar: Theory and its History. London: Routledge. Fried, Mirjam (). ‘Principles of constructional change’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Fried, Mirjam (). ‘Construction grammar’, in Artemis Alexiadou and Tibor Kiss (eds), Handbook of Syntax, nd edn. Berlin/New York: Mouton de Gruyter, –. Friederici, Angela D. (). ‘Towards a neural basis of auditory sentence processing.’ Trends in Cognitive Sciences : –. Friederici, Angela D., and Stefan Frisch (). ‘Verb argument structure processing: The role of verb-specific and argument-specific information.’ Journal of Memory and Language : –. Friederici, Angela D., Anja Hahne, and Axel Mecklinger (). ‘Temporal structure of syntactic parsing: Early and late event-related brain potential effects.’ Journal of Experimental Psychology: Learning, Memory, and Cognition : –. Friederici, Angela D., Martin Meyer, and D. Yves von Cramon (). ‘Auditory language comprehension: An event-related fMRI study on the processing of syntactic and lexical information.’ Brain and Language : –. Friederici, Angela D., Erdmut Pfeifer, and Anja Hahne (). ‘Event-related brain potentials during natural speech processing: Effects of semantic, morphological and syntactic violations.’ Cognitive Brain Research : –. Friederici, Angela D., Jörg Bahlmann, Stefan Heim, Ricarda I. Schubotz, and Alfred Anwander (). ‘The brain differentiates human and non-human grammars: Functional localization and structural connectivity.’ Proceedings of the National Academy of Sciences of the United States of America : . Fries, Charles C. (). The Structure of English. New York: Harcourt Brace.
Fries, Peter H. (). ‘On the status of theme in English: Arguments from discourse.’ Forum Linguisticum (): –. Frisch, Stefan, Matthias Schlesewsky, Douglas Saddy, and Annegret Alpermann (). ‘The P as an indicator of syntactic ambiguity.’ Cognition : B–B. Fukui, Naoki, and Margaret Speas (). ‘Specifiers and projections.’ MIT Working Papers in Linguistics : –. Gabelentz, Georg von der (). Die Sprachwissenschaft, nd edn. Tübingen: Polyfoto Dr. Vogt. Gabrielatos, Costas and Paul Baker (). ‘Fleeing, sneaking, flooding: A corpus analysis of discursive constructions of refugees and asylum seekers in the UK press -.’ Journal of English Linguistics : –. Gagné, Christina (). ‘Relational and lexical priming during the interpretation of nounnoun combinations.’ Journal of Experimental Psychology: Learning, Memory and Cognition : –. Gahl, Susanne, and Susan Marie Garnsey (). ‘Knowledge of grammar includes knowledge of syntactic probabilities.’ Language : –. Gamut, L. T. F. (a). Intensional Logic and Logical Grammar, vol. of Logic, Language, and Meaning. Chicago: University of Chicago Press. Gamut, L. T. F. (b). Introduction to Logic, vol. of Logic, Language, and Meaning. Chicago: University of Chicago Press. Garnsey, Susan M., Michael Tanenhaus, and Robert M. Chapman (). ‘Evoked potential and the study of sentence comprehension.’ Journal of Psycholinguistic Research : –. Garside, Roger (). ‘The CLAWS word-tagging system’, in Roger Garside, Geoff Leech, and Geoff Sampson (eds), The Computational Analysis of English: A Corpus-based Approach. London: Longman. Garside, Roger, and Nicholas Smith (). ‘A hybrid grammatical tagger: CLAWS,’ in Roger Garside, Geoff Leech, and Anthony McEnery (eds), Corpus Annotation: Linguistic Information from Computer Text Corpora. London: Longman, –. Gast, Volker (). ‘I gave it him – on the motivation of the alternative double object construction in varieties of British English’, in Anna Siewierska and Willem Hollmann (eds), Ditransitivity: Special Issue of Functions of Language (): –. Gavins, Joanna (). Text World Theory: An Introduction. Edinburgh: Edinburgh University Press. Gazdar, Gerald (a). ‘Unbounded dependencies and coordinate structure.’ Linguistic Inquiry : –. Gazdar, Gerald (b). ‘Speech act assignment’, in Aravind K. Joshi, Bonnie Webber, and Ivan A. Sag (eds), Elements of Discourse Understanding. Cambridge: Cambridge University Press. Gazdar, Gerald, Geoffrey K. Pullum, and Ivan A. Sag (). ‘Auxiliaries and related phenomena in a restrictive theory of grammar.’ Language : –. Gazdar, Gerald, Ewan H. Klein, Geoffrey K. Pullum, and Ivan A. Sag (). Generalized Phrase Structure Grammar. Oxford: Blackwell, and Cambridge, MA: Harvard University Press. Geach, Peter T. (). Reference and Generality: An Examination of Some Medieval and Modern Theories. Ithaca, NY: Cornell University Press. rd revised edn, . Gee, James Paul (). Introducing Discourse Analysis: From Grammar to Society. London/ New York: Routledge.
Geeraerts, Dirk (). Theories of Lexical Semantics. Oxford: Oxford University Press. Geeraerts, Dirk, and Hubert Cuyckens (eds) (). The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press. Geis, Michael (). Adverbial Subordinate Clauses in English. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Geluykens, Ronald (). From Discourse Process to Grammatical Construction: On Left Dislocation in English. Amsterdam/Philadelphia: John Benjamins. Geluykens, Ronald (). The Pragmatics of Discourse Anaphora in English. Evidence from Conversational Repair. Berlin/New York: Mouton de Gruyter. Gensler, Orin (). ‘Object ordering in verbs marking two pronominal objects: Nonexplanation and explanation.’ Linguistic Typology (): –. Georgakopoulou, Alexandra, and Dionysis Goutsos (). ‘Revisiting discourse boundaries: The narrative and non-narrative modes.’ Text: Interdisciplinary Journal for the Study of Discourse : –. Gerdes, Kim, Eva Hajičová, and Leo Wanner (eds) (a). Dependency Linguistics: Recent Advances in Linguistic Theory Using Dependency Structures. Amsterdam/Philadelphia: John Benjamins. Gerdes, Kim, Eva Hajičová, and Leo Wanner (b). ‘Foreword’, in Kim Gerdes, Eva Hajičová, and Leo Wanner (eds), Dependency Linguistics: Recent Advances in Linguistic Theory Using Dependency Structures. Amsterdam/Philadelphia: John Benjamins. Gerken, L., P. W. Jusczyk, and D. R. Mandel (). ‘When prosody fails to cue syntactic structure: -month-olds’ sensitivity to phonological versus syntactic phrases.’ Cognition : –. Gerwin, Johanna (). ‘Give it me!: Pronominal ditransitives in English dialects.’ English Language and Linguistics (): –. Geurts, Bart (). ‘Scalar implicatures and local pragmatics.’ Mind and Language (): –. Geurts, Bart, and Nausicaa Pouscoulous (). ‘Embedded implicatures?!?’ Semantics and Pragmatics (): –. Geurts, Bart, and Bob van Tiel (). ‘Embedded scalars.’ Semantics and Pragmatics (): –. Giannakidou, Anastasia (). ‘The prospective as nonveridical: Polarity items, speaker commitment and projected truth’, in Jack Hoeksema and Dicky Gilbers (eds), Black Book: A Festschrift for Frans Zwarts. Groningen: University of Groningen, –. Gibson, Edward, and Evelina Fedorenko (). ‘Weak quantitative standards in linguistics research.’ Trends in Cognitive Sciences : –. Gibson, Edward, and Evelina Fedorenko (). ‘The need for quantitative methods in syntax and semantics research.’ Language and Cognitive Processes : –. Giegerich, Heinz J. (). Lexical Strata in English: Morphological Causes, Phonological Effects. Cambridge: Cambridge University Press. Giegerich, Heinz J. (). ‘Compound or phrase? English noun-plus-noun constructions and the stress criterion.’ English Language and Linguistics : –. Giegerich, Heinz J. (). ‘Associative adjectives in English and the lexicon-syntax interface.’ Journal of Linguistics : –. Giegerich, Heinz J. (). ‘The morphology of -ly and the categorial status of “adverbs” in English.’ English Language and Linguistics : –. Ginzburg, Jonathan (). The Interactive Stance: Meaning for Conversation. Oxford: Oxford University Press.
Ginzburg, Jonathan, and Raquel Fernández (). ‘Computational models of dialogue’, in Alexander Clark, Chris Fox, and Shalom Lappin (eds), The Handbook of Computational Linguistics and Natural Language Processing. Oxford: Blackwell, –. Ginzburg, Jonathan, and Ivan A. Sag (). Interrogative Investigations: The Form, Meaning and Use of English Interrogatives. Stanford: CSLI Publications. Giorgi, Alessandra, and Fabio Pianesi (). Tense and Aspect: From Semantics to Morphosyntax. Oxford: Oxford University Press. Girard, Jean-Yves (). ‘Linear logic.’ Theoretical Computer Science (): –. Gisborne, Nikolas (). ‘Dynamic modality.’ SKASE Journal of Theoretical Linguistics : –. Gisborne, Nikolas (). ‘Dependencies are constructions’, in Graeme Trousdale and Nikolas Gisborne (eds), Constructional Approaches to English Grammar. Berlin/New York: Mouton de Gruyter, –. Gisborne, Nikolas (). The Event Structure of Perception Verbs. Oxford/New York: Oxford University Press. Givón, Talmy (). ‘Historical syntax and synchronic morphology.’ Chicago Linguistics Society : –. Givón, Talmy (a). ‘From discourse to syntax: grammar as a processing strategy’, in Talmy Givón (ed), Syntax and Semantics, Vol. . Discourse and Syntax. New York: Academic Press, –. Givón, Talmy (b). On Understanding Grammar. New York: Academic Press. Givón, Talmy (). ‘The binding hierarchy and the typology of complements.’ Studies in Language : –. Givón, Talmy (). ‘Topic continuity in discourse: An introduction’, in Talmy Givón (ed), Topic Continuity in Discourse: A Quantitative Cross-Language Study. Amsterdam/Philadelphia: John Benjamins, –. Givón, Talmy (a). Syntax: A Functional-Typological Introduction. Vol. I. Amsterdam/ Philadelphia: John Benjamins. Givón, Talmy (b). Syntax: A Functional-Typological Introduction. Vol. II. Amsterdam/ Philadelphia: John Benjamins. Givón, Talmy (). English Grammar. volumes. Amsterdam/Philadelphia: John Benjamins. Givón, Talmy (). ‘Irrealis and the subjunctive.’ Studies in Language : –. Givón, Talmy (a). Syntax: An Introduction. Vol. . Amsterdam/Philadelphia: John Benjamins. Givón, Talmy (b). Syntax: An Introduction. Vol. . Amsterdam/Philadelphia: John Benjamins. Givón, Talmy (). ‘The adaptive approach to grammar’, in Bernd Heine and Heiko Narrog (eds), The Oxford Handbook of Linguistic Analysis. Oxford: Oxford University Press, –. Givón, Talmy (). ‘The intellectual roots of functionalism in linguistics’, in Shannon Bischoff and Carmen Jany (eds), Functional Approaches to Language. Berlin/New York: Mouton de Gruyter, –. Godfrey, Elizabeth, and Sali Tagliamonte (). ‘Another piece of the verbal -s story: Evidence from Devon in Southwest England.’ Language Variation and Change (): –. Goldberg, Adele E. (). Constructions: A Construction Grammar Approach to Argument Structure. Chicago: Chicago University Press. Goldberg, Adele E. (). ‘Jackendoff and construction-based grammar.’ Cognitive Science : –.
Goldberg, Adele E. (). ‘Patterns of experience in patterns of language’, in Michael Tomasello (ed), The New Psychology of Language. Mahwah, NJ: Erlbaum, –. Goldberg, Adele E. (). ‘Constructions: A new theoretical approach to language.’ Trends in Cognitive Sciences (): –. Goldberg, Adele (a). Constructions at Work: The Nature of Generalization in Language. Oxford/New York: Oxford University Press. Goldberg, Adele E. (b). ‘The inherent semantics of argument structure: The case of the English ditransitive construction’, in Dirk Geeraerts (ed), Cognitive Linguistics: Basic Readings. Berlin/New York: Mouton de Gruyter, –. Goldberg, Adele E. (). Explain Me This: Creativity, Competition, and the Partial Productivity of Constructions. Princeton, NJ: Princeton University Press. Goldberg, Adele E., and Farrell Ackerman (). ‘The pragmatics of obligatory adjuncts.’ Language : –. Goldberg, Adele E., and Ray Jackendoff (). ‘The English resultative as a family of constructions.’ Language (): –. Gómez-González, María Ángeles (). The Theme–Topic Interface: Evidence from English. Amsterdam/Philadelphia: John Benjamins. Goossens, Louis (). ‘Patterns of meaning extension, “parallel chaining”, subjectification, and modal shifts’, in Antonio Barcelona (ed), Metaphor and Metonymy at the Crossroads: A Cognitive Perspective. Berlin/New York: Mouton de Gruyter, –. Gotti, Maurizio (). ‘A new genre for a specialized community: The rise of the experimental essay’, in Heidrun Dorgeloh and Anja Wanner (eds), Syntactic Variation and Genre. Berlin/New York: Mouton de Gruyter, –. Grabe, Esther, Greg Kochanski, and John Coleman (). ‘The intonation of native accent varieties in the British Isles: Potential for miscommunication?’, in: Katarzyna DziubalskaKolaczyk and Joanna Przedlacka (eds), English Pronunciation Models: A Changing Scene. Oxford: Peter Lang. Grafmiller, Jason (). ‘Variation in English genitives across modality and genres.’ English Language and Linguistics : –. Graustein, Gottfried, and Gerhard Leitner (eds) (). Reference Grammars and Modern Linguistic Theory. Tübingen: Max Niemeyer. Gray, Bethany, and Douglas Biber (). ‘Stance markers’, in Karin Aijmer and Christoph Rühlemann (eds), Corpus Pragmatics: A Handbook. Cambridge: Cambridge University Press, –. Greaves, Paul (). Grammatica Anglicana. Canterbury: Legatt. Greenbaum, Sidney, and Randolph Quirk (). Elicitation Experiments in English. Coral Gables, FL: University of Miami Press. Greenberg, Joseph (). Language Universals: With Special Reference to Feature Hierarchies. The Hague: Mouton. Gregory, Michelle L., and Laura A. Michaelis (). ‘Topicalization and left-dislocation: A functional opposition revisited.’ Journal of Pragmatics : –. Grewe, Tanja, Ina Bornkessel, Stefan Zysset, Richard Wiese, D. Yves von Cramon, and Matthias Schlesewsky (). ‘The emergence of the unmarked: A new perspective on the language-specific function of Broca’s area.’ Human Brain Mapping : –. Grice, H. Paul (). ‘Logic and conversation’, in Peter Cole and Jerry L. Morgan (eds), Studies in Syntax and Semantics III: Speech Acts. New York: Academic Press, –. Reprinted in Grice (: –).
Grice, H. Paul (). Studies in the Way of Words. Cambridge, MA: Harvard University Press. Gries, Stefan Th. (). ‘Syntactic priming: A corpus-based approach.’ Journal of Psycholinguistic Research (): –. Gries, Stefan Th. (). ‘Frequencies, probabilities, and association measures in usage-/ exemplar-based linguistics.’ Studies in Language (): –. Gries, Stefan Th. (). ‘-something years of work on collocations: What is or should be next . . . ’ International Journal of Corpus Linguistics (): –. Gries, Stefan Th. (). ‘The most underused statistical method in corpus linguistics: Multilevel (and mixed-effects) models.’ Corpora (): –. Gries, Stefan Th., and Anatol Stefanowitsch (a). ‘Extending collostructional analysis. A corpus-based perspective on “alternations”.’ International Journal of Corpus Linguistics (): –. Gries, Stefan Th., and Anatol Stefanowitsch (b). ‘Covarying collexemes in the Intocausative’, in Michel Achard and Suzanne Kemmer (eds), Language, Culture, and Mind. Center for the Study of Language and Information Publications, Stanford, –. Gries, Stefan Th., and Stefanie Wulff (). ‘Do foreign language learners also have constructions? Evidence from priming, sorting, and corpora.’ Annual Review of Cognitive Linguistics : –. Grimshaw, Jane (). Argument Structure. Cambridge, MA/London: MIT Press. Grimshaw, Jane, and Sten Vikner (). ‘Obligatory adjuncts and the structure of events’, in Eric Reuland and Werner Abraham (eds), Knowledge and Language. Dordrecht: Kluwer Academic Publishers, –. Grodzinsky, Yosef (). ‘Language deficits and the theory of syntax.’ Brain and Language : –. Grodzinsky, Yosef, and Andrea Santi (). ‘The battle for Broca’s region.’ Trends in Cognitive Sciences : –. Groenendijk, Jeroen, and Martin Stokhof (). ‘Dynamic predicate logic.’ Linguistics and Philosophy : –. Gropen, Jess, Steven Pinker, Michelle Hollander, Richard Goldberg, and Ronald Wilson (). ‘The learnability and acquisition of the dative alternation in English.’ Language (): –. Groß, Thomas, and Timothy Osborne (). ‘Toward a Practical Dependency Grammar Theory of Discontinuities.’ SKY Journal of Linguistics : –. Grossman, Eitan, and Stéphane Polis (). ‘On the pragmatics of subjectification: The grammaticalization of verbless allative futures (with a case study in Ancient Egyptian)’. Paper presented at Gramis (International Conference on Grammaticalization and (Inter)Subjectification. Royal Flemish Academy, Brussels, – November . Grosu, A. (), ‘A unified theory of “standard” and “transparent” free relatives.’ Natural Language and Linguistic Theory : –. Gruber, Jeffrey S. (). Studies in Lexical Relations. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Guéron, Jacqueline (ed) (). Sentence and Discourse (Oxford Studies in Theoretical Linguistics). Oxford/New York: Oxford University Press. Guéron, Jacqueline, and Jacqueline Lecarme (eds) (). Time and Modality. Houten, NL: Springer Media BV. Guevara, Emiliano, and Sergio Scalise (). ‘Searching for universals in compounding’, in Sergio Scalise, Elisabetta Magni, and Antonietta Bisetto (eds), Universals of Language Today. Berlin: Springer, –.
Guion, Susan G., J. J. Clark, Tetsuo Harada, and Ratree P. Wayland (). ‘Factors affecting stress placement for English nonwords include syllabic structure, lexical class, and stress patterns of phonologically similar words.’ Language and Speech, : –. Gumperz, John J. (). Discourse Strategies. Cambridge: Cambridge University Press. Gundel, Jeanette K. (). The Role of Topic and Comment in Linguistic Theory. Bloomington, IN: Indiana University Linguistics Club. Gundel, Jeanette K. (). ‘Shared knowledge and topicality.’ Journal of Pragmatics : –. Gundel, Jeanette K. (). ‘Universals of topic-comment structure’, in Michael Hammond, Edith Moravcsik, and Jessica Wirth (eds), Studies in Syntactic Typology. Amsterdam/ Philadelphia: John Benjamins, –. Gundel, Jeanette K., Nancy Hedberg, and Ron Zacharski (). ‘Cognitive status and the form of referring expressions in discourse.’ Language (): –. Gunter, Thomas C., Laurie A. Stowe, and Gusbertus Mulder (). ‘When syntax meets semantics.’ Psychophysiology : –. Günther, Christine (). ‘Noun ellipsis in English: Adjectival modifiers and the role of context.’ English Language and Linguistics (): –. Gussenhoven, Carlos (). On the Grammar and Semantics of Sentence Accents. Dordrecht: Foris Publications. Gussenhoven, Carlos (). The Phonology of Tone and Intonation. Cambridge: Cambridge University Press. Gwynne, N. M. (). Gwynne’s Grammar: The Ultimate Introduction to Grammar and the Writing of Good English. London: Idler Books. Haas, W. (). Review of John Lyons, Introduction to Theoretical Linguistics. Journal of Linguistics : –. Haegeman, Liliane (). The Syntax of Negation. Cambridge: Cambridge University Press. Haegeman, Liliane (). Thinking Syntactically: A Guide to Argumentation and Analysis. (Blackwell Textbooks in Linguistics .) Oxford: Blackwell. Haegeman, Liliane (). Adverbial Clauses, Main Clause Phenomena, and Composition of the Left Periphery. Oxford: Oxford University Press. Haegeman, Liliane, and Jacqueline Guéron (). English Grammar: A Generative Perspective. Oxford: Blackwell. Haegeman, Liliane, and Tabea Ihsane (). ‘Subject ellipsis in embedded clauses in English.’ Journal of English Language and Linguistics : –. Hagège, Claude (). ‘Towards a typology of interrogative verbs.’ Linguistic Typology : –. Hagoort, Peter (). ‘The fractionation of spoken language understanding by measuring electrical and magnetic brain signals.’ Philosophical Transactions of the Royal Society B: Biological Sciences : . Hagoort, Peter, Colin Brown, and Jolanda Groothusen (). ‘The syntactic positive shift (SPS) as an ERP measure of syntactic processing.’ Language and Cognitive Processes : –. Hagoort, Peter, Marlies Wassenaar, and Colin M. Brown (). ‘Syntax-related ERP-effects in Dutch.’ Cognitive Brain Research : –. Hahne, Anja, and Angela D. Friederici (). ‘Electrophysiological evidence for two steps in syntactic analysis: Early automatic and late controlled processes.’ Journal of Cognitive Neuroscience : –.
Hahne, Anja, and Angela D. Friederici (). ‘Differential task effects on semantic and syntactic processes as revealed by ERPs.’ Cognitive Brain Research : –. Haig, Geoffrey, and Stefan Schnell (). ‘The discourse basis of ergativity revisited.’ Language (): –. Hale, Kenneth, and Samuel Jay Keyser (). ‘On argument structure and the lexical expression of syntactic relations’, in Kenneth Hale and Samuel Jay Keyser (eds), The View from Building : Essays in Linguistics in Honor of Sylvain Bromberger. Cambridge, MA: MIT Press, –. Hale, Kenneth, and Samuel Jay Keyser (). Prolegomenon to a Theory of Argument Structure. Cambridge, MA: MIT Press. Hall, Alison (). ‘Free enrichment or hidden indexicals?’ Mind and Language (): –. Halle, Morris (). The Sound Pattern of Russian: A Linguistic and Acoustical Investigation. Berlin/New York: Mouton de Gruyter. Halle, Morris, and Alec Marantz (). ‘Distributed morphology and the pieces of inflection.’ in Kenneth Hale and Samuel Jay Keyser (eds), The View from Building : Essays in Honor of Sylvain Bromberger. Cambridge, MA: MIT Press, –. Halle, Morris, and Alec Marantz (). ‘Some key features of distributed morphology.’ MIT Working Papers in Linguistics : –. Halle, Morris, and Jean-Roger Vergnaud (). An Essay on Stress. Cambridge, MA: MIT Press. Halliday, M. A. K. (). ‘Categories of the theory of grammar.’ Word : –. Halliday, M. A. K. (a). ‘Notes on transitivity and theme in English: Part .’ Journal of Linguistics : –. Halliday, M. A. K. (b). Intonation and Grammar in British English. The Hague: Mouton. Halliday, M. A. K. (a). ‘A brief sketch of systemic grammar’, in Gunther Kress (ed), Halliday: System and Function in Language. London: Oxford University Press, –. Halliday, M. A. K. (b). ‘The form of functional grammar’, in Gunther Kress (ed), Halliday: System and Function in Language. London: Oxford University Press, –. Halliday, M. A. K. (). An Introduction to Functional Grammar. London: Edward Arnold. Halliday, M. A. K. (). ‘Corpus studies and probabilistic grammar’, in Karin Aijmer and Bengt Altenberg (eds), English Corpus Linguistics. London: Longman, –. Halliday, M. A. K. (). ‘Language as system and language as instance: The corpus as a theoretical construct’, in Jan Svartvik (ed), Trends in Linguistics. Studies and Monographs. Directions in Corpus Linguistics. Berlin/New York: Mouton de Gruyter, –. Halliday, M. A. K. (). An Introduction to Functional Grammar, nd edn. London: Edward Arnold. Halliday, M. A. K., () rd edn, revised by Christian M. I. M. Matthiessen. An Introduction to Functional Grammar. London: Edward Arnold. Halliday, M. A. K. (). Halliday’s Introduction to Functional Grammar, th edn, revised by Christian M. I. M. Matthiessen. London: Routledge. Halliday, M. A. K., and Ruqaiya Hasan (). Cohesion in English. London: Longman. Hamawand, Zeki (). Atemporal Complement Clauses in English: A Cognitive Grammar Analysis. Munich: Lincom Europa. Hamblin, Charles L. (). Imperatives. London: Blackwell. Hamilton, William L., Jure Leskovec, and Dan Jurafsky (nd). ‘Diachronic word embeddings reveal statistical laws of semantic change.’ https://web.stanford.edu/~jurafsky/pubs/paperhist_vec.pdf.
Han, Chung-Hye (). ‘Imperatives’, in Klaus von Heusinger, Claudia Maienborn, and Paul Portner (eds), Semantics . Handbooks of Linguistics and Communication Science (). Berlin/New York: Mouton de Gruyter, –. Harley, Heidi (). ‘The morphology of nominalizations and the syntax of vP’, in Anastasia Giannakidou and Monika Rathert (eds), Quantification, Definiteness, and Nominalization. Oxford: Oxford University Press, –. Harley, Heidi (). ‘A minimalist approach to argument structure’, in Cedric Boeckx (ed), The Oxford Handbook of Linguistic Minimalism. Oxford: Oxford University Press, –. Harley, Heidi (). ‘The syntax/morphology interface,’ in Tibor Kiss and Artemis Alexiadou (eds), Syntax – Theory and Analysis: An International Handbook, Handbücher zur Sprach- und Kommunikationswissenschaft. Berlin/New York: Mouton de Gruyter, –. Harley, Heidi, and Rolf Noyer (). ‘Distributed morphology.’ Glot International, Vol. (): –. Harman, Gilbert H. (). ‘Generative grammars without transformation rules: A defense of phrase structure.’ Language : –. Harnish, Robert (). ‘Mood, meaning and speech acts’, in Savas Tsohatzidis (ed), Foundations of Speech Act Theory: Philosophical and Linguistic Perspectives. London: Routledge, –. Harris, Randy Allen (). The Linguistics Wars. Oxford: Oxford University Press. Harris, Zellig S. (). Methods in Structural Linguistics. Chicago: University of Chicago Press. Harrison, Chloe, Louise Nuttall, Peter Stockwell, and Wenjuan Yuan (eds) (). Cognitive Grammar in Literature. Amsterdam/Philadelphia: John Benjamins. Harrison, Tony (). Selected Poems. London: Penguin Books. Hart, Christopher (). Discourse, Grammar and Ideology: Functional and Cognitive Perspectives. London: Bloomsbury Publishing. Hart, Christopher (). ‘Event-frames affect blame assignment and perception of aggression in discourse on political protests: An experimental case study in critical discourse analysis.’ Applied Linguistics : –. Harvey, Thomas W. (). A Practical Grammar of the English Language for the Use of Schools of Every Grade. New York/Cincinnati: Van Antwerp, Bragg and Co. (Revised edition .) Hasan, Ruqaiya (). ‘The grammarian’s dream: Lexis as most delicate grammar’, in M. A. K. Halliday and Robin Fawcett (eds), New Developments in Systemic Linguistics : Theory and Description. London: Pinter, –. Haspelmath, Martin (). ‘Optimality and diachronic adaptation.’ Zeitschrift für Sprachwissenschaft : –. Haspelmath, Martin (). ‘The European linguistic area: Standard Average European’, in Martin Haspelmath, Ekkehard König, Wolfgang Oesterreicher, and Wolfgang Raible (eds), Language Typology and Language Universals. Berlin/New York: Mouton de Gruyter, –. Haspelmath, Martin (). Understanding Morphology. London: Edward Arnold. Haspelmath, Martin (). ‘Frequency vs. iconicity in explaining grammatical asymmetries.’ Cognitive Linguistics : –.
Haspelmath, Martin (). ‘Comparative concepts and descriptive categories in crosslinguistic studies.’ Language : –. Haspelmath, Martin (a). ‘Negative indefinite pronouns and predicate negation’, in Matthew Dryer and Martin Haspelmath (eds), The World Atlas of Language Structures Online. Munich: Max Planck Digital Library, chapter . Available online at http://wals.info/chapter/ (last accessed April ). Haspelmath, Martin (b). ‘Ditransitive constructions: The verb “give”’, in Matthew Dryer and Martin Haspelmath (eds), The World Atlas of Language Structures Online. Munich: Max Planck Digital Library, chapter . Available online at http://wals.info/chapter/ (last accessed April ). Hatcher, Anna Granville (). ‘An introduction to the analysis of English noun compounds.’ Word : –. Hawkins, John A. (). A Comparative Typology of English and German: Unifying the Contrasts. London: Croom Helm. Hawkins, John A. (). ‘Syntactic weight versus information structure in word order variation.’ Linguistische Berichte : –. Hawkins, John A. (). ‘Processing complexity and fill-gap dependencies across languages.’ Language : –. Hawkins, John A. (). Efficiency and Complexity in Grammars. Oxford: Oxford University Press. Hay, Jennifer (). Causes and Consequences of Word Structure. Ph.D. thesis. Chicago: Northwestern University. Hay, Jennifer (). ‘From speech perception to morphology: Affix ordering revisited.’ Language, : –. Hay, Jennifer, and Ingo Plag (). ‘What constrains possible suffix combinations? On the interaction of grammatical and processing restrictions in derivational morphology.’ Natural Language and Linguistic Theory : –. Hayes, Bruce, and Colin Wilson (). ‘A maximum entropy model of phonotactics and phonotactic learning.’ Linguistic Inquiry : –. Hayes, Bruce, Bruce Tesar, and Kie Zuraw (). OTSoft ., software package, http://www. linguistics.ucla.edu/people/hayes/otsoft/. Hedberg, Nancy (). ‘Multiple focus and cleft sentences’, in Katharina Hartmann and Tonjes Veenstra (eds), Cleft Structures. Amsterdam/Philadelphia: John Benjamins. Hedberg, Nancy, and Lorna Fadden (). ‘The information structure of it-clefts, wh-clefts and reverse wh-clefts in English’, in Nancy Hedberg and Ron Zacharski (eds), The Grammar–Pragmatics Interface: Essays in Honor of Jeanette K. Gundel. Amsterdam/ Philadelphia: John Benjamins, –. Heger, Klaus (). Monem, Wort, Satz und Text. Tübingen: Niemeyer. Heim, Irene (). The Semantics of Definite and Indefinite Noun Phrases. Ph.D. thesis. Amherst, MA: University of Massachusetts. Heim, Irene (). ‘File change semantics and the familiarity theory of definiteness’, in Rainer Bäuerle, Christoph Schwarze, and Arnim von Stechow (eds), Meaning, Use and Interpretation of Language. Berlin/New York: Mouton de Gruyter, –. Reprinted in Portner and Partee (: –). Heim, Irene (). ‘E-Type pronouns and donkey anaphora.’ Linguistics and Philosophy (): –.
Heim, Irene, and Angelika Kratzer (). Semantics in Generative Grammar. Oxford: Blackwell. Heine, Bernd (). Auxiliaries: Cognitive Forces and Grammaticalization. Oxford: Oxford University Press. Heine, Bernd (). ‘Agent-oriented vs. epistemic modality: Some observations on German modals’, In Bybee and Fleischman (eds) (), –. Heine, Bernd (). ‘On the role of context in grammaticalization’, in Ilse Wischer and Gabriele Diewald (eds), New Reflections on Grammaticalization. Amsterdam/Philadelphia: John Benjamins, –. Heine, Bernd (). ‘On discourse markers: Grammaticalization, pragmaticalization, or something else?’ Linguistics (): –. Heine, Bernd, and Tania Kuteva (). World Lexicon of Grammaticalization. Cambridge: Cambridge University Press. Heine, Bernd, and Tania Kuteva (). The Genesis of Grammar: A Reconstruction. Oxford: Oxford University Press. Heine, Bernd, and Heiko Narrog (eds) (). The Oxford Handbook of Linguistic Analysis, nd edn. Oxford: Oxford University Press. Heine, Bernd, Ulrike Claudi, and Friederike Hünnemeyer (). Grammaticalization: A Conceptual Framework. Chicago: University of Chicago Press. Helbig, Gerhard (). Probleme der Valenz- und Kasustheorie. Tübingen: Niemeyer. Helbig, Gerhard, and Wolfgang Schenkel (). Wörterbuch zur Valenz und Distribution deutscher Verben, nd edn. Leipzig: Bibliographisches Institut. [First edition: .] Hengeveld, Kees (). ‘Layers and operators in Functional Grammar.’ Journal of Linguistics : –. Hengeveld, Kees (). Non-Verbal Predication: Theory, Typology, Diachrony. Berlin/New York: Mouton de Gruyter. Hengeveld, Kees (). ‘The grammaticalization of tense and aspect’, in Heiko Narrog and Bernd Heine (eds), The Oxford Handbook of Grammaticalization. Oxford: Oxford University Press, –. Hengeveld, Kees (). ‘A hierarchical approach to grammaticalization’, in Kees Hengeveld, Heiko Narrog, and Hella Olbertz (eds), The Grammaticalization of Tense, Aspect, Modality, and Evidentiality: A Functional Perspective. Berlin/New York: Mouton de Gruyter, –. Hengeveld, Kees, and J. Lachlan Mackenzie (). Functional Discourse Grammar: A Typologically-based Theory of Language Structure. Oxford: Oxford University Press. Herbst, Thomas (). Untersuchungen zur Valenz englischer Adjektive und ihrer Nominalisierungen. Tübingen: Narr. Herbst, Thomas (). ‘A valency model for nouns in English.’ Journal of Linguistics : –. Herbst, Thomas (). ‘The status of generalisations: Valency and argument structure constructions.’ Zeitschrift für Anglistik und Amerikanistik (): –. Herbst, Thomas (). ‘The valency approach to argument structure constructions’, in Thomas Herbst, Hans-Jörg Schmid, and Susen Faulhaber (eds), Constructions – Collocations – Patterns. Berlin/New York: Mouton de Gruyter, –. Herbst, Thomas (). ‘Why construction grammar catches the worm and corpus data can drive you crazy: Accounting for idiomatic and non-idiomatic idiomaticity.’ Journal of Social Sciences (): –.
Herbst, Thomas, and Susen Schüller (). Introduction to Syntactic Analysis: A Valency Approach. Tübingen: Narr. Herbst, Thomas, and Peter Uhrig (–). Erlangen Valency Patternbank: A Corpus-based Research Tool for Work on Valency and Argument Structure Constructions. www.patternbank.fau.de (last accessed June ). Herbst, Thomas, David Heath, and Hans-Martin Dederding (eds) (). Grimm’s Grandchildren: Current Topics in German Linguistics. Longman Linguistics Library . London: Longman. Herbst, Thomas, David Heath, Ian Roe, and Dieter Götz (eds) (). A Valency Dictionary of English. Berlin/New York: Mouton de Gruyter. Heringer, Hans Jürgen (). Theorie der deutschen Syntax. München: Hueber. [nd edn ] Heringer, Hans Jürgen (). Deutsche Syntax Dependentiell. Tübingen: Stauffenburg. Hermann, Johann Gottfried Jakob (). De Emendanda Ratione Graecae Grammaticae. Leipzig: Apvd Gerhard Fleischer. Herring, Susan C. (). ‘Computer-mediated discourse’, in Deborah Schiffrin, Deborah Tannen, and Heidi E. Hamilton (eds), The Handbook of Discourse Analysis. Oxford: Blackwell, –. Herrmann, Tanja (). Relative Clauses in Dialects of English: A Typological Approach. Ph.D. thesis. Freiburg: University of Freiburg. Hewitt, George (). Abkhaz. Amsterdam: North-Holland. Hewson, John (). Article and Noun in English. The Hague: Mouton. Hewson, John (a). ‘Determiners as heads.’ Cognitive Linguistics : –. Hewson, John (b). ‘The roles of subject and verb in a dependency grammar’, in Werner Bahner, Joachim Schildt, and Dieter Viehweger (eds), Proceedings of the Fourteenth International Congress of Linguists, Vol. III. Berlin: Akademie-Verlag, –. Hewson, John, and Vit Bubenik (). Tense and Aspect in Indo-European Languages. Amsterdam/Philadelphia: John Benjamins. Heycock, Carolyn, and Anthony Kroch (). ‘Topic, focus and syntactic representations’, in Line Mikkelsen and Christopher Potts (eds), WCCFL : Proceedings of the st West Coast Conference on Formal Linguistics. Somerville, MA: Cascadilla Press, –. Hickey, Raymond (ed) (a). Areal Features of the Anglophone World. Berlin/New York: Mouton de Gruyter. Hickey, Raymond (ed) (b). Standards of English: Codified Varieties Around the World. Cambridge: Cambridge University Press. Hill, Archibald A. (). ‘Grammaticality.’ Word : –. Hilpert, Martin (). Germanic Future Constructions: A Usage-based Approach to Language Change. Amsterdam/Philadelphia: John Benjamins. Hilpert, Martin (). ‘Die englischen Modalverben im Daumenkino: Zur dynamischen Visualisierung von Phänomenen des Sprachwandels.’ Zeitschrift für Literaturwissenschaft und Linguistik : –. Hilpert, Martin (). Construction Grammar and its Application to English. Edinburgh: Edinburgh University Press. Hilpert, Martin (). ‘Change in modal meanings: Another look at the shifting collocates of may.’ Constructions and Frames (): –. Hiltunen, Risto (). ‘The grammar and structure of legal texts’, in Lawrence M. Solan and Peter M. Tiersma (eds), The Oxford Handbook of Language and Law. Oxford: Oxford University Press, –.
Himmelmann, Nikolaus, and Eva Schultze-Berndt (). ‘Issues in the syntax and semantics of participant-oriented adjuncts: An introduction’, in Nikolaus Himmelmann and Eva Schultze-Berndt (eds), Secondary Predication and Adverbial Modification: The Typology of Depictives. Oxford: Oxford University Press, –. Hinojosa, José, Manuel Martín-Loeches, Pilar Casado, Francisco Muñoz, and Francisco Rubia (). ‘Similarities and differences between phrase structure and morphosyntactic violations in Spanish: An event-related potentials study.’ Language and Cognitive Processes : –. Hinrichs, Lars, Benedikt Szmrecsanyi, and Axel Bohmann (). ‘Which-hunting and the Standard English relative clause.’ Language (): –. Hinzen, Wolfram, Edouard Machery, and Markus Werning (). The Oxford Handbook of Compositionality. Oxford: Oxford University Press. Hirschberg, Julia (). A Theory of Scalar Implicature. Ph.D. thesis. Philadelphia, PA: University of Pennsylvania. Hirtle, Walter H. (a). Language in the Mind: An Introduction to Guillaume’s Theory. Montreal: McGill-Queen’s University Press. Hirtle, Walter H. (b). Lessons on the English Verb. Montreal: McGill-Queen’s University Press. Hjelmslev, Louis (). Prolegomena to a Theory of Language. Translated by Francis J. Whitfield. Madison: University of Wisconsin Press. Ho, Yufang (). Corpus Stylistics in Principles and Practice: A Stylistic Exploration of John Fowles’ The Magus. London: Bloomsbury Publishing. Hockett, Charles F. (). A Course in Modern Linguistics. New York: Macmillan. Hodson, Jane (). Dialect in Film and Literature. Basingstoke: Palgrave Macmillan. Hoey, Michael (). Lexical Priming: A New Theory of Words and Language. London: Routledge. Hoffmann, Thomas, and Graeme Trousdale (eds) (). The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press. Hofmeister, Philip, and Ivan A. Sag (). ‘Cognitive constraints and island effects.’ Language : –. Hogg, Richard M. (). ‘Phonology and morphology’, in Norman Blake (ed), The Cambridge History of the English Language, Vol. II –. Cambridge: Cambridge University Press, –. Hogg, Richard M., and R. D. Fulk (). A Grammar of Old English, Volume : Morphology. Oxford: Blackwell. Höhle, Barbara, Jürgen Weissenborn, Dorothea Kiefer, Antje Schulz, and Michaela Schmitz (). ‘Functional elements in infants’ speech processing: The role of determiners in the syntactic categorization of lexical elements.’ Infancy : –. Hollmann, Willem B. (). ‘Revising Talmy’s typological classification of complex events’, in Hans C. Boas (ed), Contrastive Construction Grammar. Amsterdam/Philadelphia: John Benjamins, –. Hollmann, Willem B. (). ‘Word classes: Towards a more comprehensive usage-based account.’ Studies in Language, : –. [Reprinted in Nikolas Gisborne and Willem B. Hollmann (eds), Theory and Data in Cognitive Linguistics. Amsterdam/Philadelphia: John Benjamins, –.] Hollmann, Willem B. (). ‘Nouns and verbs in Cognitive Grammar: Where is the “sound” evidence?’ Cognitive Linguistics : –.
Hollmann, Willem B. (). ‘What do adjectives sound like?’ Papers from the th National Conference of the Japanese Cognitive Linguistics Association : –. Hommerberg, Charlotte, and Gunnel Tottie (). ‘Try to or try and? Verb complementation in British and American English.’ ICAME Journal : –. Hopper, Paul J. (). ‘Aspect and foregrounding in discourse’, in Talmy Givón (ed), Discourse and Syntax. New York: Academic Press, –. Hopper, Paul J. (). ‘Emergent grammar.’ Berkeley Linguistics Society : –. Hopper, Paul (). ‘Emergent grammar and temporality in interactional linguistics’, in Peter Auer and Stephan Pfänder (eds), Constructions: Emerging and Emergent. Berlin/New York: Mouton De Gruyter, –. Hopper, Paul J. (). ‘Emergent grammar’, in James Gee and Michael Handford (eds), The Routledge Handbook of Discourse Analysis. New York: Routledge, –. Hopper, Paul J., and Sandra Thompson (). ‘The discourse basis for lexical categories in universal grammar.’ Language : –. Hopper, Paul J., and Elisabeth C. Traugott (). Grammaticalization, nd edn. Cambridge: Cambridge University Press. Horn, Laurence R. (). On the Semantic Properties of Logical Operators in English. Ph.D. thesis. Los Angeles, CA: University of California, Los Angeles. Horn, Laurence R. (). A Natural History of Negation. Chicago: University of Chicago Press. Horn, Laurence R., and Gregory Ward (eds) (). The Handbook of Pragmatics. Oxford: Blackwell. Hornstein, Norbert (). As Time Goes By: Tense in Universal Grammar. Cambridge, MA: MIT Press. Hoye, Leo Francis (). ‘ “You may think that; I couldn’t possibly comment!” Modality studies: Contemporary research and future directions’, Part II. Journal of Pragmatics : –. Huang, C.-T. James (). Logical Relations in Chinese and the Theory of Grammar. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Huang, C.-T. James (). ‘Reconstruction and the nature of vP: Some theoretical consequences.’ Linguistic Inquiry : –. Huang, Yan (). The Oxford Handbook of Pragmatics. Oxford: Oxford University Press. Huddleston, Rodney (). ‘Two approaches to the analysis of tags.’ Journal of Linguistics : –. Huddleston, Rodney (). The Sentence in Written English: A Syntactic Study Based on the Analysis of Scientific Texts. Cambridge: Cambridge University Press. Huddleston, Rodney (). ‘Some theoretical issues in the description of the English verb.’ Lingua : –. Huddleston, Rodney (). Introduction to the Grammar of English. Cambridge: Cambridge University Press. Huddleston, Rodney (). Review of Randolph Quirk, Sidney Greenbaum, Geoffrey Leech and Jan Svartvik (). Language : –. Huddleston, Rodney (). ‘Further remarks on Halliday’s Functional Grammar: A reply to Matthiessen and Martin.’ Occasional Papers in Systemic Linguistics : –. Huddleston, Rodney (). ‘On Halliday’s Functional Grammar: A reply to Martin and to Martin and Matthiessen.’ Occasional Papers in Systemic Linguistics : –. Huddleston, Rodney, and Geoffrey K. Pullum (). The Cambridge Grammar of the English Language. In collaboration with Laurie Bauer, Betty Birner, Ted Briscoe, Peter Collins,
David Denison, David Lee, Anita Mittwoch, Geoffrey Nunberg, Frank Palmer, John Payne, Peter Peterson, Lesley Stirling, Gregory Ward. Cambridge: Cambridge University Press. Huddleston, Rodney, and Geoffrey K. Pullum (). ‘The classification of finite subordinate clauses’, in Gunnar Bergh, Jennifer Herriman, and Mats Mobärg (eds), An International Master of Syntax and Semantics: Papers Presented to Aimo Seppänen on the Occasion of his th Birthday (Gothenburg Studies in English, ). Göteborg, Sweden: Acta Universitatis Gothoburgensis, –. Huddleston, Rodney, and Geoffrey K. Pullum (). A Student’s Introduction to English Grammar. Cambridge: Cambridge University Press. Huddleston, Rodney, Geoffrey K. Pullum, and Peter Peterson (). ‘Relative constructions and unbounded dependencies’, in Rodney Huddleston and Geoffrey K. Pullum (eds), The Cambridge Grammar of the English Language. Cambridge: Cambridge University Press, –. Hudson, Richard A. (). Arguments for a Non-transformational Grammar. Chicago/London: The University of Chicago Press. Hudson, Richard A. (). Word Grammar. Oxford: Blackwell. Hudson, Richard A. (). ‘Zwicky on Heads.’ Journal of Linguistics : –. Hudson, Richard A. (). ‘Extraction and grammatical relations.’ Lingua : –. Hudson, Richard A. (). An English Word Grammar. Oxford: Blackwell. Hudson, Richard A. (). ‘Do we have heads in our minds?’, in Greville G. Corbett, Norman M. Fraser, and Scott McGlashan (eds), Heads in Grammatical Theory. Cambridge: Cambridge University Press, –. Hudson, Richard A. (a). ‘Case-agreement, PRO and structure sharing.’ Research in Language : –. Hudson, Richard A. (b). ‘Word grammar’, in Vilmos Ágel, Ludwig M. Eichinger, Hans-Werner Eroms, Peter Hellwig, Hans Jürgen Heringer, and Henning Lobin (eds), Dependenz und Valenz: Ein internationales Handbuch der zeitgenössischen Forschung. Dependency and Valency: An International Handbook of Contemporary Research. . Halbband/Volume . Berlin/New York: Mouton de Gruyter, –. Hudson, Richard A. (). Language Networks: The New Word Grammar. Oxford: Oxford University Press. Hudson, Richard A. (). ‘Word grammar and construction grammar’, in Graeme Trousdale and Nikolas Gisborne (eds), Constructional Approaches to English Grammar. Berlin/New York: Mouton de Gruyter, –. Hudson, Richard A. (a). An Introduction to Word Grammar. Cambridge: Cambridge University Press. Hudson, Richard A. (b). ‘The canon.’ The Times Higher Education Supplement ( July): . Hughes, Arthur, Peter Trudgill, and Dominic Watt (). English Accents and Dialects: An Introduction to Social and Regional Varieties of English in the British Isles. London: Hodder Arnold. Hughes, Rebecca (). English in Speech and Writing. London: Routledge. Humphries, Colin, Jeffrey R. Binder, David A. Medler, and Einat Liebenthal (). ‘Syntactic and semantic modulation of neural activity during auditory sentence comprehension.’ Journal of Cognitive Neuroscience : –.
Humphries, Colin, Tracy Love, David Swinney, and Gregory Hickok (). ‘Response of anterior temporal cortex to syntactic and prosodic manipulations during sentence processing.’ Human Brain Mapping : –. Hundt, Marianne (a). ‘It is important that this study (should) be based on the analysis of parallel corpora: On the use of the mandative subjunctive in four varieties of English’, in Hans Lindquist, Staffan Klintborg, Magnus Levin, and Maria Estling (eds), The Major Varieties of English. Växjö: Växjö University Press, –. Hundt, Marianne (b). New Zealand English Grammar: Fact or Fiction? Amsterdam/ Philadelphia: John Benjamins. Hundt, Marianne (). ‘Animacy, agentivity, and the spread of the Progressive in modern English.’ English Language and Linguistics : –. Hundt, Marianne (). English Mediopassive Constructions: A Cognitive, Corpus-Based Study of Their Origin, Spread and Current Status. Amsterdam: Rodopi. Hundt, Marianne (). ‘Text corpora’, in Anke Lüdeling and Merja Kytö (eds), Corpus Linguistics [HSK .]. Berlin/New York: Mouton de Gruyter, –. Hundt, Marianne (). ‘Colonial lag, colonial innovation, or simply language change?’, in Günter Rohdenburg and Julia Schlüter (eds), One Language, Two Grammars: Differences between British and American English. Cambridge: Cambridge University Press, –. Hundt, Marianne (). ‘The demise of the being to V construction.’ Transactions of the Philological Society : –. Hundt, Marianne (). ‘Error, feature, (incipient) change – or something else altogether? On the role of low-frequency deviant patterns for the description of Englishes’, in Elena Seoane and Cristina Suárez-Gómez (eds), World Englishes: New Theoretical and Methodological Considerations. Amsterdam/Philadelphia: John Benjamins, –. Hundt, Marianne, and Anne-Christine Gardner (). ‘Corpus-based approaches: Watching English change’, in Laurel J. Brinton (ed), English Historical Linguistics: Approaches and Perspectives. Cambridge: Cambridge University Press, –. Hundt, Marianne, and Geoffrey Leech (). ‘Small is beautiful: On the value of standard reference corpora for observing recent grammatical change’, in Terttu Nevalainen and Elizabeth Closs Traugott (eds), The Oxford Handbook of the History of English. Oxford: Oxford University Press, –. Hundt, Marianne, and Christian Mair (). ‘“Agile” and “Uptight” genres: The corpusbased approach to language change in progress.’ International Journal of Corpus Linguistics : –. Hundt, Marianne, Sandra Mollin, and Simone E. Pfenninger (eds) (). The Changing English Language: Psycholinguistic Perspectives. Cambridge: Cambridge University Press. Hunston, Susan, and Gill Francis (). Pattern Grammar: A Corpus-Driven Approach to the Lexical Grammar of English. Amsterdam/Philadelphia: John Benjamins. Hunter, John (). ‘A grammatical essay in the nature, import, and effect of certain conjunctions.’ Transactions of the Royal Society of Edinburgh : –. Huntley, Martin (). ‘The semantics of English imperatives.’ Linguistics and Philosophy (): –. Isel, Frédéric, Anja Hahne, Burkhard Maess, and Angela D. Friederici (). ‘Neurodynamics of sentence interpretation: ERP evidence from French.’ Biological Psychology : –. Iverson, Gregory K., and Sang-Cheol Ahn (). ‘English voicing in dimensional theory.’ Language Sciences : –.
Jackendoff, Ray S. (). Semantic Interpretation in Generative Grammar. Cambridge, MA: MIT Press. Jackendoff, Ray S. (). ‘The base rules for prepositional phrases’, in Stephen R. Anderson and Paul Kiparsky (eds), A Festschrift for Morris Halle. New York: Holt, Rinehart and Winston, –. Jackendoff, Ray S. (). X-Bar Syntax: A Study of Phrase Structure. Cambridge, MA: MIT Press. Jackendoff, Ray S. (). Semantic Structures. Cambridge, MA: MIT Press. Jackendoff, Ray S. (). Foundations of Language. Oxford: Oxford University Press. Jackendoff, Ray S. (). Language, Consciousness, Culture: Essays on Mental Structure. Cambridge, MA: MIT Press. Jackendoff, Ray S. (). ‘Compounding in the parallel architecture and conceptual semantics’, in Rochelle Lieber and Pavol Štekauer (eds), The Oxford Handbook of Compounding. Oxford: Oxford University Press, –. Jackendoff, Ray S. (). Meaning and the Lexicon: The Parallel Architecture –. Oxford: Oxford University Press. Jacobs, Joachim (). ‘Integration’, in Marga Reis (ed), Wortstellung und Informationsstruktur. Tübingen: Niemeyer, –. Jacobs, Joachim (). Kontra Valenz. Trier: Wissenschaftlicher Verlag Trier. Jacobs, Joachim (). ‘Valenzbindung oder Konstruktionsbindung? Eine Grundfrage der Grammatiktheorie.’ ZGL: –. Jacobson, Pauline (). ‘Towards a variable-free semantics.’ Linguistics and Philosophy (): –. Jacobson, Pauline (). ‘The (Dis)organization of the Grammar: Years.’ Linguistics and Philosophy (–): –. Jacobson, Pauline (). Compositional Semantics: An Introduction to the Syntax/Semantics Interface. Oxford: Oxford University Press. James, Francis (). Semantics of the English Subjunctive. Vancouver: University of British Columbia Press. Jansen, Wouter (). ‘Phonological ‘voicing’, phonetic voicing, and assimilation in English.’ Language Sciences : –. Janssen, Theo M. V. (). ‘Compositionality’, in van Benthem and ter Meulen (), –. Jarrett, Gene Andrew (). Companion to African American Literature. Hoboken, NJ: Wiley-Blackwell. Jary, Mark, and Mikhail Kissine (). Imperatives. Cambridge: Cambridge University Press. Jaszczolt, Kasia M. (). Meaning in Linguistic Interaction: Semantics, Metasemantics, Philosophy of Language. Oxford: Oxford University Press. Jeffries, Lesley (). The Language of Twentieth–Century Poetry. Basingstoke: Palgrave Macmillan. Jeffries, Lesley (). ‘Don’t throw out the baby with the bathwater: In defence of theoretical eclecticism in stylistics.’ Poetics and Linguistics Association Occasional Papers , –. Jeffries, Lesley (). ‘Poetry: Stylistic aspects’, in Keith Brown (ed), Encyclopedia of Language and Linguistics, nd edn. Waltham, MA: Elsevier, –. Jeffries, Lesley (a). Critical Stylistics. Basingstoke: Palgrave Macmillan. Jeffries, Lesley (b). ‘“The Unprofessionals”: Syntactic iconicity and reader interpretation in contemporary poems’, in Dan McIntyre and Beatrix Busse (eds), Language and Style. Basingstoke: Palgrave Macmillan, –.
Jeffries, Lesley (). ‘Textual meaning and its place in a theory of language.’ Topics in Linguistics (): –. Jeffries, Lesley, and Dan McIntyre (). Stylistics. Cambridge: Cambridge University Press. Jespersen, Otto (). Growth and Structure of the English Language. Leipzig: Teubner. Reprinted with Foreword by Randolph Quirk by University of Chicago Press, . Jespersen, Otto (–). A Modern English Grammar on Historical Principles. Part I: Sounds and Spellings; Part II: Syntax Vol. ; Part III, Syntax Vol. ; Part IV: Syntax Vol. . Heidelberg: Winter. Part V: Syntax Vol. ; Part VI, Morphology (with Paul Christophersen, Nils Haislund, and Knud Schibsbye); Part VII: Syntax (with Nils Haislund). Copenhagen: Munksgaard. Republished in by George Allen and Unwin. Jespersen, Otto (). Negation in English and Other Languages, nd edn. Copenhagen: Munksgaard. (Reprinted .) Jespersen, Otto (/). The Philosophy of Grammar. London: George Allen and Unwin. Jespersen, Otto (). A Linguist’s Life: An English Translation of Otto Jespersen’s Autobiography with Notes, Photos and a Bibliography. Edited by Arne Juul, Hans F. Nielsen, and Jørgen Erik Nielsen and translated by David Stoner. Denmark: Odense University. Reprinted . Ježek, Elisabetta (). The Lexicon. Oxford: Oxford University Press. Johannessen, Janne Bondi (). Coordination. Oxford: Oxford University Press. Johnson, David E., and Paul M. Postal (). Arc Pair Grammar. Princeton, NJ: Princeton University Press. Johnson, Keith (). ‘Massive reduction in conversational American English’, in Kiyoko Yoneyama and Kikuo Maekawa (eds), Spontaneous Speech: Data and Analysis. Proceedings of the st Session of the th International Symposium. Tokyo: The National Institute for Japanese Language, –. Johnson, Kyle (). ‘What VP ellipsis can do, and what it can’t but not why’, in Mark Baltin and Chris Collins (eds), The Handbook of Contemporary Syntactic Theory. Malden, MA: Blackwell, –. Jonson, Ben (). The English Grammar. London: Bishop. Jurafsky, Daniel (). An On-line Computational Model of Human Sentence Interpretation: A Theory of the Representation and Use of Linguistic Knowledge. Ph.D. thesis. Berkeley, CA: University of California at Berkeley. Jusczyk, Peter W., Anne Cutler, and Nancy J. Redanz (). ‘Infants’ preference for the predominant stress patterns of English words.’ Child Development : –. Just, Marcel A., Patricia A. Carpenter, and Jacqueline D. Woolley (). ‘Paradigms and processes in reading comprehension.’ Journal of Experimental Psychology: General : –. Just, Marcel A., Patricia A. Carpenter, Timothy A. Keller, William F. Eddy, and Keith R. Thulborn (). ‘Brain activation modulated by sentence comprehension.’ Science : . Kaan, Edith (). ‘Investigating the effects of distance and number interference in processing subject-verb dependencies: An ERP study.’ Journal of Psycholinguistic Research : –. Kaan, Edith, and Tamara Y. Swaab (). ‘Electrophysiological evidence for serial sentence processing: A comparison between non-preferred and ungrammatical continuations.’ Cognitive Brain Research : –.
Kaan, Edith, Anthony Harris, Edward Gibson, and Phillip Holcomb (). ‘The P as an index of syntactic integration difficulty.’ Language and Cognitive Processes : –. Kachru, Braj B. (). ‘Standards, codification and sociolinguistic realism: The English language in the outer circle’, in Randolph Quirk and Henry Widdowson (eds), English in the World: Teaching and Learning the Language and Literatures. Cambridge: Cambridge University Press and the British Council, –. Kachru, Braj B. (). ‘The English language in the outer circle’, in Kingsley Bolton and Braj B. Kachru (eds), World Englishes, Volume . London: Routledge. Kahan, Jeffrey (). ‘“I tell you what mine author says”: A Brief History of Stylometrics.’ ELH: English Literary History (): –. Kahane, Sylvain (). ‘The Meaning-Text Theory’, in Vilmos Ágel, Ludwig M. Eichinger, Hans-Werner Eroms, Peter Hellwig, Hans Jürgen Heringer, and Henning Lobin (eds), Dependenz und Valenz: Ein internationales Handbuch der zeitgenössischen Forschung. Dependency and Valency: An International Handbook of Contemporary Research. . Halbband/Volume . Berlin/New York: Mouton de Gruyter, –. Kahane, Sylvain (). ‘Why to choose dependency rather than constituency for syntax: A formal point of view’, in Juri D. Apresjan, Marie-Claude L’Homme, Leonid Iomdin, Jasmina Milicevic, Alain Polguère, and Leo Wanner (eds), Meanings, Texts, and Other Exciting Things: A Festschrift to Commemorate the th Anniversary of Professor Igor Alexandrovic Mel´cuk, –. Kahane, Sylvain, and Timothy Osborne (). ‘Translator’s Introduction’, in Lucien Tesnière (ed), Elements of Structural Syntax. Translated by Timothy Osborne and Sylvain Kahane. Amsterdam/Philadelphia: John Benjamins, xxix–lxxiii. Kaisse, Ellen M. (). Connected Speech. New York: Academic Press. Kaleta, Agnieszka (). ‘The binding hierarchy and infinitival complementation in English and in Polish: A contrastive study’, in Grzegorz Drożdż (ed), Studies in Lexicogrammar: Theory and Applications. Amsterdam/Philadelphia: John Benjamins, –. Kallel, Amel (). The Loss of Negative Concord in Standard English: A Case of Lexical Reanalysis. Newcastle upon Tyne: Cambridge Scholars. Kaltenböck, Gunther (). ‘Using non-extraposition in spoken and written texts: A functional perspective’, in Karin Aijmer and Anna-Brita Stenström (eds), Discourse Patterns in Spoken and Written Corpora. Amsterdam/Philadelphia: John Benjamins, –. Kaltenböck, Gunther (). ‘It-extraposition in English: A functional view.’ International Journal of Corpus Linguistics (): –. Kaltenböck, Gunther (). ‘Explaining diverging evidence: The case of clause-initial I think’, in Doris Schönefeld (ed), Converging Evidence: Methodological and Theoretical Issues for Linguistic Research. Amsterdam/Philadelphia: John Benjamins, –. Kaltenböck, Gunther, Bernd Heine, and Tania Kuteva (). ‘On thetical grammar.’ Studies in Language (): –. Kamp, Hans (). ‘A theory of truth and semantic representation’, in Jeroen Groenendijk, Theo M. B. Janssen, and Martin Stokhof (eds), Formal Methods in the Study of Language. Amsterdam: Mathematical Centre Tracts, –. Kamp, Hans, and Uwe Reyle (). From Discourse to Logic: Introduction to Modeltheoretic Semantics of Natural Language, Formal Logic and Discourse Representation Theory. Dordrecht: Kluwer. Kaplan, David (). ‘Demonstratives’, in Joseph Almog, John Perry, and Harvey Wettstein (eds), Themes from Kaplan. Oxford: Oxford University Press, –.
Kaplan, Ronald M., and Joan Bresnan (). ‘Lexical-functional grammar: A formal system for grammatical representation’, in Joan Bresnan (ed), The Mental Representation of Grammatical Relations. Cambridge, MA: MIT Press, –. Karlsson, Fred (). Finnish Grammar, nd edn. Helsinki: Werner Söderström Osakeyhtiö. Karttunen, Lauri (). ‘Presuppositions of compound sentences.’ Linguistic Inquiry (): –. Karttunen, Lauri (). ‘Presupposition and linguistic context.’ Theoretical Linguistics (): –. Karttunen, Lauri, and Stanley Peters (). ‘Conventional implicature’, in Choon-Kyu Oh and David A. Dineen (eds), Presupposition, vol. of Syntax and Semantics, –. New York: Academic Press. Kasher, Asa (ed) (). Pragmatics, vol. I–VI. New York: Routledge. Kastovsky, Dieter (). ‘Hans Marchand and the Marchandeans’, in Pavol Štekauer and Rochelle Lieber (eds), Handbook of Word-Formation. New York, NY: Springer, –. Katamba, Francis, and John Stonham (). Morphology, nd edn. Basingstoke: Palgrave Macmillan. Katz, Jonah, and Elizabeth Selkirk (). ‘Contrastive focus vs. discourse-new: Evidence from phonetic prominence in English.’ Language : –. Kay, Paul (). ‘The limits of (construction) grammar’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Kay, Paul, and Charles J. Fillmore (). ‘Grammatical constructions and linguistic generalization: The What’s X doing Y? construction.’ Language (): –. Kaye, Jonathan (). Phonology: A Cognitive View. Hillsdale, NJ: Lawrence Erlbaum. Kaye, Jonathan (). ‘Derivations and interfaces’, in Jacques Durand and Francis Katamba (eds), Frontiers of Phonology: Atoms, Structures, Derivations. Harlow: Longman. Kayne, Richard S. (). ‘Connectedness.’ Linguistic Inquiry (): –. Kayne, Richard S. (). Connectedness and Binary Branching. Dordrecht: Foris Publications. Kayne, Richard S. (). The Antisymmetry of Syntax. Cambridge, MA: MIT Press. Kazanina, Nina, Ellen Lau, Moti Lieberman, Masaya Yoshida, and Colin Phillips (). ‘The effect of syntactic constraints on the processing of backwards anaphora.’ Journal of Memory and Language : –. Kearns, Kate (). Semantics. Basingstoke: Palgrave Macmillan. Kearns, Kate (). Semantics, nd edn. Basingstoke: Palgrave Macmillan. Keenan, Edward, and Bernard Comrie (). ‘Noun phrase accessibility and universal grammar.’ Linguistic Inquiry : –. Keenan-Ochs, Elinor, and Bambi Schieffelin (). ‘Topic as a discourse notion’, in Charles N. Li (ed), Subject and Topic. New York: Academic Press, –. Kehoe, Andrew, and Matt Gee (). ‘Social tagging: A new perspective on textual “aboutness”’, in Paul Rayson, Sebastian Hoffmann, and Geoffrey Leech (eds), Methodological and Historical Dimensions of Corpus Linguistics. (Studies in Variation, Contacts and Change in English .) Helsinki: Research Unit for Variation, Contacts, and Change in English. http://www.helsinki.fi/varieng/journal/volumes//kehoe_gee/, last accessed April . Keizer, Evelien (). ‘Postnominal PP complements and modifiers: A cognitive distinction.’ English Language and Linguistics (): –.
Keizer, Evelien (). The English Noun Phrase: The Nature of Linguistic Categorization. Cambridge: Cambridge University Press. Keizer, Evelien (). ‘English proforms: An alternative account.’ English Language and Linguistics (): –. Keizer, Evelien (). A Functional Discourse Grammar for English. Oxford: Oxford University Press. Keizer, Evelien (). ‘We teachers, you fools: Pro + N(P) constructions in Functional Discourse Grammar.’ Language Sciences : –. Keller, Frank (). Gradience in Grammar: Experimental and Computational Aspects of Degrees of Grammaticality. Ph.D. thesis. Edinburgh: University of Edinburgh. Kelly, Michael H. (). ‘Using sound to solve syntactic problems: The role of phonology in grammatical category assignments.’ Psychological Review : –. Kelly, Michael H. (). ‘The role of phonology in grammatical category assignment’, in James L. Morgan and Katherine Demuth (eds), From Signal to Syntax. Hillsdale: Erlbaum, –. Kelly, Michael H., and J. Kathryn Bock (). ‘Stress in time.’ Journal of Experimental Psychology: Human Perception and Performance : . Kempson, Ruth M. (). Presupposition and the Delimitation of Semantics. Cambridge: Cambridge University Press. Kempson, Ruth M. (). ‘Syntax as the dynamics of language understanding’, in Keith Allan (ed), The Routledge Handbook of Linguistics. London: Routledge, –. Kempson, Ruth M., Wilfried Meyer-Viol, and Dov Gabbay (). Dynamic Syntax: The Flow of Language Understanding. Oxford: Blackwell. Kenesei, István (). ‘Semiwords and affixoids: The territory between word and affix.’ Acta Linguistica Hungarica : –. Kennedy, Christopher (). Projecting the Adjective: The Syntax and Semantics of Gradability and Comparison. New York: Garland. Kennedy, Christopher (). ‘Vagueness and grammar: The semantics of relative and absolute gradable adjectives.’ Linguistics and Philosophy (): –. Kennedy, Christopher, and Louise McNally (). ‘Scale structure, degree modification, and the semantics of gradable predicates.’ Language (): –. Kenstowicz, Michael (). Phonology in Generative Grammar. Oxford: Wiley-Blackwell. Kho, Kuan H., Peter Indefrey, Peter Hagoort, C. W. M. van Veelen, Peter C. van Rijen, and Nick F. Ramsey (). ‘Unimpaired sentence comprehension after anterior temporal cortex resection.’ Neuropsychologia : –. Kilgarriff, Adam, Vít Baisa, Jan Bušta, Miloš Jakubíček, Vojtěch Kovář, Jan Michelfeit, Pavel Rychlý, and Vít Suchomel (). The Sketch Engine: Ten Years On. Brighton: Lexical Computing. Available at: www.sketchengine.co.uk/wp-content/uploads/The_Sketch_Engine_.pdf. Kim, Albert, and Lee Osterhout (). ‘The independence of combinatory semantic processing: Evidence from event-related potentials.’ Journal of Memory and Language : –. Kim, Jong-Bok, and Ivan A. Sag (). ‘Negation without head-movement.’ Natural Language and Linguistic Theory (): –. Kim, Jong-Bok, and Peter Sells (). ‘English binominal NPs: A construction-based perspective.’ Journal of Linguistics (): –.
Kiparsky, Paul (a). ‘Word-formation and the Lexicon’, in Fred Ingemann (ed), Proceedings of the Mid-America Linguistics Conference. Lawrence: University of Kansas, –. Kiparsky, Paul (b). ‘From cyclic phonology to lexical phonology’, in Harry van der Hulst and Norval Smith (eds), The Structure of Phonological Representations. Dordrecht: Foris Publications. Kiparsky, Paul (). ‘Some consequences of lexical phonology.’ Phonology : –. Kiparsky, Paul, and Carol Kiparsky (). ‘Fact’, in Manfred Bierwisch and Karl Erich Heidolph (eds), Progress in Linguistics. The Hague: Mouton, –. Kiss, Katalin É. (). ‘Two subject positions in English.’ The Linguistic Review : –. Kissine, Mikhail (). ‘Sentences, utterances and speech acts’, in Keith Allan and Kasia M. Jaszczolt (eds), The Cambridge Handbook of Pragmatics. Cambridge: Cambridge University Press, –. Kitagawa, Yoshihisa (). Subjects in Japanese and English. Ph.D. thesis. Amherst, MA: University of Massachusetts at Amherst. Klavans, Judith L. (). ‘Some problems in a theory of clitics.’ Bloomington, IN: Indiana University Linguistics Club. Klein, Dan, and Christopher D. Manning (). ‘Corpus-based induction of syntactic structure: Models of dependency and constituency.’ Proceedings of the nd Annual Meeting of the Association for Computational Linguistics (ACL ), –. http://nlp. stanford.edu/~manning/papers/factored-induction-camera.pdf. Klein, Ewan (). ‘A semantics for positive and comparative adjectives.’ Linguistics and Philosophy (): –. Klein, Ewan (). ‘The interpretation of adjectival comparatives.’ Journal of Linguistics (): –. Klein, Ewan (). ‘Comparatives’, in Arnim von Stechow and Dieter Wunderlich (eds), Semantics: An International Handbook of Contemporary Research. Berlin/New York: Mouton de Gruyter, –. Klein, Ewan, and Ivan A. Sag (). ‘Type-driven translation.’ Linguistics and Philosophy (): –. Klima, Edward S. (). Studies in Diachronic Syntax. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Klotz, Michael (). ‘Foundations of dependency and valency theory’, in Tibor Kiss and Artemis Alexiadou (eds), Syntax – Theory and Analysis. An International Handbook. Volume . Berlin/Munich, Boston: Mouton de Gruyter, –. Klotz, Michael () ‘Explaining explain: Some remarks on verb complementation, argument structure and the history of two English verbs.’ English Studies (): –. Klotz, Michael, and Thomas Herbst (). English Dictionaries: A Linguistic Introduction. Berlin: Schmidt. Kluender, Robert, and Marta Kutas (a). ‘Bridging the gap: Evidence from ERPs on the processing of unbounded dependencies.’ Journal of Cognitive Neuroscience : –. Kluender, Robert, and Marta Kutas (b). ‘Subjacency as a processing phenomenon.’ Language and Cognitive Processes : –. Koch, Peter, and Wulf Oesterreicher (). ‘Sprache der Nähe–Sprache der Distanz. Mündlichkeit und Schriftlichkeit im Spannungsfeld von Sprachtheorie und Sprachgeschichte.’ Romanistisches Jahrbuch : –. Kokkonidis, Miltiadis (). ‘First-order glue.’ Journal of Logic, Language and Information (): –. König, Ekkehard (). The Meaning of Focus Particles: A Comparative Perspective. London: Routledge.
König, Ekkehard, and Peter Siemund (). ‘Intensifiers and reflexives: A typological perspective’, in Zygmunt Frajzyngier and Traci Curl (eds), Reflexives: Forms and Functions. Amsterdam/Philadelphia: John Benjamins, –. König, Ekkehard, and Peter Siemund (). ‘Speech act distinctions in grammar,’ in Timothy Shopen (ed), Language Typology and Syntactic Description, nd edn, Vol. : Clause Structure. Cambridge: Cambridge University Press, –. König, Ekkehard, and Peter Siemund (). ‘Satztyp und Typologie’, in Jörg Meibauer, Markus Steinbach, and Hans Altmann (eds), Satztypen des Deutschen. Berlin/New York: Mouton de Gruyter, –. König, Esther, Wolfgang Lezius, and Holger Voormann (). TIGERSearch User’s Manual. IMS, University of Stuttgart, Stuttgart. Available at: www.ims.uni-stuttgart.de/forschung/ ressourcen/werkzeuge/TIGERSearch/manual.html. Koontz-Garboden, Andrew (). ‘Verbal derivation’, in Rochelle Lieber and Pavol Štekauer (eds), The Oxford Handbook of Derivational Morphology. Oxford: Oxford University Press, –. Koopman, Hilda, and Dominique Sportiche (). ‘The position of subjects.’ Lingua : –. Koopman, Hilda, Dominique Sportiche, and Edward Stabler (). An Introduction to Syntactic Analysis and Theory. Oxford: Blackwell. Korta, Kepa, and John Perry (). ‘Pragmatics’, in Edward N. Zalta (ed), The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, CSLI, Stanford University, Winter edn. Kortmann, Bernd (). ‘New prospects for the study of English dialect syntax: Impetus from syntactic theory and language typology’, in Sjef Barbiers, Leonie Cornips, and Susanne van der Kleij (eds), Syntactic Microvariation. Amsterdam: Meertens Institute, –. Kortmann, Bernd (). ‘Synopsis: Morphological and syntactic variation in the British Isles’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Kortmann, Bernd (). ‘Syntactic variation in English: A global perspective’, in Bas Aarts and April McMahon (eds), The Handbook of English Linguistics. Oxford: Blackwell, –. Kortmann, Bernd (). ‘How powerful is geography as an explanatory factor of variation? Areal features in the Anglophone world’, in Peter Auer, Martin Hilpert, Anja Stukenbrock, and Benedikt Szmrecsanyi (eds), Space in Language and Linguistics: Geographical, Interactional, and Cognitive Perspectives. Berlin/New York: Mouton de Gruyter, –. Kortmann, Bernd, and Kerstin Lunkenheimer (eds) (a). The Mouton World Atlas of Variation in English. Berlin/Boston: Mouton de Gruyter. Kortmann, Bernd, and Kerstin Lunkenheimer (b). ‘Introduction’, in Bernd Kortmann and Kerstin Lunkenheimer (eds), The Mouton World Atlas of Variation in English. Berlin/ Boston: Mouton de Gruyter, –. Kortmann, Bernd, and Kerstin Lunkenheimer (eds) (). The Electronic World Atlas of Varieties of English .. [eWAVE .]. Leipzig: Max Planck Institute for Evolutionary Anthropology. http://ewave-atlas.org. Kortmann, Bernd, and Verena Schröter (). ‘Varieties of English’, in Raymond Hickey (ed), The Cambridge Handbook of Areal Linguistics. Cambridge: Cambridge University Press, –. Kortmann, Bernd, and Benedikt Szmrecsanyi (). ‘Global synopsis: Morphological and syntactic variation in English’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –.
Kortmann, Bernd, and Benedikt Szmrecsanyi (). ‘Parameters of morphosyntactic variation in world Englishes: Prospects and limitations of searching for universals’, in Peter Siemund (ed), Linguistic Universals and Language Variation. Berlin/New York: Mouton de Gruyter, –. Kortmann, Bernd, and Benedikt Szmrecsanyi (eds) (). Linguistic Complexity: Second Language Acquisition, Indigenization, Contact. Berlin/New York: Mouton de Gruyter. Kortmann, Bernd, and Christoph Wolk (). ‘Morphosyntactic variation in the Anglophone world: A global perspective’, in Bernd Kortmann and Kerstin Lunkenheimer (eds), The Mouton World Atlas of Variation in English. Berlin/Boston: Mouton de Gruyter, –. Kortmann, Bernd, Kate Burridge, Rajend Mesthrie, Edgar W. Schneider, and Clive Upton (eds) (). A Handbook of Varieties of English. Volume : Morphology and Syntax. Berlin/New York: Mouton de Gruyter. Koster, Jan (). ‘Why subject sentences don’t exist’, in S. Jay Keyser (ed), Recent Transformational Studies in European Languages. Cambridge, MA: MIT Press, –. Kratzer, Angelika (). ‘Severing the external argument from its verb’, in Johan Rooryck and Laurie Zaring (eds), Phrase Structure and the Lexicon. Dordrecht: Kluwer, –. Kratzer, Angelika, and Elizabeth Selkirk (). ‘Default phrase stress, prosodic phrasing and the spellout edge: The case of verbs.’ The Linguistic Review : –. Kreidler, Charles W. (). ‘English word stress: A theory of word-stress patterns in English by Ivan Poldauf and W. R. Lee [Review].’ Language : –. Kress, Gunther, and Robert Hodge (). Language as Ideology. London: Routledge. Kreyer, Rolf (). Inversion in Modern Written English: Syntactic Complexity, Information Status and the Creative Writer. Tübingen: Narr. Krifka, Manfred (). ‘Quantifying into question acts’, in Tanya Matthews and Devon Strolovitch (eds), Proceedings of Semantics and Linguistic Theory . Ithaca, NY: CLC Publications. Krifka, Manfred (). ‘Quantifying into question acts.’ Natural Language Semantics (): –. Krifka, Manfred (). ‘Basic notions of information structure.’ Acta Linguistica Hungarica (–): –. Krifka, Manfred (). ‘Questions’, in Klaus von Heusinger, Claudia Maienborn, and Paul Portner (eds), Semantics . Handbook of Linguistics and Communication Science .. Berlin/New York: Mouton de Gruyter, –. Krug, Manfred G. (). Emerging English Modals: A Corpus-based Study of Grammaticalization. Berlin/New York: Mouton de Gruyter. Kruisinga, Etsko (–). A Handbook of Present-Day English. Groningen: Noordhoff. Kruisinga, Etsko (). A Handbook of Present-Day English, Part II. English Accidence and Syntax, vol. , th edn. Groningen: Noordhoff. Kučera, Henry, and W. Nelson Francis (). Computational Analysis of Present-Day American English. Providence, RI: Brown University Press. Kuhn, Thomas (). ‘The historical structure of scientific discovery,’ in Thomas Kuhn, The Essential Tension: Selected Studies in Scientific Tradition and Change. Chicago: University of Chicago Press, –. Kuno, Susumu (). ‘Functional sentence perspective: A case study from Japanese and English.’ Linguistic Inquiry : –. Kunter, Gero (). Compound Stress in English. Berlin/New York: Mouton de Gruyter. Kuperberg, Gina R. (). ‘Neural mechanisms of language comprehension: Challenges to syntax.’ Brain Research : –.
Kuperberg, Gina R., Tatiana Sitnikova, David Caplan, and Phillip J. Holcomb (). ‘Electrophysiological distinctions in processing conceptual relationships within simple sentences.’ Cognitive Brain Research : –. Kutas, Marta, and Kara D. Federmeier (). ‘Electrophysiology reveals semantic memory use in language comprehension.’ Trends in Cognitive Sciences : –. Kutas, Marta, and Steven A. Hillyard (). ‘Event-related brain potentials to semantically inappropriate and surprisingly large words.’ Biological Psychology : –. Kutas, Marta, Cyma Van Petten, and Robert Kluender (). ‘Psycholinguistics electrified II: –’, in Matthew J. Traxler and Morton Ann Gernsbacher (eds), Handbook of Psycholinguistics, nd edn. Amsterdam: Elsevier, –. Kuteva, Tania (). Auxiliation: An Enquiry into the Nature of Grammaticalization, nd edn. Oxford: Oxford University Press. Labov, William (). The Social Stratification of English in New York City. Washington, D.C.: Centre for Applied Linguistics. Labov, William (a). ‘Negative attraction and negative concord in English grammar.’ Language : –. Labov, William (b). Language in the Inner City: Studies in Black English Vernacular. Philadelphia: University of Pennsylvania Press. Ladd, D. Robert (). The Structure of Intonational Meaning: Evidence from English. Bloomington: Indiana University Press. Ladd, D. Robert (). Intonational Phonology. Cambridge: Cambridge University Press. Ladd, D. Robert (). Simultaneous Structure in Phonology. Oxford: Oxford University Press. Ladd, D. Robert, and Rachel Morton (). ‘The perception of intonational emphasis: Continuous or categorical?’ Journal of Phonetics : –. Lakatos, Imre (). Criticism and the Growth of Knowledge. New York: Cambridge University Press. Lakoff, George (). On the Nature of Syntactic Irregularity. Ph.D. thesis. Bloomington, IN: Indiana University Press. Published as Lakoff (a). Lakoff, George (a). Irregularity in Syntax. New York: Holt, Rinehart and Winston. Lakoff, George (b). ‘Linguistics and natural logic.’ Synthese (–): –. Lakoff, George (). ‘Pragmatics in natural logic’, in Edward L. Keenan (ed), Formal Semantics of Natural Language. Cambridge: Cambridge University Press, –. Lakoff, George (). ‘Categories: An essay in cognitive linguistics’, in Linguistics in the Morning Calm. Korea: Linguistic Society of Korea, –. Lakoff, George (). Women, Fire, and Dangerous Things: What Categories Reveal About the Mind. Chicago: Chicago University Press. Lakoff, George (). ‘The contemporary theory of metaphor’, in Andrew Ortony (ed), Metaphor and Thought, nd edn. Cambridge: Cambridge University Press, –. Lakoff, George (). ‘Cognitive phonology’, in John Goldsmith (ed), The Last Phonological Rule: Reflections on Constraints and Derivations. Chicago: Chicago University Press, –. Lakoff, George, and Mark Johnson (/). Metaphors We Live By. Chicago: Chicago University Press. Lakoff, Robin (). ‘The way we were; Or; the real actual truth about generative semantics: A memoir.’ Journal of Pragmatics (): –. Lambek, Joachim (). ‘The mathematics of sentence structure.’ American Mathematical Monthly (): –.
Lambrecht, Knud (). Information Structure and Sentence Form: Topic, Focus, and the Mental Representations of Discourse Referents. Cambridge: Cambridge University Press. Lambrecht, Knud (). ‘A framework for the analysis of cleft constructions.’ Linguistics (): –. Landau, Idan (). Control in Generative Grammar: A Research Companion. Cambridge: Cambridge University Press. Langacker, Ronald W. (). ‘Space grammar, analysability, and the English passive.’ Language (): –. Langacker, Ronald W. (). Foundations of Cognitive Grammar, Vol. I. Theoretical Prerequisites. Stanford: Stanford University Press. Langacker, Ronald W. (). ‘A usage-based model’, in Brygida Rudzka-Ostyn (ed), Topics in Cognitive Linguistics. Amsterdam/Philadelphia: John Benjamins, –. Langacker, Ronald W. (). Foundations of Cognitive Grammar, Vol. II. Descriptive Application. Stanford: Stanford University Press. Langacker, Ronald W. (). ‘Constituency, dependency, and conceptual grouping.’ Cognitive Linguistics : –. Langacker, Ronald W. (). Grammar and Conceptualization. Berlin/New York: Mouton de Gruyter. Langacker, Ronald W. ( []). Concept, Image, and Symbol: The Cognitive Basis of Grammar. Berlin/New York: Mouton de Gruyter. Langacker, Ronald W. (). ‘Construction grammars: Cognitive, radical, and less so’, in Francisco J. Ruiz de Mendoza Ibáñez and M. Sandra Peña Cervel (eds), Cognitive Linguistics: Internal Dynamics and Interdisciplinary Interaction. Berlin/New York: Mouton de Gruyter, –. Langacker, Ronald W. (a). Cognitive Grammar: A Basic Introduction. Oxford: Oxford University Press. Langacker, Ronald W. (b). ‘Sequential and summary scanning: A reply.’ Cognitive Linguistics : –. Langacker, Ronald W. (). Investigations in Cognitive Grammar. Berlin/New York: Mouton de Gruyter. Langacker, Ronald W. (). ‘Modals: striving for control’, in Juana I. Marín-Arrese, Marta Carretero, Jorge Arús Hita, and Johan van der Auwera (eds), English Modality: Core, Periphery and Evidentiality. Berlin/New York: Mouton de Gruyter [TIELS ], –. Langacker, Ronald W. (). ‘How to build an English clause.’ Journal of Foreign Language Teaching and Applied Linguistics (): –. Langacker, Ronald W. (). ‘Trees, chains, assemblies, and windows’. Keynote paper presented at the th International Conference on Construction Grammar at the Federal University of Juiz de Fora, – October . Langendoen, D. Terence, and Harris Savin (). ‘The projection problem for presuppositions’, in Charles J. Fillmore and D. Terence Langendoen (eds), Studies in Linguistic Semantics. New York: Holt, Rinehart and Winston, –. Lapointe, Steven (). A Theory of Grammatical Agreement. Ph.D. thesis. Amherst, MA: University of Massachusetts. Lappin, Shalom, and Chris Fox (eds) (). The Handbook of Contemporary Semantic Theory, nd edn. Oxford: Wiley-Blackwell. Larkin, Philip (). High Windows. London: Faber and Faber. Larson, Richard (). ‘On the double object construction.’ Linguistic Inquiry : –. Lasnik, Howard (). ‘Clause-mate conditions’. Korean Linguistics Today and Tomorrow: Proceedings of the Association for Korean Linguistics, –.
Lasnik, Howard, and Terje Lohndal (). ‘Government–binding/principles and parameters theory.’ Wiley Interdisciplinary Reviews: Cognitive Science –: –. Lasnik, Howard, and Terje Lohndal (). ‘Brief overview of the history of generative syntax’, in Marcel den Dikken (ed), The Cambridge Handbook of Generative Syntax. Cambridge: Cambridge University Press, –. Lasnik, Howard, and Mamoru Saito (). ‘On the nature of proper government.’ Linguistic Inquiry : –. Lass, Roger (). ‘Phonology and morphology’, in Norman Blake (ed), The Cambridge History of the English Language, Vol. II –. Cambridge: Cambridge University Press, –. Lass, Roger (). ‘Phonology and morphology’, in Roger Lass (ed), The Cambridge History of the English Language, Vol. III –. Cambridge: Cambridge University Press, –. Lass, Roger (). ‘Phonology and morphology’, in Richard Hogg and David Denison (eds), A History of the English Language. Cambridge: Cambridge University Press, –. Lau, Ellen F., Clare Stroud, Silke Plesch, and Colin Phillips (). ‘The role of structural prediction in rapid syntactic analysis.’ Brain and Language : –. Lee, Seung-Ah (). ‘Ing forms and the progressive puzzle: A construction-based approach to English progressives.’ Journal of Linguistics : –. Leech, Geoffrey N. (). Towards a Semantic Description of English. London: Longman. Leech, Geoffrey N. (). Principles of Pragmatics. London: Longman. Leech, Geoffrey N. (a). ‘Grammars of spoken English: New outcomes of corpus-oriented research.’ Language Learning (): –. Leech, Geoffrey N. (b). ‘Same grammar or different grammar? Contrasting approaches to the grammar of spoken English discourse’, in Srikant Sarangi and Malcolm Coulthard (eds), Discourse and Social Life. London: Longman, –. Leech, Geoffrey N. (). ‘Modality on the move: The English modal auxiliaries –’, in Roberta Fachinetti, Manfred Krug, and Frank Palmer (eds), Modality in Contemporary English. Berlin/New York: Mouton de Gruyter, –. Leech, Geoffrey N. (). Meaning and the English Verb, rd edn. London: Longman. Leech, Geoffrey N. (). Language in Literature: Style and Foregrounding. London: Pearson Education. Leech, Geoffrey N. (). ‘Where have all the modals gone? On the recent loss of frequency of English modal auxiliaries.’ Guest lecture, English Department, Zürich, April . Leech, Geoffrey N., and Roger Garside (). ‘Running a grammar factory: The production of syntactically annotated corpora or “treebanks”, in Stig Johansson and Anna-Britte Stenström (eds), English Computer Corpora: Selected Papers and Research Guide. Berlin/ New York: Mouton de Gruyter, –. Leech, Geoffrey N., and Mick Short (). Style in Fiction. London: Longman. Leech, Geoffrey N., and Jan Svartvik (). A Communicative Grammar of English, rd edn. Abingdon/New York: Routledge. Leech, Geoffrey N., Marianne Hundt, Christian Mair, and Nicholas Smith (). Change in Contemporary English: A Grammatical Study. Cambridge: Cambridge University Press. Lees, Robert B. (). The Grammar of English Nominalizations. Bloomington: Indiana University Press. Lehmann, Christian (). ‘Synchronic variation and diachronic change.’ Lingua e Stile (): –.
Lehmann, Christian (). ‘Towards a typology of clause linkage’, in John Haiman and Sandra A. Thompson (eds), Clause Combining in Discourse and Grammar (Typological Studies in Language, ). Amsterdam/Philadelphia: John Benjamins, –. Leino, Jaakko (). ‘Information structure’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Leitner, Gerhard (). ‘English grammaticology.’ International Review of Applied Linguistics in Language Teaching : –. Lerner, Gene H. (). ‘Assisted storytelling: Deploying shared knowledge as a practical matter.’ Qualitative Sociology (): –. Lev, Iddo (). Packed Computation of Exact Meaning Representations. Ph.D. thesis. Stanford, CA: Stanford University. Levi, Judith (). The Syntax and Semantics of Complex Nominals. New York: Academic Press. Levin, Beth (). English Verb Classes and Alternations. Chicago: University of Chicago Press. Levin, Beth, and Malka Rappaport Hovav (). ‘Non-event -er nominals: A probe into argument structure.’ Linguistics, : –. Levin, Beth, and Malka Rappaport Hovav (). ‘Wiping the slate clean: A lexical semantic exploration.’ Cognition : –. Levin, Beth, and Malka Rappaport Hovav (). Unaccusativity: At the Syntax-Lexical Semantics Interface. Cambridge, MA: MIT Press. Levin, Beth, and Malka Rappaport Hovav (). ‘Morphology and the lexicon’, in Andrew Spencer and Arnold M. Zwicky (eds), Handbook of Morphology. Oxford: Blackwell, –. Levine, Robert D. (). ‘Auxiliaries: To’s company.’ Journal of Linguistics : –. Levine, Robert D. (). Syntactic Analysis: An HPSG-based Approach. Cambridge: Cambridge University Press. Levinson, Stephen C. (). ‘Pragmatics and social deixis: Reclaiming the notion of conventional implicature’, in Christine Chiarello, John Kingston, Eve E. Sweetser, James Collins, Haruko Kawasaki, John Manley-Buser, Dorothy W. Marshak, Catherine O’Connor, David Shaul, Marta Tobey, Henry Thompson and Katherine Turner (eds), Proceedings of the th Annual Meeting of the Berkeley Linguistics Society. Berkeley: Berkeley Linguistics Society, –. Levinson, Stephen C. (). Pragmatics. Cambridge: Cambridge University Press. Levinson, Stephen C. (). Presumptive Meanings. Cambridge, MA: MIT Press. Levinson, Stephen C. (). ‘Speech acts’, in Yan Huang (ed), The Oxford Handbook of Pragmatics. Oxford: Oxford University Press, –. Levy, Yonata (). ‘It’s frogs all the way down.’ Cognition : –. Lewis, David (). ‘General semantics.’ Synthese (–): –. Lewis, Mark (). An Essay to Facilitate the Education of Youth, by Bringing Down the Rudiments of Grammar to the Sense of Seeing. London: Parkhurst. Lewis, Shevaun, and Colin Phillips (). ‘Aligning grammatical theories and language processing models.’ Journal of Psycholinguistic Research : –. Li, Juan (). ‘Collision of language in news discourse: A functional–cognitive perspective on transitivity.’ Critical Discourse Studies (): –. Liberman, Mark, and Janet Pierrehumbert (). ‘Intonational variance under changes in pitch range and length’, in Mark Aronoff and Richard T. Oehrle (eds), Language Sound Structure: Studies in Phonology Presented to Morris Halle by his teacher and students. Cambridge, MA: MIT Press.
Liberman, Mark, and Alan Prince (). ‘On stress and linguistic rhythm.’ Linguistic Inquiry : –. Lieber, Rochelle (). Morphology and Lexical Semantics. Cambridge: Cambridge University Press. Lieber, Rochelle (). Introducing Morphology. Cambridge: Cambridge University Press. Lieber, Rochelle (). ‘Semantics of derivational morphology’, in Claudia Maienborn, Klaus von Heusinger, and Paul Portner (eds), Semantics: An International Handbook of Natural Language Meaning, vol. . Berlin/New York: Mouton de Gruyter, –. Lieber, Rochelle (). ‘Theoretical approaches to derivation’, in Rochelle Lieber and Pavol Štekauer (eds), The Oxford Handbook of Derivational Morphology. Oxford: Oxford University Press, –. Lieber, Rochelle (). ‘The semantics of transposition.’ Morphology : –. Lieber, Rochelle (). English Nouns: The Ecology of Nominalization. Cambridge: Cambridge University Press. Lieber, Rochelle, and Sergio Scalise (). ‘The Lexical Integrity Hypothesis in a new theoretical universe’, in Geert Booij, Luca Ducceschi, Bernard Fradin, Emiliano Guevara, Angela Ralli and Sergio Scalise (eds), On-line Proceedings of the Fifth Mediterranean Morphology Meeting (MMM) Fréjus – September . Bologna: Università degli Studi di Bologna, –. Lieber, Rochelle, and Pavol Štekauer (eds) (). The Oxford Handbook of Derivational Morphology. Oxford: Oxford University Press. Lieven, Elena (). ‘Building language competence in first language acquisition.’ European Review (): –. Lieven, Elena (). ‘Usage-based approaches to language development: Where do we go from here?’ Language and Cognition (): –. Lieven, Elena (). ‘Developing language from usage: Explaining errors’, in Marianne Hundt, Sandra Mollin, and Simone E. Pfenninger (eds), The Changing English Language: Psycholinguistic Perspectives. Cambridge: Cambridge University Press. Lieven, Elena, and Michael Tomasello (). ‘Children’s first language acquisition from a usage-based perspective’, in Peter Robinson and Nick C. Ellis (eds), The Handbook of Cognitive Linguistics and Second Language Acquisition. New York: Routledge, –. Lieven, Elena, Julian M. Pine, and Gillian Baldwin (). ‘Lexically-based learning and early grammatical development.’ Journal of Child Language (): –. Lightfoot, David (). Principles of Diachronic Syntax. Cambridge: Cambridge University Press. Lightfoot, David (). The Development of Language: Acquisition, Change, and Evolution. Malden, MA: Blackwell. Lindström, Jan, Yael Maschler, and Simona Pekarek Doehler (). ‘A cross-linguistic perspective on grammar and negative epistemics in talk-in-interaction.’ Journal of Pragmatics : –. Linell, Per (). Approaching Dialogue: Talk, Interaction and Contexts in Dialogical Perspectives. Amsterdam/Philadelphia: John Benjamins. Linell, Per (). The Written Language Bias in Linguistics: Its Nature, Origins and Transformations. London: Routledge. Linn, Andrew (). ‘English grammar writing’, in Bas Aarts and April McMahon (eds), The Handbook of English Linguistics. Oxford: Blackwell, –.
Littlemore, Jeannette, and John R. Taylor (eds) (). The Bloomsbury Companion to Cognitive Linguistics. London: Bloomsbury Publishing. Lobeck, Anne (). ‘Ellipsis in DP’, in Martin Everaert and Henk van Riemsdijk (eds), The Blackwell Companion to Syntax, vol. . Oxford: Blackwell, –. Löbel, Elisabeth (). ‘Q as a functional category’, in Christa Bhatt, Elisabeth Löbel, and Claudia Schmidt (eds), Syntactic Phrase Structure Phenomena. Amsterdam/Philadelphia: John Benjamins, –. Löbner, Sebastian (). ‘Definites.’ Journal of Semantics : –. Lohndal, Terje (). Phrase Structure and Argument Structure: A Case Study of the Syntax Semantics Interface. Oxford: Oxford University Press. Longacre, Robert E. (). ‘Some fundamental insights of tagmemics.’ Language : –. Longobardi, Giuseppe (). ‘Reference and proper names: A theory of N-movement in syntax and logical form.’ Linguistic Inquiry : –. López-Couso, María José, and Belén Méndez-Naya (). ‘On the use of the subjunctive and modals in Old and Middle English dependent commands and requests: Evidence from the Helsinki Corpus.’ Neuphilologische Mitteilungen : –. Los, Bettelou (). ‘Generative approaches to English historical linguistics’, in Alexander Bergs and Laurel J. Brinton (eds), English Historical Linguistics. An International Handbook, vol. . Berlin/New York: Mouton de Gruyter, –. Los, Bettelou (). A Historical Syntax of English. Edinburgh: Edinburgh University Press. Lounsbury, Floyd (). Oneida Verb Morphology. Yale: Yale University Press. Lowe, John J. (a). ‘Complex predicates: An LFG+Glue analysis.’ Journal of Language Modelling (): –. Lowe, John J. (b). Participles in Rigvedic Sanskrit: The Syntax and Semantics of Adjectival Verb Forms. Oxford: Oxford University Press. Lowth, Robert (). A Short Introduction to English Grammar. London: A. Millar and R. and J. Dodsley. Lowth, Robert (). A Short Introduction to English Grammar: With Critical Notes (nd edn., corrected). London: A. Millar and R. and J. Dodsley. Reprinted , with Introduction by David A. Reibel. London: Routledge/Thoemmes. (Original work published .) Luck, Steven (). An Introduction to the Event-Related Potential Technique. Cambridge, MA: MIT Press. Lunkenheimer, Kerstin (a). ‘Tense and aspect’, in Raymond Hickey (ed), Areal Features of the Anglophone World. Berlin/Boston: Mouton de Gruyter, –. Lunkenheimer, Kerstin (b). ‘Typological profile: L varieties’, in Bernd Kortmann and Kerstin Lunkenheimer (eds), The Mouton World Atlas of Variation in English. Berlin/Boston: Mouton de Gruyter, –. Lyons, John (). Introduction to Theoretical Linguistics. Cambridge: Cambridge University Press. Lyons, John (). Semantics, vols. Cambridge: Cambridge University Press. Lyons, John (). Linguistic Semantics: An Introduction. Cambridge: Cambridge University Press. Mackenzie, J. Lachlan (). ‘The study of semantic alternations in a dialogic Functional Discourse Grammar’, in Pilar Guerrero Medina (ed), Morphosyntactic Alternations in English: Functional and Cognitive Perspectives. London: Equinox, –.
Mackenzie, J. Lachlan (). ‘Functional linguistics’, in Keith Allan (ed), The Routledge Handbook of Linguistics. London/New York: Routledge, –. Macnamara, John (). Names for Things: A Study of Human Learning. Cambridge, MA: MIT Press. MacWhinney, Brian (). ‘The acquisition of morphophonology.’ Monographs of the Society for Research in Child Development, –. MacWhinney, Brian (). ‘Emergentist approaches to language’, in Joan L. Bybee and Paul Hopper (eds), Frequency and the Emergence of Linguistic Structure. Amsterdam/Philadelphia: John Benjamins, –. MacWhinney, Brian, Andrej Malchukov, and Edith Moravcsik (eds) (). Competing Motivations in Grammar and Usage. Oxford: Oxford University Press. Maddieson, Ian (). ‘Consonant-vowel ratio’, in Matthew Dryer and Martin Haspelmath (eds), The World Atlas of Language Structures Online. Munich: Max Planck Digital Library. Available online at http://wals.info/chapter/ (last accessed April ). Madlener, Karin (). Frequency Effects in Instructed Second Language Acquisition. Berlin/Boston: Mouton de Gruyter. Mahlberg, Michaela (). Corpus Stylistics and Dickens’s Fiction. New York/London: Routledge. Mahlberg, Michaela, and Catherine Smith (). ‘Dickens, the suspended quotation and the corpus.’ Language and Literature (): –. Mahlberg, Michaela, Peter Stockwell, Johan de Joode, Catherine Smith, and Matthew Brook O’Donnell (). ‘CLiC Dickens: Novel uses of concordances for the integration of corpus stylistics and cognitive poetics.’ Corpora (): –. Maiden, Martin (). ‘Some lessons from history: Morphomes in diachrony’, in Ana R. Luís and Ricardo Bermúdez-Otero (eds), The Morphome Debate: Diagnosing and Analysing Morphomic Patterns. Oxford: Oxford University Press, –. Maienborn, Claudia, Klaus von Heusinger, and Paul Portner (eds) (a). Semantics: An International Handbook of Natural Language Meaning, vol. . Berlin/New York: Mouton de Gruyter. Maienborn, Claudia, Klaus von Heusinger, and Paul Portner (eds) (b). Semantics: An International Handbook of Natural Language Meaning, vol. . Berlin/New York: Mouton de Gruyter. Maienborn, Claudia, Klaus von Heusinger, and Paul Portner (eds) (). Semantics: An International Handbook of Natural Language Meaning, vol. . Berlin/New York: Mouton de Gruyter. Mair, Christian (). Infinitival Complement Clauses in English. Cambridge: Cambridge University Press. Mair, Christian (). ‘Gerundial complements after begin and start: Grammatical and sociolinguistic factors, and how they work against each other’, in Günther Rohdenburg and Britta Mondorf (eds), Determinants of Grammatical Variation in English (Topics in English Linguistics, ). Berlin/New York: Mouton de Gruyter, –. Mair, Christian (a). ‘Do we got a difference? Divergent developments of semi-auxiliary (have) got (to) in British and American English’, in Marianne Hundt (ed), Late Modern English Syntax. Cambridge: Cambridge University Press, –. Mair, Christian (b). ‘Globalisation and the transnational impact of non-standard varieties’, in Eugene Green and Charles F. Meyer (eds), The Variability of Current World Englishes. Berlin/Boston: Mouton de Gruyter, –. Mair, Christian, and Geoffrey Leech (). ‘Current changes in English Syntax’, in Bas Aarts and April McMahon (eds), The Handbook of English Linguistics. Oxford: Blackwell, –.
Malchukov, Andrej L. (). ‘Incompatible categories: Resolving the “present perfective paradox”’, in Lotte Hogeweg, Helen de Hoop, and Andrej L. Malchukov (eds), Crosslinguistic Semantics of Tense, Aspect, and Modality (Linguistik Aktuell, ). Amsterdam/ Philadelphia: John Benjamins, –. Maling, Joan (). ‘Transitive adjectives: A case of categorial reanalysis’, in Frank Heny and Barry Richards (eds), Linguistic Categories: Auxiliaries and Related Puzzles, vol. . Dordrecht: Reidel, –. Mann, William C., and Sandra A. Thompson (). ‘Rhetorical Structure Theory: Towards a functional theory of text organization.’ Text (): –. Marantz, Alec (). ‘The minimalist program’, in Gert Webelhuth (ed), Government and Binding Theory and Minimalist Program. Oxford: Blackwell, –. Marantz, Alec (). ‘No escape from syntax: Don’t try morphological analysis in the privacy of your own lexicon.’ University of Pennsylvania Working Papers in Linguistics: vol. (), article : –. Marantz, Alec (). ‘Linguistics as cognitive science: Back to our roots.’ th Annual Joshua and Verona Whatmough Lecture in Linguistics, Harvard University, April . [Available on YouTube]. Marchand, Hans (). The Categories and Types of Present-Day English Word-Formation: A Synchronic Diachronic Approach, st edn. Wiesbaden, Germany: Harrassowitz. Marchand, Hans (). The Categories and Types of Present-Day English Word-Formation: A Synchronic Diachronic Approach, nd edn. Munich: C. H. Beck’sche Verlagsbuchhandlung. Marcus, Mitchell, Mary Ann Marcinkiewicz, and Beatrice Santorini (). ‘Building a large annotated corpus of English: The Penn Treebank.’ Computational Linguistics (): –. Martí, Luisa (). ‘Unarticulated constituents revisited.’ Linguistics and Philosophy (): –. Martin, J. R. (). ‘Meaning matters: A short history of systemic functional linguistics.’ WORD (): –. Martin, J. R. (). ‘Systemic functional linguistics’, in Ken Hyland and Brian Paltridge (eds). Continuum Companion to Discourse Analysis. London: Continuum, –. Martínez Insúa, Ana E. (). Existential There-constructions in Contemporary British English. München: Lincom. Mastop, R. J. (). What Can You Do?: Imperative Mood in Semantic Theory. Amsterdam: Institute for Logic and Computation. Matthews, George H. (). Hidatsa Syntax. London: Mouton. Matthews. P. H. (). Inflectional Morphology. Cambridge: Cambridge University Press. Matthews, P. H. (). Syntax. Cambridge: Cambridge University Press. Matthews, P. H. (). Morphology, nd edn. Cambridge: Cambridge University Press. Matthews, P. H. (). Grammatical Theory in the United States from Bloomfield to Chomsky. Cambridge: Cambridge University Press. Mattys, Sven L., Laurence White, and James F. Melhorn (). ‘Integration of multiple segmentation cues: A hierarchical framework.’ Journal of Experimental Psychology: General : –. Mazoyer, Bernard M., Nathalie Tzourio, Victor Frak, André Syrota, Noriko Murayama, Olivier Levrier, Georges Salamon, Stanislas Dehaene, Laurent Cohen, and Jacques Mehler (). ‘The cortical representation of speech.’ Journal of Cognitive Neuroscience : –. McArthur, Tom (). ‘Standard English’, in Tom McArthur (ed), The Oxford Companion to the English Language. Oxford/New York: Oxford University Press, –. McCawley, James D. (a). ‘Lexical insertion in a transformational grammar without deep structure’, in Bill J. Darden, Charles J. N. Bailey, and Alice Davison (eds), Papers from the
Fourth Regional Meeting of the Chicago Linguistic Society. Chicago, IL: Chicago Linguistic Society, –. McCawley, James D. (b). ‘The role of semantics in a grammar’, in Emmon Bach and Robert T. Harms (eds), Universals in Linguistic Theory. New York: Holt, Rinehart and Winston, –. McCawley, James D. (). The Syntactic Phenomena of English. Vol. I. Chicago/London: University of Chicago Press. McCloskey, Jim (). ‘Subjecthood and subject positions’, in Liliane Haegeman (ed), Elements of Grammar. Dordrecht: Kluwer, –. McCoard, Robert W. (). The English Perfect: Tense Choice and Pragmatic Inferences. Amsterdam: North-Holland. McCormick, Kay (). ‘Cape Flats English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. McEnery, Tony, and Andrew Hardie (). Corpus Linguistics: Method, Theory and Practice. Cambridge: Cambridge University Press. McGilvray, James (). Chomsky: Language, Mind, Politics, nd edn. Cambridge: Polity Press. McGinnis-Archibald, Martha (). ‘Distributed morphology’, in Andrew Hippisley and Gregory Stump (eds), The Cambridge Handbook of Morphology. Cambridge: Cambridge University Press, –. McGregor, William B. (). Semiotic Grammar. Oxford: Clarendon Press. McIntyre, Dan (). Point of View in Plays: A Cognitive Stylistic Approach to Viewpoint in Drama and Other Text-types. Amsterdam/Philadelphia: John Benjamins. McKinnon, Richard, and Lee Osterhout (). ‘Constraints on movement phenomena in sentence processing: Evidence from event-related brain potentials.’ Language and Cognitive Processes : –. McMahon, April (). Change, Chance, and Optimality. Oxford: Oxford University Press. McWhorter, John (). ‘The world’s simplest grammars are Creole grammars.’ Linguistic Typology : –. Melchers, Gunnel (). ‘English spoken in Orkney and Shetland: Morphology, syntax and lexicon’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Mel’čuk, Igor (). ‘Levels of dependency in linguistic description: Concepts and problems’, in Vilmos Ágel, Ludwig M. Eichinger, Hans-Werner Eroms, Peter Hellwig, Hans Jürgen Heringer, and Henning Lobin (eds), Dependenz und Valenz: Ein internationales Handbuch der zeitgenössischen Forschung/Dependency and Valency: An International Handbook of Contemporary Research. . Halbband/Volume . Berlin/New York: Mouton de Gruyter, –. Mel’čuk, Igor (). ‘Dependency in natural language’, in Alain Polguère and Igor Mel’čuk (eds), Dependency in Linguistic Description. Amsterdam/Philadelphia: John Benjamins, –. Melloni, Chiara (). Event and Result Nominals: A Morpho-semantic Approach. Bern: Peter Lang Verlag. Merchant, Jason (). The Syntax of Silence: Sluicing, Islands, and Identity in Ellipsis. Ph.D. thesis. Santa Cruz, CA: University of California at Santa Cruz. Merchant, Jason (). The Syntax of Silence. Oxford: Oxford University Press. Merchant, Jason (). ‘Voice and ellipsis.’ Linguistic Inquiry : –. Miall, David, and Don Kuiken (). ‘Foregrounding, defamiliarization, and affect: Response to literary stories.’ Poetics : –.
Michael, Ian (). English Grammatical Categories and the Tradition to . Cambridge: Cambridge University Press. Michaelis, Laura, and Knud Lambrecht (). ‘The exclamative sentence type in English.’ Conceptual Structure, Discourse and Language. Stanford: CSLI Publications. Miestamo, Matti (). ‘Negation – an overview of typological research.’ Language and Linguistics Compass (): –. Miestamo, Matti (). ‘Negatives without negators’, in Jan Wohlgemuth and Michael Cysouw (eds), Rethinking Universals: How Rarities Affect Linguistic Theory (Empirical Approaches to Language Typology, ). Berlin/New York: Mouton de Gruyter, –. Millar, Neil (). ‘Modal verbs in TIME. Frequency changes –.’ International Journal of Corpus Linguistics (): –. Miller, Carolyn (). ‘Genre as social action.’ Quarterly Journal of Speech : –. Miller, D. Gary (). English Lexicogenesis. Oxford: Oxford University Press. Miller, George A. (). ‘WordNet: A lexical database for English.’ Communications of the ACM (): –. Miller, George A., and Noam Chomsky (). ‘Finitary models of language users’, in R. Duncan Luce, Robert R. Bush, and Eugene Galanter (eds), The Handbook of Mathematical Psychology, vol. , New York: Wiley, –. Miller, James Edward (). A Critical Introduction to Syntax. London: Continuum. Miller, Jim (). ‘Scottish English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar W. Schneider, and Clive Upton (eds) (), –. Miller, Jim (). ‘Spoken and written English’, in Bas Aarts and April McMahon (eds), The Handbook of English Linguistics. Oxford: Blackwell, –. Miller, Jim, and Regina Weinert (). Spontaneous Spoken Language: Syntax and Discourse. Oxford: Clarendon Press. Minkova, Donka, and Robert Stockwell (). English Words: History and Structure, nd edn. Cambridge: Cambridge University Press. Mintz, Toben H. (). ‘Frequent frames as a cue for grammatical categories in child directed speech.’ Cognition : –. Moens, Marc, and Mark Steedman (). ‘Temporal ontology and temporal reference.’ Computational Linguistics : –. Mompean, Jose A. (). Cognitive Linguistics and Phonology, in Jeannette Littlemore and John Taylor (eds), The Bloomsbury Companion to Cognitive Linguistics. London: Bloomsbury Publishing, –. Monaghan, Charles (). The Murrays of Murray Hill. Brooklyn, NY: Urban History Press. Monaghan, Padraic, Nick Chater, and Morten H. Christiansen (). ‘The differential role of phonological and distributional cues in grammatical categorisation.’ Cognition : –. Mondorf, Britta (). ‘Genre effects in the replacement of reflexives by particles’, in Heidrun Dorgeloh and Anja Wanner (eds), Syntactic Variation and Genre. Berlin/New York: Mouton de Gruyter, –. Montague, Richard (). ‘English as a formal language’, in Bruno Visentini et al. (eds), Linguaggi nella Società e nella Tecnica. Milan: Edizioni di Communità, –. Reprinted in Montague (: –). Montague, Richard (). ‘The proper treatment of quantification in ordinary English’, in Jaakko Hintikka, Julian Moravcsik, and Patrick Suppes (eds), Approaches to Language. Dordrecht: Reidel, –. Reprinted in Montague (: –). Montague, Richard (). Formal Philosophy: Selected Papers of Richard Montague. New Haven, CT: Yale University Press. Edited and with an introduction by Richmond H. Thomason.
Montalbetti, Mario (). After Binding. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Montgomery, Michael (). ‘Appalachian English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Moreton, Elliott (). ‘Non-computable functions in optimality theory’, in John J. McCarthy (ed), Optimality Theory in Phonology: A Reader. Oxford: Blackwell, –. Morgan, Jerry L. (). ‘On the treatment of presupposition in transformational grammar’, in Robert I. Binnick, Alice Davison, Georgia M. Green, and Jerry L. Morgan (eds), Papers from the Fifth Regional Meeting of the Chicago Linguistic Society. Chicago, IL: Chicago Linguistic Society, –. Morrill, Glyn V. (). Type Logical Grammar. Dordrecht: Kluwer. Morrill, Glyn V. (). Categorial Grammar: Logical Syntax, Semantics, and Processing. Oxford: Oxford University Press. Moss, Lesley (). Corpus Stylistics and Henry James’ Syntax. Ph.D. thesis. London: University College London. Mufwene, Salikoko S. (). The Ecology of Language Evolution. Cambridge: Cambridge University Press. Mufwene, Salikoko S. (). ‘Gullah: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Mukherjee, Joybrato (). ‘Corpus linguistics and English reference grammars.’ Language and Computers : –. Mulder, Jean, and Sandra A. Thompson (). ‘The grammaticalization of but as final particle in English conversation’, in Ritva Laury (ed), Crosslinguistic Studies of Clause Combining. Amsterdam/Philadelphia: John Benjamins. Mulder, Jean, Sandra A. Thompson, and Cara Perry Williams (). ‘Final but in Australian English conversation’, in Pam Peters, Peter Collins, and Adam Smith (eds), Comparative Studies in Australian and New Zealand English: Grammar and Beyond. Amsterdam/Philadelphia: John Benjamins, –. Müller, Peter O., Ingeborg Ohnheiser, Susan Olsen, and Franz Rainer (eds) (). Word-Formation: An International Handbook of the Languages of the World. volumes. Berlin/New York: Mouton de Gruyter. Müller, Stefan (). ‘HPSG – A synopsis’, in Tibor Kiss and Artemis Alexiadou (eds), Syntax – Theory and Analysis: An International Handbook, Handbücher zur Sprach- und Kommunikationswissenschaft. Berlin/New York: Mouton de Gruyter, –. Müller, Stefan (a). Grammatical Theory: From Transformational Grammar to Constraint-based Approaches. Berlin: Language Science Press. http://langsci-press.org/catalog/book/. Müller, Stefan (b). ‘Flexible phrasal constructions, constituent structure and (crosslinguistic) generalizations: A discussion of template-based phrasal LFG approaches’, in Doug Arnold, Miriam Butt, Berthold Crysmann, Tracy Holloway King, and Stefan Müller (eds), Proceedings of the Joint Conference on Head-Driven Phrase Structure Grammar and Lexical Functional Grammar. Stanford, CA: CSLI Publications. Müller, Stefan (). A Lexicalist Account of Argument Structure: Template-Based Phrasal LFG Approaches and a Lexical HPSG Alternative. Berlin: Language Science Press. Müller, Stefan, and Stephen Wechsler (). ‘Lexical approaches to argument structure.’ Theoretical Linguistics (–): –. Müller, Stefan, and Stephen Wechsler (). ‘The lexical-constructional debate’, in Wechsler. Oxford: Oxford University Press, –.
Münte, Thomas F., and Hans J. Heinze (). ‘ERP negativities during syntactic processing of written words’, in Hans J. Heinze, Thomas F. Münte, and G. R. Mangun (eds), Cognitive Electrophysiology. Boston: Birkhauser, –. Münte, Thomas F., Mike Matzke, and Sönke Johannes (). ‘Brain activity associated with syntactic incongruencies in words and pseudo-words.’ Journal of Cognitive Neuroscience : –. Murphy, M. Lynne (). Semantic Relations and the Lexicon: Antonymy, Synonymy, and Other Paradigms. Cambridge: Cambridge University Press. Murray, Lindley (). English Grammar, Adapted to the Different Classes of Learners. York: Wilson, Spencer and Mawman. Murray, Lindley (). An English Grammar: Comprehending the Principles and Rules of the Language, Illustrated by Appropriate Exercises, and a Key to those Exercises, Vols. –. England: Thomas Wilson and Sons. Murray, Lindley (). Memoirs of the Life of Lindley Murray, in a Series of Letters. York: Longman and Rees. Murray, Thomas, and Beth Lee Simon (). ‘Colloquial American English: Grammatical features’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Muysken, Pieter (). Functional Categories. Cambridge: Cambridge University Press. Nagle, Stephen J. (). ‘The English double modal conspiracy.’ Diachronica : –. Narrog, Heiko (a). ‘On defining modality again.’ Language Sciences : –. Narrog, Heiko (b). ‘Modality, mood, and change of modal meanings: A new perspective.’ Cognitive Linguistics : –. Narrog, Heiko (a). Modality, Subjectivity, and Semantic Change. Oxford: Oxford University Press. Narrog, Heiko (b). ‘Modality and speech act orientation’, in Johan van der Auwera and Jan Nuyts (eds), Grammaticalization and (Inter)subjectification. [Koninklijke Vlaamse Academie van België voor Wetenschappen en Kunsten.] Brussels: Universa, –. Narrog, Heiko, and Bernd Heine (eds) (). The Oxford Handbook of Grammaticalization. Oxford: Oxford University Press. Nathan, Geoffrey S. (). Phonology: A Cognitive Grammar Introduction. Amsterdam/ Philadelphia: John Benjamins. Neale, Stephen (). Descriptions. Cambridge, MA: MIT Press. Neale, Stephen (). ‘On being explicit: Comments on Stanley and Szabó, and on Bach.’ Mind and Language (–): –. Neidle, Carol (). ‘Lexical functional grammar’, in R. E. Asher and M. Y. Simpson (eds), The Encyclopedia of Language and Linguistics. New York: Pergamon Press: –. Nelson, Gerald, Sean Wallis, and Bas Aarts (). Exploring Natural Language: Working With the British Component of the International Corpus of English. Amsterdam/Philadelphia: John Benjamins. Nesfield, J. C. (). Outline of English Grammar. London: Macmillan. Nespor, Marina, and Irene Vogel (). Prosodic Phonology. Dordrecht: Foris Publications. Nevalainen, Terttu (). ‘Negative concord as an English “Vernacular Universal”: Social history and linguistic typology.’ English Language and Linguistics (): –. Nevalainen, Terttu (). ‘Descriptive adequacy of the S-curve model in diachronic studies of language change’, in Christina Sanchez-Stockhammer (ed), Can We Predict Linguistic
Change? (Studies in Variation, Contact and Change in English, ). Erlangen-Nürnberg: Friedrich-Alexander University. http://www.helsinki.fi/varieng/series/volumes//nevalainen/. Nevalainen, Terttu, and Elisabeth Closs Traugott (eds) (). The Oxford Handbook of the History of English. Oxford: Oxford University Press. Neville, Helen, Janet L. Nicol, Andrew Barss, Kenneth I. Forster, and Merrill F. Garrett (). ‘Syntactically based sentence processing classes: Evidence from event-related brain potentials.’ Journal of Cognitive Neuroscience : –. Nevins, Andrew, and Jeffrey Parrott (). ‘Variable rules meet impoverishment theory: Patterns of agreement leveling in English varieties.’ Lingua (): –. Newlyn, Lucy (). Ginnel. Manchester: Carcanet. Newmeyer, Frederick J. (). Linguistic Theory in America. New York: Academic Press. Newmeyer, Frederick J. (). Language Form and Language Function. Cambridge, MA: MIT Press. Newmeyer, Frederick J. (). ‘Grammar is grammar and usage is usage.’ Language : –. Newmeyer, Frederick J. (). ‘What conversational English tells us about the nature of grammar: A critique of Thompson’s analysis of object complements’, in Kasper Boye and Elizabeth Engberg-Pedersen (eds), Usage and Structure: A Festschrift for Peter Harder. Berlin/New York: Mouton de Gruyter, –. Newton, John (). School Pastime for Young Children: Or the Rudiments of Grammar. London: Walton. Nikolaeva, Irina (ed) (). Finiteness: Theoretical and Empirical Foundations. Oxford: Oxford University Press. Noonan, Michael (). ‘Non-structuralist syntax’, in Michael Darnell, Edith Moravcsik, Fritz Newmeyer, Michael Noonan, and Kathleen Wheatley (eds), Functionalism and Formalism in Linguistics. Amsterdam/Philadelphia: John Benjamins, –. Nordlinger, Rachel, and Louisa Sadler (). ‘Nominal tense in crosslinguistic perspective.’ Language : –. Nordlinger, Rachel, and Louisa Sadler (). ‘Morphology in LFG and HPSG’, in Jenny Audring and Francesca Masini (eds), The Oxford Handbook of Morphological Theory. Oxford: Oxford University Press. Nordlinger, Rachel, and Elisabeth Closs Traugott (). ‘Scope and the development of epistemic modality: Evidence from ought to.’ English Language and Linguistics : –. Nouwen, Rick (forthcoming). ‘E-type pronouns: Congressmen, sheep and paychecks’, in Daniel Gutzmann, Lisa Matthewson, Cécile Meier, Hotze Rullmann, and Thomas Ede Zimmerman (eds), The Companion to Semantics. Oxford: Wiley-Blackwell. Nouwen, Rick, Adrian Brasoveanu, Jan van Eijck, and Albert Visser (). ‘Dynamic semantics’, in Edward N. Zalta (ed), The Stanford Encyclopedia of Philosophy. Stanford: Metaphysics Research Lab, CSLI, Stanford University, Winter edn. Noyes, Alfred (). ‘The Highwayman.’ Blackwood’s Magazine vol. CLXXX: –. Edinburgh: William Blackwood and Sons. Nunberg, Geoffrey, Ivan A. Sag, and Thomas Wasow (). ‘Idioms.’ Language : –. Nuyts, Jan (). Epistemic Modality, Language, and Conceptualization: A Cognitive-Pragmatic Perspective. Amsterdam/Philadelphia: John Benjamins. Nuyts, Jan (). ‘Modality: Overview and linguistic issues’, in Frawley (ed), –. OED = Oxford English Dictionary (OED Online). –. Oxford: Oxford University Press. http://www.oed.com/.
OED (). The Oxford English Dictionary, st edn (extended). Oxford: Oxford University Press. Öhl, Peter (). ‘Acquisition based and usage based explanations of grammaticalisaton. An integrative approach’, in Sylvie Hancil and Ekkehard König (eds), Grammaticalization: Theory and Data. Amsterdam/Philadelphia: John Benjamins. Vol. of Studies in Language Companion Series. O’Keefe, Anne, and Michael McCarthy (eds) (). The Routledge Handbook of Corpus Linguistics. London/New York: Routledge. Orasan, Constantin (). ‘Patterns in scientific abstracts’, in Proceedings of Corpus Linguistics . Lancaster: Lancaster University, –. Orton, Harold, and Eugen Dieth (eds) (–). Survey of English Dialects, volumes. Leeds: E.J. Arnold and Son Ltd. Osborne, Timothy (). ‘Dependency grammar’, in Tibor Kiss and Artemis Alexiadou (eds), Syntax – Theory and Analysis. An International Handbook. Volume , Berlin/Munich, Boston: Mouton de Gruyter, –. Osborne, Timothy, and Thomas Groß (). ‘Constructions are catenae: Construction grammar meets dependency grammar.’ Cognitive Linguistics (), –. Osborne, Timothy, Michael Putnam, and Thomas Gross (). ‘Bare phrase structure, labelless structures, and specifier-less syntax: Is minimalism becoming a dependency grammar?’ The Linguistic Review : –. Osterhout, Lee, and Peter Hagoort (). ‘A superficial resemblance does not necessarily mean you are part of the family: Counterarguments to Coulson, King and Kutas () in the P/SPS-P debate.’ Language and Cognitive Processes : –. Osterhout, Lee, and Phillip J. Holcomb (). ‘Event-related brain potentials elicited by syntactic anomaly.’ Journal of Memory and Language : –. Osterhout, Lee, and Linda A. Mobley (). ‘Event-related brain potentials elicited by failure to agree.’ Journal of Memory and Language : –. Osterhout, Lee, Phillip J. Holcomb, and David Swinney (). ‘Brain potentials elicited by garden-path sentences: Evidence of the application of verb information during parsing.’ Journal of Experimental Psychology: Learning, Memory, and Cognition : –. Östman, Jan-Ola, and Jef Verschueren (eds) (). Handbook of Pragmatics Online. Amsterdam/ Philadelphia: John Benjamins. https://benjamins.com/online/hop/. Övergaard, Gerd (). The Mandative Subjunctive in American and British English in the th Century. Uppsala: Almquist and Wiksell. Palmer, Frank R. (). A Linguistic Study of the English Verb. London: Longman. Palmer, Frank R. (). Modality and the English Modals. London: Longman. Palmer, Frank R. (). Mood and Modality. Cambridge: Cambridge University Press. Palmer, Frank R. (). Modality and the English Modals, nd edn. London: Longman. Palmer, Frank R. (). Mood and Modality, nd edn. Cambridge: Cambridge University Press. Palmer, Frank R. (). ‘Modality in English: Theoretical, descriptive and typological issues’, in Roberta Facchinetti, Manfred Krug, and Frank Palmer (eds), Modality in Contemporary English. Berlin/New York: Mouton de Gruyter, –. Paltridge, Brian, and Sue Starfield (eds) (). The Handbook of English for Specific Purposes. Boston: Wiley-Blackwell. Panagiotidis, Phoevos (). Categorial Features: A Generative Theory of Word Class Categories. Cambridge: Cambridge University Press.
Papafragou, Anna (). ‘Inference and word meaning: the case of modal auxiliaries.’ Lingua : –. Partee, Barbara H. (). ‘Lexical semantics and compositionality’, in Lila R. Gleitman and Mark Liberman (eds), An Invitation to Cognitive Science: Language, vol. , nd edn. Cambridge, MA: MIT Press, –. Partee, Barbara H., Alice ter Meulen, and Robert E. Wall (). Mathematical Methods in Linguistics. Dordrecht: Kluwer. Pater, Joe (). ‘Weighted constraints in generative linguistics.’ Cognitive Science , –. Patten, Amanda (). The English It-cleft. A Constructional Account and a Diachronic Investigation. Berlin/New York: Mouton de Gruyter. Pawley, Andrew (). ‘Australian Vernacular English: Some grammatical characteristics’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Pawley, Andrew, and Frances Hodgetts Syder (). ‘Natural selection in syntax: Notes on adaptive variation and change in vernacular and literary grammar.’ Journal of Pragmatics : –. Payne, John (). ‘Genitive coordinations with personal pronouns.’ English Language and Linguistics : –. Payne, John, and Rodney Huddleston (). ‘Nouns and noun phrases’, in Rodney Huddleston and Geoffrey K. Pullum (eds), The Cambridge Grammar of the English Language. Cambridge: Cambridge University Press: –. Payne, John, Rodney Huddleston, and Geoffrey K. Pullum (). ‘Fusion of functions: The syntax of once, twice and thrice.’ Journal of Linguistics : –. Payne, John, Rodney Huddleston, and Geoffrey K. Pullum (). ‘The distribution and category status of adjectives and adverbs.’ Word Structure : –. Payne, John, Geoffrey K. Pullum, Barbara C. Scholz, and Eva Berlage (). ‘Anaphoric one and its implications.’ Language (): –. Penhallurick, Robert (). ‘Welsh English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Penrose, Ann, and Steven Katz (). Writing in the Sciences: Exploring Conventions of Scientific Discourse. New York: St. Martin’s Press. Perek, Florent (). ‘Alternation-based generalizations are stored in the mental grammar: Evidence from a sorting task experiment.’ Cognitive Linguistics (): –. Perek, Florent, and Adele E. Goldberg (). ‘Generalizing beyond the input: The functions of the constructions matter.’ Journal of Memory and Language : –. Pérez-Guerra, Javier, and Ana E. Martınez-Insua (). ‘Do some genres or text types become more complex than others?’, in Heidrun Dorgeloh and Anja Wanner (eds), Syntactic Variation and Genre. Berlin/New York: Mouton de Gruyter, –. Perlmutter, David M. (ed) (). Studies in Relational Grammar . Chicago: University of Chicago Press. Perry, John (). ‘Thought without representation.’ Supplementary Proceedings of the Aristotelian Society : –. Phillips, Colin (). ‘The real-time status of island phenomena.’ Language : –. Phillips, Colin, Nina Kazanina, and Shani H. Abada (). ‘ERP effects of the processing of syntactic long-distance dependencies.’ Cognitive Brain Research : –. Phillips, Susan (). ‘Chaucer’s language lessons’. The Chaucer Review ( and ): –.
Pickering, Martin J., and Victor S. Ferreira (). ‘Structural priming: A critical review.’ Psychological Bulletin (): –. Pierrehumbert, Janet (). ‘Phonetic diversity, statistical learning, and acquisition of phonology.’ Language and Speech : –. Pierrehumbert, Janet (). ‘The next toolkit.’ Journal of Phonetics : –. Pierrehumbert, Janet, and Julia Hirschberg (). ‘The meaning of intonational contours in the interpretation of discourse.’ Intentions in Communication –. Pierrehumbert, Janet, Mary E. Beckman, and D. R. Ladd (). ‘Conceptual foundations of phonology as a laboratory science’, in Noel Burton-Roberts, Philip Carr, and Gerard Docherty (eds), Phonological Knowledge: Conceptual and Empirical Issues. Oxford: Oxford University Press, –. Pietsch, Lukas (). Variable Grammars: Verbal Agreement in Northern Dialects of English. Tübingen: Niemeyer. Pietsch, Lukas (). ‘Hiberno-English medial-object perfects reconsidered: A case of contact-induced grammaticalisation.’ Studies in Language (): –. Pinker, Steven (). Learnability and Cognition: The Acquisition of Argument Structure. Cambridge, MA: MIT Press. Pinker, Steven (). Words and Rules. London: Weidenfeld and Nicolson. Pinker, Steven (). Language Learnability and Language Development [with new commentary by the author]. Cambridge, MA: Harvard University Press. Pintzuk, Susan (). ‘Adding linguistic information to parsed corpora’. Linguistic Issues in Language Technology, . Available at: http://csli-lilt.stanford.edu/ojs/index.php/LiLT/ article/view/ Plag, Ingo (). Word-Formation in English. Cambridge: Cambridge University Press. Plag, Ingo, and Harald Baayen (). ‘Suffix ordering and morphological processing.’ Language : –. Plag, Ingo, Gero Kunter, Sabine Lappe, and Maria Braun (). ‘The role of semantics, argument structure, and lexicalisation in compound stress assignment in English.’ Language : –. Plank, Frans (). ‘The modals story retold.’ Studies in Language : –. Pollard, Carl, and Ivan A. Sag (). Information-based Syntax and Semantics. Stanford, CA: CSLI Publications. Pollard, Carl, and Ivan A. Sag (). Head-Driven Phrase Structure Grammar. Chicago: University of Chicago Press. Popper, Karl (/). The Logic of Scientific Discovery. London: Routledge. Portner, Paul, and Barbara H. Partee (eds) (). Formal Semantics: The Essential Readings. Oxford: Blackwell. Postal, Paul M. (). Constituent Structure: A Study of Contemporary Models of Syntactic Description. The Hague: Mouton. Postal, Paul M. (). On Raising. Cambridge, MA: MIT Press. Postal, Paul M., and Geoffrey K. Pullum (). ‘Expletive noun phrases in subcategorized positions.’ Linguistic Inquiry : –. Potts, Christopher (). The Logic of Conventional Implicatures. Oxford: Oxford University Press. Potts, Christopher (a). ‘The centrality of expressive indices.’ Theoretical Linguistics (): –. Potts, Christopher (b). ‘The Expressive Dimension.’ Theoretical Linguistics (): –.
Potts, Christopher (). ‘Presupposition and implicature’, in Lappin and Fox (eds), –. Poutsma, Hendrik (–). A Grammar of Late Modern English I–V. Groningen: Noordhoff. Prince, Alan, and Paul Smolensky (). Optimality Theory: Constraint Interaction in Generative Grammar. [Revision of Prince and Smolensky ().] Oxford: Blackwell. Prince, Ellen F. (). ‘A comparison of Wh-clefts and it-clefts in discourse.’ Language : –. Prince, Ellen F. (a). ‘Toward a taxonomy of given-new information’, in Peter Cole (ed), Radical Pragmatics. New York: Academic Press, –. Prince, Ellen F. (b). ‘Topicalization, focus-movement, and Yiddish-movement: A pragmatic differentiation’, in Danny K. Alford, Karen Ann Hunold, Monica A. Macaulay, Jenny Walter, Claudia Brugman, Paul Chertok, Inese Civkulis and Marta Tobey (eds), Proceedings of the th Annual Meeting of the Berkeley Linguistics Society, –. Prince, Ellen F. (). ‘Topicalisation and left-dislocation: A functional analysis’, in Sheila J. White and Virginia Teller (eds), Discourses in Reading and Linguistics. New York: The New York Academy of Sciences, –. Prince, Ellen F. (). ‘The ZPG letter: Subjects, definiteness, and information-status’, in William C. Mann and Sandra A. Thompson (eds), Discourse Description: Diverse Analyses of a Fund-raising Text. Amsterdam/Philadelphia: John Benjamins, –. Prince, Ellen F. (). ‘On the functions of left-dislocation in English discourse’, in Akio Kamio (ed), Directions in Functional Linguistics. Amsterdam/Philadelphia: John Benjamins, –. Pullum, Geoffrey K. (). ‘Lowth’s grammar: A re-evaluation.’ Linguistics : –. Pullum, Geoffrey K. (). ‘Syncategorematicity and English infinitival to.’ Glossa : –. Pullum, Geoffrey K. (). ‘Lexical categorization in English dictionaries and traditional grammars.’ Zeitschrift für Anglistik und Amerikanistik : –. Pullum, Geoffrey K. (). ‘The truth about English grammar: Rarely pure and never simple’, in Tien-en Kao and Yao-fu Lin (eds), A New Look at Language Teaching and Testing: English as Subject and Vehicle. Taipei: Language Training and Testing Center, –. Pullum, Geoffrey K. (). ‘The central question in comparative syntactic metatheory.’ Mind and Language (): –. Pullum, Geoffrey K., and Rodney Huddleston (). ‘Prepositions and preposition phrases’, in Rodney Huddleston and Geoffrey K. Pullum (eds), The Cambridge Grammar of the English Language. Cambridge: Cambridge University Press, –. Pullum, Geoffrey K., and James Rogers (). ‘Expressive power of the syntactic theory implicit in The Cambridge Grammar.’ Presented to the Linguistics Association of Great Britain (online at http://www.lel.ed.ac.uk/~gpullum/EssexLAGB.pdf). Pullum, Geoffrey K., and Deirdre Wilson (). ‘Autonomous syntax and the analysis of auxiliaries.’ Language : –. Pulvermüller, Friedemann, Bert Cappelle, and Yury Shtyrov (). ‘Brain basis of meaning, words, constructions and grammar’, in Thomas Hoffmann and Graeme Trousdale (eds), The Oxford Handbook of Construction Grammar. Oxford: Oxford University Press, –. Pustejovsky, James (). The Generative Lexicon. Cambridge, MA: MIT Press. Putnam, Hilary (). ‘The “Corroboration” of Scientific Theories’, republished in Ian Hacking (ed) (), Scientific Revolutions, Oxford Readings in Philosophy. Oxford: Oxford University Press, –.
Quaglio, Paulo, and Douglas Biber (). ‘The grammar of conversation’, in Bas Aarts and April McMahon (eds), The Handbook of English Linguistics. Oxford: Blackwell, –. Quine, Willard van Orman (). ‘Quantifiers and propositional attitudes.’ Journal of Philosophy (): –. Quine, Willard van Orman (). Word and Object. Cambridge, MA: MIT Press. Quirk, Randolph (). ‘Towards a description of English usage.’ Transactions of the Philological Society : –. Quirk, Randolph (). ‘Randolph Quirk’, in Keith Brown and Vivian Law (eds), Linguistics in Britain: Personal Histories. Oxford: Philological Society, –. Quirk, Randolph, Sidney Greenbaum, Geoffrey Leech, and Jan Svartvik (). A Grammar of Contemporary English. London: Longman. Quirk, Randolph, Sidney Greenbaum, Geoffrey Leech, and Jan Svartvik (). A Comprehensive Grammar of the English Language. London: Longman. Radden, Günter, and René Dirven (). Cognitive English Grammar. Amsterdam/Philadelphia: John Benjamins. Radford, Andrew (). Transformational Syntax. Cambridge: Cambridge University Press. Radford, Andrew (). Transformational Grammar: A First Course. Cambridge: Cambridge University Press. Radford, Andrew (). Syntactic Theory and the Structure of English: A Minimalist Approach. Cambridge: Cambridge University Press. Rainer, Franz (). ‘Blocking’. Oxford Research Encyclopedia of Linguistics. Retrieved Jun. , from https://oxfordre.com/linguistics/view/./acrefore/. ./acrefore--e-. Ramchand, Gillian (). Verb Meaning and the Lexicon: A First Phase Syntax. Cambridge: Cambridge University Press. Ramchand, Gillian, and Charles Reiss (eds) (). The Oxford Handbook of Linguistic Interfaces. Oxford: Oxford University Press. Randall, Beth (–). CorpusSearch . Available at: http://corpussearch.sourceforge.net. Rappaport Hovav, Malka, and Beth Levin (). ‘Building verb meanings’, in Miriam Butt and Wilhelm Geuder (eds), The Projection of Arguments. Stanford: CSLI Publications, – . Rauh, Gisa (). Syntactic Categories: Their Identification and Description in Linguistic Theories. Oxford: Oxford University Press. Recanati, François (). Meaning and Force: The Pragmatics of Performative Utterances. Cambridge: Cambridge University Press. Recanati, François (). ‘Unarticulated constituents.’ Linguistics and Philosophy (): –. Recanati, François (). Literal Meaning. Cambridge: Cambridge University Press. Reichenbach, Hans (). Elements of Symbolic Logic. London: Collier-Macmillan Ltd. Reinhart, Tanya (a). ‘Definite NP anaphora and C-Command domains.’ Linguistic Inquiry : –. Reinhart, Tanya (b). ‘Pragmatics and linguistics: An analysis of sentence topics.’ Philosophica : –. Reinhart, Tanya (). Pragmatics and Linguistics: An Analysis of Sentence Topics. Bloomington, IN: Indiana University Linguistics Club. Renner, Vincent (). ‘On the semantics of English coordinate compounds.’ English Studies : –. Renner, Vincent (). ‘Lexical blending as wordplay.’ Paper presented at the International Conference on Wordplay and Metalinguistic Reflection, Tübingen, March .
Rett, Jessica (). ‘A degree account of exclamatives’, in Proceedings of SALT . Ithaca, NY: CLC Publications. Rett, Jessica (). ‘Exclamatives, degrees and speech acts.’ Linguistics and Philosophy (): –. Rialland, Annie (). ‘Question prosody: An African perspective’, in Tomas Riad and Carlos Gussenhoven (eds), Tones and Tunes: Typological Studies in Word and Sentence Prosody. Berlin/New York: Mouton de Gruyter, –. Richards, Marc (). ‘Minimalism’, in Tibor Kiss and Artemis Alexiadou (eds), Syntax – Theory and Analysis: An International Handbook, Handbücher zur Sprach- und Kommunikationswissenschaft. Berlin/New York: Mouton de Gruyter, –. Riemsdijk, Henk van (). ‘Free relatives’, in Martin Everaert and Henk van Riemsdijk (eds), The Blackwell Companion to Syntax, vol. . Oxford: Blackwell, –. Rijkhoff, Jan (). The Noun Phrase. Oxford: Oxford University Press. Ritz, Marie-Eve (). ‘Perfect tense and aspect’, in Robert Binnick (ed), The Oxford Handbook of Tense and Aspect. Oxford: Oxford University Press, –. Rizzi, Luigi (). ‘The fine structure of the left periphery’, in Liliane Haegeman (ed), Elements of Grammar. Dordrecht: Kluwer, –. Rizzi, Luigi (). ‘Notes on labeling and subject positions’, in Elisa Di Domenico, Cornelia Hamann, and Simona Matteini (eds), Structures, Strategies and Beyond: Studies in Honour of Adriana Belletti. Amsterdam/Philadelphia: John Benjamins, –. Robenalt, Clarice and Adele E. Goldberg (). ‘Nonnative speakers do not take competing alternative expressions into account the way native speakers do’. Language Learning (): –. Roberts, Ian (). ‘Agreement parameters and the development of English modal auxiliaries.’ Natural Language and Linguistic Theory : –. Roberts, Ian, and Anna Roussou (). Semantic Change: A Minimalist Approach to Grammaticalization. Cambridge: Cambridge University Press. Robins, R. H. (). General Linguistics. An Introductory Survey. London: Longmans. Rochemont, Michael (). Focus in Generative Grammar. Amsterdam/Philadelphia: John Benjamins. Rochemont, Michael S. and Peter Culicover (). English Focus Constructions and the Theory of Grammar. Cambridge: Cambridge University Press. Rogalsky, Corianne, and Gregory Hickok (). ‘Selective attention to semantic and syntactic features modulates sentence processing networks in anterior temporal cortex.’ Cerebral Cortex : –. Rohdenburg, Gunther (). ‘Cognitive complexity and horror aequi as factors determining the use of interrogative clause linkers’, in Gunther Rohdenburg and Britta Mondorf (eds), Determinants of Grammatical Variation in English. Berlin/New York: Mouton de Gruyter, –. Rooth, Mats (). Association with Focus. Ph.D. thesis. Amherst, MA: University of Massachusetts at Amherst. Rooth, Mats (). ‘A theory of focus interpretation.’ Natural Language Semantics : –. Rosch, Eleanor (). ‘Cognitive representations of semantic categories.’ Journal of Experimental Psychology: General (): –. Rosch, Eleanor, and Carolyn B. Mervis (). ‘Family resemblances: Studies in the internal structure of categories.’ Cognitive Psychology (): –. Rosenbach, Anette (). ‘Descriptive genitives in English: A case study on constructional gradience.’ English Language and Linguistics : –.
Rosenbach, Anette (). ‘English genitive variation – the state of the art.’ English Language and Linguistics (): –. Ross, John R. (). Constraints on Variables in Syntax. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Ross, John R. (). ‘Auxiliaries as main verbs’, in W. Todd (ed), Studies in Philosophical Linguistics (Series ). Evanston, IL: Great Expectations Press (online at http://www-personal.umich.edu/~jlawler/haj/AuxasMV.pdf). Ross, John R. (). ‘On declarative sentences’, in Roderick A. Jacobs and Peter S. Rosenbaum (eds), Readings in English Transformational Grammar. Waltham, MA: Ginn and Company, –. Rossi, Sonja, Manfred F. Gugler, Anja Hahne, and Angela D. Friederici (). ‘When word category information encounters morphosyntax: An ERP study.’ Neuroscience Letters : –. Round, Erich (). ‘Rhizomorphomes, meromorphomes and metamorphomes’, in Matthew Baerman, Dunstan Brown, and Greville G. Corbett (eds), Understanding and Measuring Morphological Complexity. Oxford: Oxford University Press, –. Rumens, Carol (). Hex. Hexham, Northumberland: Bloodaxe Books. Ruppenhofer, Josef, Michael Ellsworth, Miriam R. L. Petruck, Christopher R. Johnson, and Jan Scheffczyk (). FrameNet II: Extended Theory and Practice. Berkeley, CA: International Computer Science Institute. Distributed with the FrameNet data. Russell, Benjamin (). ‘Against grammatical computation of scalar implicatures.’ Journal of Semantics (): –. Russell, Benjamin (). ‘Imperatives in conditional conjunction.’ Natural Language Semantics : –. Sacks, Harvey, Emanuel A. Schegloff, and Gail Jefferson (). ‘A simplest systematics for the organization of turn-taking for conversation.’ Language : –. Sadock, Jerrold M. (). Toward a Linguistic Theory of Speech Acts. New York: Academic Press. Sadock, Jerrold M. (). The Modular Architecture of Grammar. Cambridge: Cambridge University Press. Sadock, Jerrold, and Arnold M. Zwicky (). ‘Speech act distinctions in syntax’, in Tim Shopen (ed), Language Typology and Linguistic Description, vol. . Cambridge: Cambridge University Press, –. Sag, Ivan A. (). Deletion and Logical Form. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Sag, Ivan A. (). ‘English relative clauses.’ Journal of Linguistics : –. Sag, Ivan A. (). ‘Sign-Based Construction Grammar: An informal synopsis’, in Hans Boas and Ivan A. Sag (eds), Sign-Based Construction Grammar. Stanford: CSLI Publications, –. Sag, Ivan A., Thomas Wasow, and Emily M. Bender (). Syntactic Theory, nd edn. Stanford: CSLI Publications. Sag, Ivan A., Gerald Gazdar, Thomas Wasow, and Steven Weisler (). ‘Coordination and how to distinguish categories.’ Natural Language and Linguistic Theory : . Salkie, Raphael (). ‘Will: Tense or modal or both?’ English Language and Linguistics : –. Sanders, T., and H. Pander Maat (). ‘Cohesion and coherence: Linguistic approaches’, in Keith Brown (ed), Encyclopedia of Language and Linguistics, nd edn. Amsterdam: Elsevier (supplied as online service by ScienceDirect), –. Sanders, T., and J. Sanders (). ‘Text and text analysis’, in Keith Brown (ed), Encyclopedia of Language and Linguistics, nd edn. Amsterdam: Elsevier (supplied as online service by ScienceDirect), –.
Sankoff, Gillian, and Hélène Blondeau (). ‘Language change across the lifespan: /r/ in Montreal French.’ Language, (): –. Sansom, Peter (). Point of Sale. Manchester: Carcanet Press. Santi, Andrea, and Yosef Grodzinsky (). ‘Working memory and syntax interact in Broca’s area.’ Neuroimage : –. Sapir, Edward (). Language. New York: Harcourt/Brace. Sasse, Hans-Jürgen (). ‘Syntactic categories and subcategories’, in Joachim Jacobs, Arnim von Stechow, Wolfgang Sternefeld, and Theo Vennemann (eds), Syntax. Ein internationales Handbuch zeitgenössischer Forschung/An International Handbook of Contemporary Research (Handbücher zur Sprach- und Kommunikationswissenschaft/ Handbooks of Linguistics and Communication, ). Berlin/New York: Mouton de Gruyter, –. Sauerland, Uli (). ‘Embedded implicatures and experimental constraints: A reply to Geurts & Pouscoulous and Chemla.’ Semantics and Pragmatics (): –. Scalise, Sergio, and Emiliano Guevara (). ‘The lexicalist approach to word formation and the notion of the lexicon’, in Pavol Stekauer and Rochelle Lieber (eds), Handbook of Wordformation. Dordrecht: Springer, –. Scheer, Tobias (). ‘Why the prosodic hierarchy is a diacritic and why the interface must be direct’, in Jutta M. Hartmann, Veronika Hegedüs, and Henk Van Riemsdijk (eds), Sounds of Silence: Empty Elements in Syntax and Phonology. Amsterdam: Elsevier, –. Scheer, Tobias (). ‘Issues in the development of generative phonology’, in Nancy C. Kula, Bert Botma, and Kuniya Nasuukawa (eds), The Bloomsbury Companion to Phonology. London: Bloomsbury Publishing, –. Scheer, Tobias (). A Guide to Morphosyntax-Phonology Interface Theories: How Extraphonological Information is Treated in Phonology since Trubetzkoy’s Grenzsignale. Berlin/ New York: Mouton de Gruyter. Schilk, Marco, and Marc Hammel (). ‘The progressive in South Asian and Southeast Asian varieties of English: Mapping areal homogeneity and heterogeneity’, in Lieven Vandelanotte, Kristin Davidse, Caroline Gentens, and Ditte Kimps (eds), Recent Advances in Corpus Linguistics: Developing and Exploiting Corpora. Amsterdam: Rodopi, –. Schleppegrell, Mary J. (). ‘Systemic functional linguistics’, in James Paul Gee and Michael Handford (eds) (). The Routledge Handbook of Discourse Analysis. London/New York: Routledge, –. Schlobinski, Peter (). ‘*knuddel –zurueckknuddel – dichganzdollknuddel*. Inflektive und Inflektivkonstruktionen im Deutschen.’ Zeitschrift für germanistische Linguistik : –. Schmid, Hans-Jörg (). ‘Entrenchment, salience, and basic levels’, in Dirk Geeraerts and Hubert Cuyckens (eds), The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, –. Schmid, Hans-Jörg (). ‘Is usage more than usage after all? The case of English not that.’ Linguistics : –. Schneider, Agnes (). ‘Typological profile: Pidgins and Creoles’, in Bernd Kortmann and Kerstin Lunkenheimer (eds), The Mouton World Atlas of Variation in English. Berlin/ Boston: Mouton de Gruyter, –. Schneider, Edgar W. (). ‘Synopsis: Morphological and syntactic variation in the Americas and the Caribbean’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –.
Schneider, Edgar W. (). Postcolonial English: Varieties around the World. Cambridge: Cambridge University Press. Schneider, Edgar W. (). English Around the World: An Introduction. Cambridge: Cambridge University Press. Schönefeld, Doris (). Where Lexicon and Syntax Meet. Berlin/New York: Mouton de Gruyter. Schönefeld, Doris (). ‘Constructions.’ Constructions SV-/ (http://journals.linguisticsociety.org/elanguage/constructions/article/download//---PB.pdf). Schönefeld, Doris (). ‘Introduction: On evidence and the convergence of evidence in linguistic research’, in Doris Schönefeld (ed), Converging Evidence: Methodological and Theoretical Issues for Linguistic Research. Amsterdam/Philadelphia: John Benjamins, –. Schreier, Daniel, and Marianne Hundt (eds) (). English as a Contact Language. Cambridge: Cambridge University Press. Schröter, Verena, and Bernd Kortmann (). ‘Pronoun deletion in Hong Kong English and colloquial Singaporean English’, in Alexander Onysko (ed), Language Contact in World Englishes. Special issue of World Englishes (): –. Schumacher, Helmut, Jacqueline Kubczak, Renate Schmidt and Vera de Ruiter (). VALBU – Valenzwörterbuch Deutscher Verben. Tübingen: Narr. Schütze, Carson T. (). The Empirical Base of Linguistics: Grammaticality Judgments and Linguistic Methodology. Chicago: University of Chicago Press. Reprinted by Language Science Press (Classics in Linguistics #), Berlin, . Schütze, Carson T. (). ‘Semantically empty lexical heads as last resorts’, in Norbert Corver and Henk van Riemsdijk (eds), Semi-lexical Categories. Berlin/New York: Mouton de Gruyter, –. Schütze, Carson T., and Jon Sprouse (). ‘Judgment data’, in Robert J. Podesva and Devyani Sharma (eds), Research Methods in Linguistics. New York: Cambridge University Press, –. Schuyler, Tamara (). Wh-movement out of the Site of VP Ellipsis. MA thesis. Santa Cruz, CA: University of California at Santa Cruz. Schwarzschild, Roger (). ‘Givenness, avoid F, and other constraints on the placement of accents.’ Natural Language Semantics : –. Scott, Michael (). WordSmith Tools version , Stroud: Lexical Analysis Software. Searle, John R. (). Speech Acts: An Essay in the Philosophy of Language. Cambridge: Cambridge University Press. Searle, John R. (). ‘A taxonomy of illocutionary acts’, in Keith Gunderson (ed), Language, Mind, and Knowledge, vol. VII of Minnesota Studies in the Philosophy of Science. Minneapolis, MN: University of Minnesota Press, –. Reprinted in Searle (: –). Searle, John R. (). ‘A classification of illocutionary acts.’ Language in Society : –. Searle, John R. (). Expression and Meaning. Cambridge: Cambridge University Press. Selkirk, Elisabeth (). ‘Some remarks on noun phrase structure’, in Peter W. Culicover, Thomas Wasow, and Adrian Akmajian () (eds), Formal Syntax. New York: Academic Press, –. Selkirk, Elisabeth (). Phonology and Syntax: The Relation Between Sound and Structure. Cambridge, MA: MIT Press. Selkirk, Elisabeth (). ‘On derived domains in sentence phonology.’ Phonology Yearbook : –.
Selkirk, Elisabeth (). ‘The prosodic structure of function words’, in James L. Morgan and Katherine Demuth (eds), Signal to Syntax: Bootstrapping from Speech to Grammar in Early Acquisition. Mahwah, NJ: Lawrence Erlbaum Associates, –. Selkirk, Elisabeth (). ‘The interaction of constraints on prosodic phrasing’, in Merle Horne (ed), Prosody: Theory and Experiment. Dordrecht: Kluwer, –. Selkirk, Elisabeth (). ‘Contrastive FOCUS vs. presentational focus: Prosodic evidence from right node raising in English’, in Bernard Bel and Isabel Marlin (eds), Speech Prosody : Proceedings of the First International Conference on Speech Prosody. Aix-en-Provence: Laboratoire Parole et Langage, –. Selkirk, Elisabeth (). ‘The syntax-phonology interface’, in John A. Goldsmith, Jason Riggle and Alan C. L. Yu (eds), The Handbook of Phonological Theory. Oxford: Wiley-Blackwell, –. Sells, Peter (). ‘Lexical-functional grammar’, in Marcel den Dikken (ed), The Cambridge Handbook of Generative Syntax. Cambridge: Cambridge University Press, –. Semino, Elena (). ‘Language, mind and autism in Mark Haddon’s The Curious Incident of the Dog in the Night-Time’, in Monika Fludernik and Daniel Jacob (eds), Linguistics and Literary Studies. Berlin/New York: Mouton de Gruyter, –. Seoane, Elena (). ‘Changing styles: On the recent evolution of scientific British and American English’, in Christiane Dalton-Puffer, Dieter Kastovsky, Nikolaus Ritt, and Herbert Schendel (eds), Syntax, Style and Grammatical Norms: English from –. Bern: Peter Lang, –. Sereno, Joan A. (). ‘Phonosyntactics’, in Leanne Hinton, Johanna Nichols, and John J. Ohala (eds), Sound Symbolism. Cambridge: Cambridge University Press, –. Sereno, Joan A., and Allard Jongman (). ‘Phonological and form class relations in the lexicon.’ Journal of Psycholinguistic Research : –. Seuren, Pieter A. M. (). Western Linguistics: An Historical Introduction. Oxford: Blackwell. Sgall, Petr, Eva Hajicová, and Jarmila Panevová (). The Meaning of the Sentence and its Semantic and Pragmatic Aspects. Dordrecht: Reidel. Shaer, Benjamin, and Werner Frey (). ‘“Integrated” and “non-integrated” left-peripheral elements in German and English.’ ZAS Papers in Linguistics (): –. Sherman, Donald (). ‘Noun-verb stress alternation: an example of the lexical diffusion of sound change in English.’ Linguistics : –. Shi, Rushen, James L. Morgan, and Paul Allopenna (). ‘Phonological and acoustic bases for earliest grammatical category assignment: A cross-linguistic perspective.’ Journal of Child Language : –. Sidnell, Jack (). ‘Turn-continuation by self and by other.’ Discourse Processes : –. Sidnell, Jack, and Tanya Stivers (eds) (). The Handbook of Conversation Analysis. Oxford: Wiley-Blackwell. Siemund, Peter (). ‘Reflexive and intensive self-forms across varieties of English.’ Zeitschrift für Anglistik und Amerikanistik (ZAA) (): –. Siemund, Peter (). Pronominal Gender in English: A Study of English Varieties from a Cross-Linguistic Perspective. London: Routledge. Siemund, Peter (). Varieties of English: A Typological Approach. Cambridge: Cambridge University Press. Siemund, Peter (). ‘Exclamative clauses in English and their relevance for theories of clause types.’ Studies in Language (): –. Siemund, Peter (). ‘The mutual relevance of typology and variation studies.’ Linguistic Typology (): –.
Siemund, Peter (). Speech Acts and Clause Types: English in a Cross-linguistic Context. Oxford: Oxford University Press. Siewierska, Anna (). Functional Grammar. London: Routledge. Siewierska, Anna, and Willem Hollmann (). ‘Ditransitive clauses in English with special reference to Lancashire dialect’, in Mike Hannay and Gerald J. Steen (eds), StructuralFunctional Studies in English Grammar: In Honour of Lachlan Mackenzie (Studies in Language Companion Series, ). Amsterdam/Philadelphia: John Benjamins, –. Silverstein, Michael (). ‘Hierarchy of features and ergativity’, in R. M. W. Dixon (ed), Grammatical Categories in Australian Languages (Linguistic Series, ). Canberra: Australian Institute of Aboriginal Studies, –. Simons, Mandy (). ‘Foundational issues in presupposition.’ Philosophy Compass (): –. Simpson, Paul (). Language, Ideology and Point of View. London: Routledge. Sims, Andrea (). Inflectional Defectiveness. Cambridge: Cambridge University Press. Sinclair, John (). ‘Collocation: A progress report’, in Ross Steele and Terry Threadgold (eds), Language Topics: Essays in Honour of Michael Halliday. Amsterdam/Philadelphia: John Benjamins, –. Sinclair, John (). Corpus, Concordance, Collocation. Oxford: Oxford University Press. Sinclair, John (). ‘The automatic analysis of corpora’, in Jan Svartvik (ed), Directions in Corpus Linguistics. Berlin/New York: Mouton de Gruyter, –. Sinclair, John (). Trust the Text: Language, Corpus and Discourse. London: Routledge. Sinclair, John, and Anna Mauranen (). Linear Unit Grammar: Integrating Speech and Writing. Amsterdam/Philadelphia: John Benjamins. Singh, Raj, Ken Wexler, Andrea Astle-Rahim, Deepthi Kamawar, and Danny Fox (). ‘Children interpret disjunction as conjunction: consequences for theories of implicature and child development.’ Natural Language Semantics (): –. Slobin, Dan. I. (). ‘Talking perfectly: Discourse origins of the present perfect’, in William Pagliuca (ed), Perspectives on Grammaticalization. Amsterdam/Philadelphia: John Benjamins, –. Smith, Andrew D. M., Graeme Trousdale, and Richard Waltereit (eds) (). New Directions in Grammaticalisation Research. Amsterdam/Philadelphia: John Benjamins. Vol. in Studies in Language Companion Series. Smith, C. Alphons (). Review of A New English Grammar, Logical and Historical. Journal of Germanic Philology : –. Smith, Carlota S. (). The Parameter of Aspect, nd edn. Dordrecht/Boston/London: Kluwer. Smith, Carlota S. (). Modes of Discourse: The Local Structure of Texts. Cambridge: Cambridge University Press. Smith, Jennifer, and Sophie Holmes-Elliott (). ‘The unstoppable glottal: Tracking rapid change in an iconic British variable.’ English Language and Linguistics –. Smith, Neil, and Nicholas Allott (). Chomsky: Ideas and Ideals, rd edn. Cambridge: Cambridge University Press. Smith, Nicholas, and Geoffrey Leech (). ‘Verb structures in twentieth-century British English’, in Bas Aarts, Joanne Close, Geoffrey Leech, and Sean Wallis (eds), The Verb Phrase in English: Investigating Recent Language Change with Corpora. Cambridge: Cambridge University Press, –. Snider, Neal, and Inbal Arnon (). ‘A unified lexicon and grammar? Compositional and non-compositional phrases in the lexicon’, in Stefan Th. Gries and Dagmar Divjak (eds), Frequency Effects in Language. Berlin/New York: Mouton de Gruyter, –.
Sobin, Nicholas (). ‘Expletive constructions are not “Lower Right Corner” movement constructions.’ Linguistic Inquiry : –. Somers, Harald L. (). Valency and Case in Computational Linguistics. Edinburgh: Edinburgh University Press. Sopher, H. (). ‘Apposition.’ English Studies : –. Sóskuthy, Márton, and Jennifer Hay (). ‘Changing word usage predicts changing word durations in New Zealand English.’ Cognition: International Journal of Cognitive Science : –. Spencer, Andrew (). ‘Bracketing paradoxes and the English lexicon.’ Language : –. Spencer, Andrew (). Morphological Theory: An Introduction to Word Structure in Generative Grammar. Oxford: Blackwell. Spencer, Andrew (). ‘Morphology – an overview of central concepts’, in Louisa Sadler and Andrew Spencer (eds), Projecting Morphology. Stanford: CSLI Publications, –. Spencer, Andrew (). ‘Identifying stems.’ Word Structure, : –. Spencer, Andrew (). Lexical Relatedness: A Paradigm-based Model. Oxford: Oxford University Press. Spencer, Andrew (). ‘Derivation’, in Peter O. Müller, Ingeborg Ohnheiser, Susan Olsen, and Franz Rainer (eds), Word-Formation: An International Handbook of the Languages of the World, vol. . Berlin/New York: Mouton de Gruyter, –. Spencer, Andrew (a). ‘How are words related?, in Daniel Siddiqi, and Heidi Harley (eds), Morphological Metatheory. Amsterdam/Philadelphia: John Benjamins, –. Spencer, Andrew (b). ‘Individuating lexemes’, in Miriam Butt and Tracy Holloway King (eds), Proceedings of LFG. Stanford: CSLI Publications, –. Spencer, Andrew (c). ‘Two morphologies or one?’, in Andrew Hippisley and Gregory Stump (eds), The Cambridge Handbook of Morphology. Cambridge: Cambridge University Press, –. Spencer, Andrew (a). ‘Morphology’, in Mark Aronoff and Janie Rees-Miller (eds), The Handbook of Linguistics, nd edn. Oxford: Wiley Blackwell, –. Spencer, Andrew (b). ‘Split-morphology and lexicalist morphosyntax: The case of transpositions’, in Claire Bowern, Laurence Horn, and Raffaella Zanuttini (eds), On Looking Into Words (And Beyond). Berlin: Language Science Press, –. Spencer, Andrew, and Ana R. Luís (). Clitics: An Introduction. Cambridge: Cambridge University Press. Spencer, Andrew, and Gergana Popova (). ‘Inflection and periphrasis’, in Matthew Baerman (ed), The Oxford Handbook of Inflection. Oxford: Oxford University Press, –. Spencer, Nancy J. (). ‘Differences between linguists and nonlinguists in intuitions of grammaticality-acceptability.’ Journal of Psycholinguistic Research : –. Spender, Dale (). Man Made Language. London: Routledge and Kegan Paul. Sperber, Dan, and Deirdre Wilson (). Relevance: Communication and Cognition, st edn. Oxford: Blackwell. Second edition published in . Sportiche, Dominique (). ‘A theory of floating quantifiers and its corollaries for constituent structure.’ Linguistic Inquiry : –. Sprouse, Jon, and Diogo Almeida (). ‘Assessing the reliability of textbook data in syntax: Adger’s Core Syntax.’ Journal of Linguistics : –.
Sprouse, Jon, Carson T. Schütze, and Diogo Almeida (). ‘A comparison of informal and formal acceptability judgments using a random sample from Linguistic Inquiry –.’ Lingua : –. Sprouse, Jon, Matt Wagers, and Colin Phillips (). ‘A test of the relation between working memory and syntactic island effects.’ Language : –. Sprouse, Jon, Ivano Caponigro, Ciro Greco, and Carlo Cecchetto (). ‘Experimental syntax and the variation of island effects in English and Italian.’ Natural Language and Linguistic Theory : –. Stalnaker, Robert C. (). ‘Pragmatics.’ Synthese : –. Stalnaker, Robert C. (). ‘Pragmatic presuppositions’, in Milton K. Munitz and Peter Unger (eds), Semantics and Philosophy. New York: Oxford University Press, –. Stalnaker, Robert C. (). ‘Assertion’, in Peter Cole (ed), Pragmatics, Syntax and Semantics, vol. ix. New York: Academic Press, –. Stanley, Jason (). ‘Context and logical form.’ Linguistics and Philosophy (): –. Stanley, Jason (). ‘Making it articulated.’ Mind and Language (–): –. Stanley, Jason, and Zoltán Gendler Szabó (). ‘On quantifier domain restriction.’ Mind and Language (–): –. Starke, Michal (). Move Reduces to Merge: A Theory of Locality. Ph.D. thesis. Geneva: University of Geneva. Stassen, Leon (). Comparison and Universal Grammar. Oxford. Blackwell. Steedman, Mark (). ‘Structure and intonation.’ Language : –. Steedman, Mark (). Surface Structure and Interpretation. Cambridge, MA: MIT Press. Steedman, Mark (). The Syntactic Process. Cambridge, MA: MIT Press. Steedman, Mark (). Taking Scope: The Natural Semantics of Quantifiers. Cambridge, MA: MIT Press. Stefanowitsch, Anatol, and Stefan Th. Gries (). ‘Collostructions: Investigating the interaction of words and constructions.’ International Journal of Corpus Linguistics (): –. Stefanowitsch, Anatol, and Stefan Th. Gries (). ‘Covarying collexemes.’ Corpus Linguistics and Linguistic Theory (): –. Stefanowitsch, Anatol, and Stefan Th. Gries (). ‘Corpora and grammar’, in Anke Lüdeling and Merja Kytö (eds), Corpus Linguistics: An International Handbook. Berlin/New York: Mouton de Gruyter, –. Steinhauer, Karsten, and John Drury (). ‘On the early left-anterior negativity (ELAN) in syntax studies.’ Brain and Language : –. Štekauer, Pavol (). An Onomasiological Theory of English Word-formation. Amsterdam/ Philadelphia: John Benjamins. Štekauer, Pavol (). ‘Derivational paradigms’, in Rochelle Lieber and Pavol Štekauer (eds), The Oxford Handbook of Derivational Morphology. Oxford: Oxford University Press, –. Štekauer, Pavol (). ‘The delimitation of derivation and inflection’, in Peter O. Müller, Ingeborg Ohnheiser, Susan Olsen, and Franz Rainer (eds), Word-Formation: An International Handbook of the Languages of the World, vol. . Berlin/New York: Mouton de Gruyter, –. Štekauer, Pavol, Salvador Valera, and Lívia Körtvélyessy (). Word-formation in the World’s Languages. Cambridge: Cambridge University Press. Stenius, Erik (). ‘Mood and language game.’ Synthese : .
Stewart, Thomas W. (). Contemporary Morphological Theories: A User’s Guide. Edinburgh: Edinburgh University Press. Stirling, Lesley, and Rodney Huddleston (). ‘Deixis and anaphora’, in Rodney Huddleston and Geoffrey K. Pullum (eds), The Cambridge Grammar of the English Language. Cambridge: Cambridge University Press, –. Stivers, Tanya (). ‘An overview of the question–response system in American English conversation.’ Journal of Pragmatics : –. Stivers, Tanya, N. J. Enfield, and Stephen C. Levinson (eds) (). Question–Response Sequences in Conversation Across Ten Languages. Special issue of Journal of Pragmatics, (). Stockwell, Peter (). Cognitive Poetics: An Introduction. London: Routledge. Stockwell, Robert, Paul Schachter, and Barbara Hall Partee (). The Major Syntactic Structures of English. New York: Holt, Rinehart and Winston. Storrer, Angelika (). ‘Ergänzungen und Angaben’, in Vilmos Ágel, Ludwig M. Eichinger, Hans-Werner Eroms, Peter Hellwig, Hans Jürgen Heringer, and Henning Lobin (eds), Dependenz und Valenz: Ein internationales Handbuch der zeitgenössischen Forschung. Dependency and Valency: An International Handbook of Contemporary Research. . Halbband/Volume . Berlin/New York: Mouton de Gruyter, –. Stowe, Laurie A. (). ‘Parsing WH-constructions: Evidence for on-line gap location.’ Language and Cognitive Processes : –. Stowe, Laurie A., Cees A. J. Broere, Anne M. J. Paans, Albertus A. Wijers, Gijsbertus Mulder, Wim Vaalburg, and Frans Zwarts (). ‘Localizing components of a complex task: Sentence processing and working memory.’ NeuroReport : –. Stowell, Timothy (). Origins of Phrase Structure. Ph.D. thesis. Cambridge, MA: Massachusetts Institute of Technology. Stowell, Tim (). ‘The tense of infinitives.’ Linguistic Inquiry : –. Stowell, Tim, and Eric Wehrli (eds) (). Syntax and Semantics . Syntax and the Lexicon. San Diego, CA: Academic Press. Strawson, Peter F. (). ‘On Referring.’ Mind (): –. Strawson, Peter F. (). ‘Identifying reference and truth-values’, in Danny D. Steinberg and Leon A. Jakobovits (eds), Semantics: An Interdisciplinary Reader in Philosophy, Linguistics and Psychology. Cambridge: Cambridge University Press, –. Stroud, Clare, and Colin Phillips (). ‘Examining the evidence for an independent semantic analyzer: An ERP study in Spanish.’ Brain and Language : –. Stubbs, Michael () ‘Conrad in the computer: Examples of quantitative stylistic methods.’ Language and Literature (): –. Stubbs, Michael, and Isabel Barth (). ‘Using recurrent phrases as text-type discriminators: a quantitative method and some findings.’ Functions of Language (): –. Stump, Gregory T. (). Inflectional Morphology: A Theory of Paradigm Structure. Cambridge: Cambridge University Press. Stump, Gregory T. (). ‘The derivation of compound ordinal numerals: Implications for morphological theory.’ Word Structure : –. Stump, Gregory T. (). Inflectional Paradigms: Content and Form at the Syntax-Morphology Interface. Cambridge: Cambridge University Press. Sturt, Patrick (). ‘The time-course of the application of binding constraints in reference resolution.’ Journal of Memory and Language : –. Sturtevant, Edgar H. (). An Introduction to Linguistic Science. New Haven: Yale University Press.
Swales, John (). Genre Analysis: English in Academic and Research Settings. Cambridge: Cambridge University Press. Sweet, Henry (). ‘The practical study of language.’ Transactions of the Philological Society : –. Sweet, Henry (). A History of English Sounds. Oxford: Clarendon Press. Sweet, Henry (). A New English Grammar, Logical and Historical. Part I: Introduction, Phonology, and Accidence. Oxford: Clarendon Press. Sweet, Henry (). A New English Grammar, Logical and Historical. Part II: Syntax. Oxford: Clarendon Press. Sweetser, Eve E. (). From Etymology to Pragmatics. Cambridge: Cambridge University Press. Swerts, Marc, and Sabine Zerbian (). ‘Prosodic transfer in Black South African English.’ Proceedings of Speech Prosody , : –. downloaded from http://speechpro sody.illinois.edu/papers/.pdf. Szabó, Zoltán Gendler (ed) (). Semantics vs. Pragmatics. Oxford: Oxford University Press. Szabolcsi, Anna, and Terje Lohndal (). ‘Strong vs. weak islands’, in Martin Everaert and Henk van Riemsdijk (eds), The Wiley Blackwell Companion to Syntax, nd edn. Malden, MA: Wiley-Blackwell, –. Szczepek, Beatrice (a). ‘Formal aspects of collaborative productions in English conversation.’ InLiSt – Interaction and Linguistic Structures, . http://www.inlist.uni-bayreuth.de/ issues//index.htm. Szczepek, Beatrice (b). ‘Functional aspects of collaborative productions in English conversation.’ InLiSt – Interaction and Linguistic Structures, . http://www.inlist.unibayreuth.de/issues//index.htm. Szmrecsanyi, Benedikt (). ‘Typological profile: L varieties’, in Bernd Kortmann and Kerstin Lunkenheimer (eds), The Mouton World Atlas of Variation in English. Berlin/ Boston: Mouton de Gruyter, –. Szmrecsanyi, Benedikt, and Bernd Kortmann (a). ‘The morphosyntax of varieties of English worldwide: A quantitative perspective.’ Special Issue The Forests Behind the Trees. Lingua : –. Szmrecsanyi, Benedikt, and Bernd Kortmann (b). ‘Vernacular universals and Angloversals in a typological perspective’, in Markku Filppula, Juhani Klemola, and Heli Paulasto (eds), Vernacular Universals and Language Contacts: Evidence from Varieties of English and Beyond. London/New York: Routledge, –. Szmrecsanyi, Benedikt, and Bernd Kortmann (). ‘Introduction’, in Bernd Kortmann and Benedikt Szmrecsanyi (eds), Linguistic Complexity: Second Language Acquisition, Indigenization, Contact. Berlin/New York: Mouton de Gruyter, –. Szymanek, Bogdan (). Introduction to Morphological Analysis. Warsaw: Państwowe Wydawnictwo Naukowe. Taavitsainen, Irma (). ‘Subjectivity as a text-type marker in historical stylistics.’ Language and Literature : –. Tagliamonte, Sali (). ‘Was/were variation across the generations: View from the city of York.’ Language Variation and Change (): –. Tagliamonte, Sali (). ‘So who? Like how? Just what? Discourse markers in the conversation of young Canadians.’ Journal of Pragmatics : –.
Tagliamonte, Sali, Jennifer Smith, and Helen Lawrence (). ‘No taming the vernacular! Insights from the relatives in northern Britain.’ Language Variation and Change (): –. Talmy, Leonard (). Toward a Cognitive Semantics. Cambridge, MA: MIT Press. Tannen, Deborah (). ‘Spoken and written language: Exploring orality and literacy.’ Language : –. Tanner, Darren (). ‘On the left anterior negativity (LAN) in electrophysiological studies of morphosyntactic agreement: A commentary on “Grammatical agreement processing in reading: ERP findings and future directions” by Molinaro et al., .’ Cortex : –. Tanner, Darren, and Janet G. Van Hell (). ‘ERPs reveal individual differences in morphosyntactic processing.’ Neuropsychologia : –. Tarasova, Elizaveta (). Some New Insights into the Semantics of English NþN Compounds. Ph.D. thesis. Wellington: Victoria University of Wellington. Taylor, John R. (). Possessives in English. Oxford: Oxford University Press Taylor, John R. (). Cognitive Grammar. Oxford: Oxford University Press. Taylor, John R. (). ‘The ecology of constructions’, in Günter Radden and Klaus-Uwe Panther (eds), Studies in Linguistic Motivation. Berlin/New York: Mouton de Gruyter, –. Taylor, John R. (). The Mental Corpus: How Language is Represented in the Mind. Oxford: Oxford University Press. Taylor, John R. (a). ‘Prototype effects in grammar’, in Ewa Dąbrowska and Dagmar Divjak (eds), Handbook of Cognitive Linguistics. Berlin/New York: Mouton de Gruyter, –. Taylor, John R. (b). ‘Word-formation in cognitive grammar’, in Peter O. Müller, Ingeborg Ohnheiser, Susan Olsen, and Franz Rainer (eds), Word-Formation: An International Handbook of the Languages of Europe. Berlin/New York: Mouton de Gruyter, –. Taylor, John R. (). ‘Lexical semantics’, in Barbara Dancygier (ed), The Cambridge Handbook of Cognitive Linguistics. Cambridge: Cambridge University Press, –. Taylor, John R., and Kam-Yiu Pang (). ‘Seeing as though.’ English Language and Linguistics : –. ten Hacken, Pius (). ‘Delineating derivation and inflection’, in Rochelle Lieber and Pavol Štekauer (eds), The Oxford Handbook of Derivational Morphology. Oxford; Oxford University Press, –. Ten Wolde, Elnora, and Evelien Keizer (). ‘Structure and substance in functional discourse grammar: the case of the binominal noun phrase.’ Acta Linguistica Hafniensia (): –. Tesnière, Lucien (). Éléments de syntaxe structurale. Paris: Klincksieck. Tesnière, Lucien (). Elements of Structural Syntax. Translated by Timothy Osborne and Sylvain Kahane. Amsterdam/Philadelphia: John Benjamins [Éléments de syntaxe structurale , Paris: Klincksieck]. Theses (). Theses presented to the First Congress of Slavists held in Prague in . In Josef Vachek and Libuše Dušková (eds), Praguiana: Some Basic and Less Known Aspects of the Prague Linguistic School. Amsterdam/Philadelphia: John Benjamins, –. Thieroff, Rolf (). ‘Moods, moods, moods’, in Björn Rothstein and Rolf Thieroff (eds), Mood in the Languages of Europe. Amsterdam/Philadelphia: John Benjamins, –. Thompson, Sandra A. (). ‘Grammar and written discourse: Initial vs. final purpose clauses in English.’ Text : –. Thompson, Sandra A. (). ‘A discourse approach to the category “adjective”’, in John A. Hawkins (ed), Explaining Language Universals. Oxford: Blackwell, –.
Thompson, Sandra A. (). ‘Information flow and dative shift in English discourse’, in Jerold A. Edmondson, Feagin Crawford, and Peter Mühlhäusler (eds), Development and Diversity: Language Variation across Space and Time. Dallas, Texas: Summer Institute of Linguistics, –. Thompson, Sandra A. (). ‘“Object complements” and conversation: Towards a realistic account.’ Studies in Language (): –. Thompson, Sandra A., and Yuka Koide (). ‘Iconicity and “indirect objects” in English.’ Journal of Pragmatics (): –. Tieken-Boon van Ostade, Ingrid. (). ‘Double negation and eighteenth-century English grammars.’ Neophilologus : –. Tieken-Boon van Ostade, Ingrid (). ‘Exemplification in eighteenth-century English grammars.’ Papers from the Fifth International Conference on English Historical Linguistics, Cambridge, – April . Amsterdam/Philadelphia: John Benjamins, –. Tieken-Boon van Ostade, Ingrid (ed) (). Two Hundred Years of Lindley Murray. Münster: Nodus. Tieken-Boon van Ostade, Ingrid (). The Bishop’s Grammar: Robert Lowth and the Rise of Prescriptivism. Oxford: Oxford University Press [print edition]. Tieken-Boon van Ostade, Ingrid (). The Bishop’s Grammar: Robert Lowth and the Rise of Prescriptivism in English. Oxford: Oxford University Press [online edition]. Tieken-Boon van Ostade, Ingrid and Wim van der Wurff (eds) () Current Issues in Late Modern English. Berlin and New York: Peter Lang. Tognini-Bonelli, Elena (). Corpus Linguistics at Work. Amsterdam/Philadelphia: John Benjamins. Tomasello, Michael (). Constructing a Language: A Usage-based Theory of Language Acquisition, Cambridge, MA: Harvard University Press. Tomlin, Russell S. (). Basic Word Order: Functional Principles. London: Croom Helm. Tonhauser, Judith, David I. Beaver, and Judith Degen (). ‘How projective is projective content? Gradience in projectivity and at-issueness.’ Journal of Semantics (): –. Tonhauser, Judith, David I. Beaver, Craige Roberts, and Mandy Simons (). ‘Towards a taxonomy of projective content.’ Language (): –. Toolan, M. (). ‘Peter Black, Christopher Stevens, class and inequality in the Daily Mail.’ Discourse and Society (): –. Traugott, Elizabeth Closs (). ‘On the rise of epistemic meanings in English: An example of subjectification in semantic change.’ Language (): –. Traugott, Elizabeth Closs (). ‘Syntax’, in Richard M. Hogg (ed), The Cambridge History of the English Language. Vol. I, From the Beginnings to . Cambridge: University Press, – . Traugott, Elizabeth Closs (). ‘Historical aspects of modality’, in Frawley (ed) (), –. Traugott, Elizabeth Closs (). ‘The grammaticalization of NP of NP patterns’, in Alexander Bergs and Gabriele Diewald (eds), Constructions and Language Change. Berlin/New York: Mouton de Gruyter, –. Traugott, Elizabeth Closs (). ‘(Inter)subjectivity and (inter)subjectification,’ in Kristin Davidse, Lieven Vandelanotte, and Hubert Cuyckens (eds), Subjectification, Intersubjectification and Grammaticalization. Berlin/New York: Mouton de Gruyter, –. Traugott, Elizabeth Closs, and Richard Dasher (). Regularity in Semantic Change. Cambridge: Cambridge University Press. Traugott, Elizabeth Closs, and Bernd Heine (eds) (). Approaches to Grammaticalization. Two volumes. Amsterdam/Philadelphia: John Benjamins.
Traugott, Elizabeth Closs, and Graeme Trousdale (). ‘Gradience, gradualness and grammaticalization: How do they intersect?’, in Elizabeth Closs Traugott and Graeme Trousdale (eds), Gradience, Gradualness and Grammaticalization. Amsterdam/Philadelphia: John Benjamins, –. Traugott, Elizabeth Closs, and Graeme Trousdale (). Constructionalization and Constructional Changes. Oxford: Oxford University Press. Traugott, Elizabeth Closs, and Graeme Trousdale (). ‘Contentful constructionalization.’ Journal of Historical Linguistics (): –. Truckenbrodt, Hubert (). ‘Semantics of intonation.’, in Claudia Maienborn, Klaus von Heusinger and Paul Portner (eds), Semantics: An International Handbook of Natural Language Meaning : –. Trudgill, Peter (a). ‘Standard English: What it isn’t’, in Tony Bex and Richard Watts (eds), Standard English: The Widening Debate. London: Routledge, –. Trudgill, Peter (b). The Dialects of England. Oxford: Blackwell. Trudgill, Peter (). ‘The dialect of East Anglia: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Trudgill, Peter (a). ‘Vernacular universals and the sociolinguistic typology of English dialects’, in Markku Filppula, Juhani Klemola, and Heli Paulasto (eds), Vernacular Universals and Language Contacts: Evidence from Varieties of English and Beyond. London/ New York: Routledge, –. Trudgill, Peter (b). ‘Sociolinguistic typology and complexification’, in Geoffrey Sampson, David Gil, and Peter Trudgill (eds), Language Complexity as a Variable Concept. Oxford: Oxford University Press, –. Trudgill, Peter, and Jack Chambers (). Dialects of English: Studies in Grammatical Variation. London/New York: Longman. Trudgill, Peter, and Jean Hannah (). International English: A Guide to Varieties of Standard English. London: Arnold. Tubau Muntañá, Susagna (). Negative Concord in English and Romance: Syntax– Morphology Interface Conditions on the Expression of Negation. Utrecht: LOT. Upton, Clive, and J. D. A. Widdowson (). An Atlas of English Dialects. Oxford: Oxford University Press. Välimaa-Blum, Rütta (). Cognitive Phonology in Construction Grammar: Analytic Tools for Students of English. Berlin/New York: Mouton de Gruyter. Vallduví, Enric (). The Informational Component. New York: Garland. Vallduví, Enric (). ‘Information structure’, in Maria Aloni and Paul Dekker (eds), The Cambridge Handbook of Formal Semantics. Cambridge: Cambridge University Press, –. Vallduví, Enric, and Elisabet Engdahl (). ‘The linguistic realization of information packaging.’ Linguistics : –. van Benthem, Johan, and Alice ter Meulen (eds) (). Handbook of Logic and Language, st edn. Amsterdam: North-Holland. van Benthem, Johan, and Alice ter Meulen (eds) (). Handbook of Logic and Language, nd edn. London: Elsevier. van Bergen, Linda (). ‘Early progressive passives.’ Folia Linguistica (): –. Vandenberghe, Rik, Anna Christina Nobre, and Cathy J. Price (). ‘The response of left temporal cortex to sentences.’ Journal of Cognitive Neuroscience : –.
van der Auwera, Johan (). ‘Conditionals and speech acts’, in Elizabeth Traugott, Alice ter Meulen, and Judy Reilly (eds), On Conditionals. Cambridge: Cambridge University Press, –. van der Auwera, Johan (). ‘On the semantic and pragmatic polyfunctionality of modal verbs’, in K. Turner (ed), The Semantics-Pragmatics Interface from Different Points of View. Oxford: Elsevier, –. van der Auwera, Johan (). ‘Standard Average European’, in Bernd Kortmann and Johan van der Auwera (eds), The Languages and Linguistics of Europe: A Comprehensive Guide. Berlin/Boston: Mouton de Gruyter, –. van der Auwera, Johan, and Andrej Malchukov (). ‘A semantic map for depictive adjectivals’, in Nikolaus Himmelmann and Eva F. Schultze-Berndt (eds), Secondary Predication and Adverbial Modification. Oxford: Oxford University Press, –. van der Auwera, Johan, and Vladimir Plungian (). ‘Modality’s semantic map.’ Linguistic Typology : –. van der Auwera, Johan, Dirk Noël, and An Van linden (). ‘Had better, ’d better and better: Diachronic and transatlantic variation’, in Juana I. Marín-Arrese, Marta Carratero, Jorge Arús Hita, and Johan van der Auwera (eds), English Modality: Core, Periphery and Evidentiality. Berlin/New York: Mouton de Gruyter, –. van der Gaaf, Willem (). ‘Beon and habban connected with an inflected infinitive.’ English Studies (–): –. van der Hulst, Harry (). ‘On the parallel organization of linguistic components.’ Lingua : –. van der Sandt, Rob (). ‘Presupposition projection as anaphora resolution.’ Journal of Semantics (): –. van de Velde, Freek (). ‘A structural-functional account of NP-internal mood.’ Lingua : –. van Dijk, Teun A. (). ‘Critical discourse analysis and nominalization: Problem or pseudoproblem?’ Discourse and Society (): –. van Dijk, Teun A. (). Discourse and Knowledge: A Sociocognitive Approach. Cambridge: Cambridge University Press. Van Eynde, Frank (). ‘The big mess construction’, in Stefan Müller (ed), Proceedings of the HPSG- Conference, –. van Gompel, Roger P. G., and Simon P. Liversedge (). ‘The influence of morphological information on cataphoric pronoun assignment.’ Journal of Experimental Psychology: Learning, Memory, and Cognition : –. van Herten, Marieke, Herman H. J. Kolk, and Dorothee J. Chwilla (). ‘An ERP study of P effects elicited by semantic anomalies.’ Cognitive Brain Research : –. van Kemenade, Ans (). ‘Functional categories, morphosyntactic change, grammaticalization.’ Linguistics : –. Van Langendonck, Willy (). ‘Determiners as heads?’ Cognitive Linguistics : –. Van linden, An, Jean-Christophe Verstraete, and Hubert Cuyckens (). ‘The semantic development of essential and crucial: Paths to deontic meaning.’ English Studies (): –. van Marle, Jaap (). On the Paradigmatic Dimension of Morphological Creativity. Dordrecht: Foris Publications. Van Peer, Willie (). Stylistics and Psychology: Investigations of Foregrounding. London: Croom Helm. van Tiel, Bob, Emiel Van Miltenburg, Natalia Zevakhina, and Bart Geurts (). ‘Scalar diversity.’ Journal of Semantics (): –.
Van Trijp, Remi (). ‘A comparison between Fluid Construction Grammar and Sign-Based Construction Grammar.’ Constructions and Frames (): –. Van Valin, Robert D. Jr. (ed) (). Advances in Role and Reference Grammar. Amsterdam/ Philadelphia: John Benjamins. Van Valin, Robert D. Jr. (). Exploring the Syntax-Semantics Interface. Cambridge: Cambridge University Press. Van Valin, Robert D. Jr., and Randy J. LaPolla (). Syntax: Structure, Meaning and Function. Cambridge: Cambridge University Press. Vater, Heinz (). ‘On the possibility of distinguishing between complements and adjuncts’, in Werner Abraham (ed), Valence, Semantic Case, and Grammatical Relations. Amsterdam/Philadelphia: John Benjamins, –. Velupillai, Viveka (). An Introduction to Linguistic Typology. Amsterdam/Philadelphia: John Benjamins. Venditti, Jennifer J., Sun-Ah Jun, and Mary E. Beckman (). ‘Prosodic cues to syntactic and other linguistic structures in Japanese, Korean and English’, in James Morgan, and Katherine Demuth (eds), Signal to Syntax: Bootstrapping from Speech to Grammar in Early Acquisition. Mahwah, NJ: Lawrence Erlbaum, –. Vendler, Zeno (). Linguistics in Philosophy. Ithaca: Cornell University Press. Verstraete, Jean-Christophe (). ‘Subjective and objective modality: Interpersonal and ideational functions in the English modal auxiliary system.’ Journal of Pragmatics : –. Villata, Sandra, Luigi Rizzi, and Julie Franck (). ‘Intervention effects and Relativized Minimality: New experimental evidence from graded judgments.’ Lingua : –. Virtanen, Tuija (). ‘Point of departure: Cognitive aspects of sentence-initial adverbials’, in Tuija Virtanen (ed), Approaches to Cognition Through Text and Discourse. Berlin/New York: Mouton de Gruyter, –. Visser, Fredericus Th. (–). An Historical Syntax of the English Language, Vol. III(). Leiden: E.J. Brill. von Fintel, Kai (). Restrictions on Quantifier Domains. Ph.D. thesis. Amherst, MA: University of Massachusetts at Amherst. Vorlat, Emma (). ‘The sources of Lindley Murray’s The English Grammar.’ Leuvensche Bijdragen : –. Vorlat, Emma (). The Development of English Grammatical Theory –. Leuven: Leuven University Press. Vos, Riet (). A Grammar of Partitive Constructions. Ph.D. thesis. Tilburg: Tilburg University. Wade, Katie (). Personal Pronouns in Present Day English. Cambridge: Cambridge University Press. Wade, Terence (). A Comprehensive Russian Grammar. Oxford: Blackwell. Wagers, Matthew, and Colin Phillips (). ‘Multiple dependencies and the role of the grammar in real-time comprehension.’ Journal of Linguistics : –. Wagner, Michael (). ‘Prosody and recursion in coordinate structures and beyond.’ Natural Language and Linguistic Theory : –. Wagner, Susanne (). Gender in English Pronouns: Myth and Reality. Ph.D. thesis. Freiburg: University of Freiburg.
Wagner, Susanne (). ‘English dialects in the Southwest: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Wagner, Susanne (). ‘Pronominal systems’, in Raymond Hickey (ed), Areal Features of the Anglophone World. Berlin/New York: Mouton de Gruyter, –. Wald, Benji, and Lawrence Besserman (). ‘The emergence of the verb-verb compound in twentieth century English and twentieth century linguistics’, in Donka Minkova and Robert Stockwell (eds), Studies in the History of the English Language. Berlin/New York: Mouton de Gruyter, –. Walker, Traci (). ‘Form (does not equal) function: The independence of prosody and action.’ Research on Language and Social Interaction : –. Waller, Tim (). The Subjunctive in Present-Day British English: A Survey, with Particular Reference to the Mandative Subjunctive. M.A. thesis. London: University College London. Wallis, John (). Grammatica Linguae Anglicanae. Hamburg: Gotfried Schultzen. Wallis, Sean (). ‘Annotation, retrieval and experimentation’, in Anneli Meurman-Solin and Arja Nurmi (eds), Annotating Variation and Change. Helsinki: Varieng, University of Helsinki. Available at: www.helsinki.fi/varieng/series/volumes//wallis Wallis, Sean (). ‘Searching treebanks and other structured corpora’, in Anke Lüdeling and Merja Kytö (eds), Corpus Linguistics: An International Handbook. Berlin/New York: Mouton de Gruyter, –. Wallis, Sean (). ‘Binomial confidence intervals and contingency tests: Mathematical fundamentals and the evaluation of alternative methods.’ Journal of Quantitative Linguistics (): –. Wallis, Sean (). ‘What might a corpus of parsed spoken data tell us about language?’, in Ludmila Veselovská and Markéta Janebová (eds), Complex Visibles Out There. (Proceedings of the Olomouc Linguistics Colloquium : Language Use and Linguistic Structure.) Olomouc: Palacký University, –. Wallis, Sean (). Adapting Random-Instance Sampling Variance Estimates and Binomial Models for Random-Text Sampling. London: Survey of English Usage. Available at http:// corplingstats.wordpress.com////adapting-variance. Wallis, Sean (). ‘Investigating the additive probability of repeated language production decisions.’ International Journal of Corpus Linguistics (): –. Wallis, Sean (forthcoming). Statistics in Corpus Linguistics: A New Approach. New York: Routledge. Wallis, Sean, and Gerald Nelson (). ‘Syntactic parsing as a knowledge acquisition problem.’ Proceedings of th European Knowledge Acquisition Workshop, Catalonia, Spain: Springer, –. Wallis, Sean, and Gerald Nelson (). ‘Knowledge discovery in grammatically analysed corpora.’ Data Mining and Knowledge Discovery : –. Waltereit, Richard (). ‘Grammaticalization and discourse’, in Narrog and Heine (eds) (), –. Wanner, Anja (). Deconstructing the English Passive. (Topics in English Linguistics, .) Berlin/New York: Mouton de Gruyter. Ward, Gregory L. (). The Semantics and Pragmatics of Preposing. New York/London: Garland.
Ward, Gregory L., and Betty J. Birner (). ‘Discourse and information structure’, in Deborah Schiffrin, Deborah Tannen, and Heidi E. Hamilton (eds), The Handbook of Discourse Analysis. Malden, MA: Blackwell, –. Ward, Gregory, and Ellen F. Prince (). ‘On the topicalization of indefinite NPs.’ Journal of Pragmatics : –. Ward, Gregory, Richard Sproat, and Gail McKoon (). ‘A pragmatic analysis of so-called anaphoric islands.’ Language : –. Warner, Anthony R. (). ‘Review article: D.W. Lightfoot, Principles of Diachronic Syntax.’ Journal of Linguistics (): –. Warner, Anthony R. (). English Auxiliaries: Structure and History. Cambridge: Cambridge University Press. Warner, Anthony R. (). ‘Predicting the progressive passive: Parametric change within a lexicalist framework.’ Language (): –. Warner, Anthony R. (). ‘English auxiliaries without lexical rules’, in Robert D. Borsley (ed), The Nature and Function of Syntactic Categories. New York: Academic Press, –. Wasow, Thomas (). ‘End-weight from the speaker’s perspective.’ Journal of Psycholinguistic Research (): –. Wasow, Thomas (). Postverbal Behavior, Stanford: CSLI Publications. Wattam, Stephen (). Technological Advances in Corpus Sampling Methodology. Ph.D. thesis. Lancaster: Lancaster University. Available at: www.extremetomato.com/cv/papers/ thesis.pdf. Webelhuth, Gert (). ‘X-bar theory and case theory’, in Gert Webelhuth (ed), Government and Binding Theory and the Minimalist Program. Oxford: Blackwell, –. Webster’s Third New International Dictionary (). Springfield, MA: G. and C. Merriam Co. Wechsler, Stephen (). Word Meaning and Syntax: Approaches to the Interface. Oxford: Oxford University Press. Weir, Andrew (). ‘Left-edge deletion in English and subject omission in diaries.’ English Language and Linguistics : –. Welke, Klaus (). Valenzgrammatik des Deutschen. Berlin/New York: Mouton de Gruyter. Wells, J. C. (). English Intonation: An Introduction. Cambridge: Cambridge University Press. Wells, J. C. (). Longman Pronunciation Dictionary, rd edn. Harlow: Pearson Education. Wellwood, Alexis, Roumyana Pancheva, Valentine Hacquard, and Colin Phillips (). ‘The anatomy of a comparative illusion.’ Journal of Semantics : –. Werth, Paul (). Text Worlds: Representing Conceptual Space in Discourse. London: Longman. Westerhoff, Jan (). Ontological Categories: Their Nature and Significance. Oxford: Oxford University Press. Whorf, Benjamin Lee (). Language, Thought and Reality: Selected Writings of Benjamin Lee Whorf. Cambridge, MA: MIT Press. Wichmann, Anne (). Beginnings, Middles and Ends: Intonation in Text and Discourse. London: Routledge. Wiechmann, Daniel, and Elma Kerz. (). ‘The positioning of concessive adverbial clauses in English: Assessing the importance of discourse-pragmatic and processing-based constraints.’ English Language and Linguistics : –. Wierzbicka, Anna (). The Semantics of Grammar. Amsterdam/Philadelphia: John Benjamins.
Williams, Edwin (). ‘Discourse and logical form.’ Linguistic Inquiry : –. Williamson, Timothy (). Vagueness. New York: Routledge. Wilson, Deirdre, and Dan Sperber (). ‘Mood and the analysis of non-declarative sentences’, in Jonathan Dancy, J. M. E. Moravcsik, and C. C. W. Taylor (eds), Human Agency: Language, Duty and Value. Stanford: Stanford University Press, –. Wilson, Deirdre, and Dan Sperber (). Meaning and Relevance. Cambridge: Cambridge University Press. Wilson, George (). The Youth’s Pocket Companion, nd edn. London: J. Cooke. Wiltshire, Caroline, and James Harnsberger (). ‘The influence of Gujarati and Tamil Ls on Indian English: A preliminary study.’ World Englishes : –. Winkler, Susanne (). ‘The information structure of English’, in Manfred Krifka and Renate Musan (eds), The Expression of Information Structure. Berlin/New York: Mouton de Gruyter. Wolff, Susann, Matthias Schlesewsky, Masako Hirotani, and Ina Bornkessel-Schlesewsky (). ‘The neural mechanisms of word order processing revisited: Electrophysiological evidence from Japanese.’ Brain and Language : –. Wolfram, Walt (a). ‘Rural and ethnic varieties in the Southeast: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Wolfram, Walt (b). ‘Urban African American vernacular English: Morphology and syntax’, in Bernd Kortmann, Kate Burridge, Rajend Mesthrie, Edgar Schneider, and Clive Upton (eds) (), –. Wood, Mary McGee (). Categorial Grammars. London: Routledge. Wright, Joseph (–). The English Dialect Dictionary, volumes. Oxford: Henry Frowde. Wurmbrand, Susi (). ‘Tense and aspect in English infinitives.’ Linguistic Inquiry (): –. Xu, Yi, and Ching X. Xu (). ‘Phonetic realization of focus in English declarative intonation.’ Journal of Phonetics : –. Yoshida, Masaya, Michael Walsh Dickey, and Patrick Sturt (). ‘Predictive processing of syntactic structure: Sluicing and ellipsis in real-time sentence processing.’ Language and Cognitive Processes : –. Zanuttini, Raffaela, and Paul Portner (). ‘Exclamative clauses: At the syntax-semantics interface.’ Language : –. Zappavigna, Michele (). ‘Searchable talk: The linguistic functions of hashtags.’ Social Semiotics : –. Zec, Draga, and Sharon Inkelas (). ‘Prosodically constrained syntax’, in Sharon Inkelas and Draga Zec (eds), The Syntax-Phonology Connection. Chicago: University of Chicago Press, –. Zeijlstra, Hedde (). Sentential Negation and Negative Concord. Utrecht: LOT. Ziegeler, Debra P. (). Hypothetical Modality: Grammaticalisation in an L Dialect. Amsterdam/Philadelphia: John Benjamins. Ziegeler, Debra P. (). ‘Past ability modality and the derivation of complementary inferences.’ Journal of Historical Pragmatics : –. Ziegeler, Debra P. (). ‘On the generic origins of modality in English’, in David Hart (ed), English Modality in Context: Diachronic Perspectives. Bern: Peter Lang, –.
Ziegeler, Debra P. (). Interfaces with English Aspect: Diachronic and Empirical Studies. Amsterdam: John Benjamins. Ziegeler, Debra P. (a). ‘Propositional aspect and the development of modal inferences in English’, in W. Abraham and E. Leiss (eds) (), –. Ziegeler, Debra P. (b). ‘Grammaticalisation under control: Towards a functional analysis of same-subject identity-sharing.’ Folia Linguistica : –. Ziegeler, Debra P. (). ‘Semantic determinism and the grammaticalisation of have to in English: A reassessment.’ Journal of Historical Pragmatics : –. Ziegeler, Debra P. (). ‘Towards a composite definition of nominal modality’, in Werner Abraham and Elisabeth Leiss (eds), Covert Patterns of Modality. Cambridge: Cambridge Scholars Series, –. Ziegeler, Debra P. (). ‘On the generic argument for the modality of will’, in Juana I. MarínArrese, Marta Carretero, Jorge Arús Hita, and Johan van der Auwera (eds), English Modality: Core, Periphery and Evidentiality. Berlin/New York: Mouton de Gruyter [TIELS ], –. Ziem, Alexander, and Alexander Lasch (). Konstruktionsgrammatik: Konzepte und Grundlagen gebrauchsbasierter Ansätze. Berlin/New York: Mouton de Gruyter. Zifonun, Gisela, Ludger Hoffmann, and Bruno Strecker (). Grammatik der deutschen Sprache. Bände. Berlin/New York: Mouton de Gruyter. Zimmermann, Ilse (). ‘Syntactic categorization’, in Werner Bahner, Joachim Schildt, and Dieter Viehweger (eds), Proceedings of the Fourteenth International Congress of Linguists, Berlin, GDR, . Berlin: Akademie-Verlag, –. Zirkel, Linda (). ‘Prefix combinations in English: Structural and processing factors.’ Morphology : –. Ziv, Yael, and Barbara Grosz (). ‘Right-dislocation and attentional state.’ Proceedings of the th Annual Conference and of the Workshop on Discourse. The Israeli Association for Theoretical Linguistics, –. Zubizarreta, Maria-Luisa (). Prosody, Focus and Word Order. Cambridge, MA: MIT Press. Zupitza, Julius (ed) (). Aelfrics Grammatik und Glossar. Berlin: Weidmannsche Verlagsbuchhandlung. Zwicky, Arnold M. (). ‘On clitics.’ Bloomington, IN: Indiana University Linguistics Club. Zwicky, Arnold M. (). ‘Heads.’ Journal of Linguistics : –. Zwicky, Arnold M. (). ‘Suppressing the Z’s.’ Journal of Linguistics : –. Zwicky, Arnold M. (). ‘Heads, bases and functors’, in Greville G. Corbett, Norman M. Fraser, and Scott McGlashan (eds), Heads in Grammatical Theory. Cambridge: Cambridge University Press, –. Zwicky, Arnold M., and Geoffrey K. Pullum (). ‘Cliticization vs. inflection: English n’t.’ Language : –.
NAME INDEX
...................................
A
Aarts, B. , , , , , , , , , , n, , , , n, , , , , n, , , , n Aarts, B. and L. Haegeman , n, Aarts, B. and S. Wallis Aarts, B., S. Chalker, et al. Aarts, B., J. Close, et al. , , , Aarts, B., D. Denison, et al. , Aarts, F. and J. Aarts n Abney, S. P. n, , –, , , Aboh, E. n Abraham, W. Abraham, W. and E. Leiss Achard, M. , Ackema, P. and M. Schoorlemmer Ackerman, F., et al. Acqaviva, P. and P. Panagiotidis Acuña-Fariña, J. C. n Adcock, F. Ades, A. E. and M. Steedman Adger, D. , n, , Aelius Donatus Åfarli, T. A. Ágel, V. n, n, n Ágel, V., et al. n Aikhenvald, A. Y. and R. M. W. Dixon Ainsworth-Darnell, K., et al. Aitchison, J. n Ajdukiewicz, K. Akmajian, A. and A. Lehrer , , Alexiadou, A. , n, Alexiadou, A., E. Anagnostopoulou, et al. n Alexiadou, A., L. Haegeman, et al. n, , n, n Alexiadou, A., M. Rathert, et al. Allan, K. , ,
Allan, K. and K. M. Jaszczolt n Allen, M. R. Allerton, D. J. n, , n, , Allwood, J., et al. n Al-Mutairi, F. R. Aloni, M. and P. Dekker n Alston, R. C. Alston, W. Anderson, J. M. Anderson, S. R. , Anderwald, L. , , , –, –, n Anderwald, L. and B. Kortmann , n Andreou, M. and A. Ralli Andrews, A. Andrews, B. Anthony, L. , Anttila, A. and Y-M. Y. Cho Arbini, R. Ariel, M. , Aristotle , , Arnauld, A. and C. Lancelot Arnold, D. and A. Spencer Arnold, J. E., et al. Aronoff, M. , Aronoff, M. and M. Lindsay , Asudeh, A. , Asudeh, A. and G. Giorgolo Asudeh, A. and I. Toivonen Asudeh, A., M. Dalrymple, et al. Asudeh, A., G. Giorgolo, et al. Athanasiadou, A. et al. Atkinson, D. , Atlas, J. D. n Austen, J. n, Austin, J. L. , –
B
Bach, E. Bach, K. Bach, K. and R. M. Harnish Baerman, M., et al. Baeskow, H. Baker, M. C. –, , , , Baker, P. and E. Levon Baker, R. G. and P. T. Smith Barber, Ch. Bard, E. G., et al. , Bar-Hillel, Y. Barker, C. , Barker, C. and P. Jacobson Baron, D. Barsky, R. F. Bartels, C. Barth-Weingarten, D. and E. Couper-Kuhlen Bary, C. and D. Haug Bastiaansen, M., et al. Bates, E. and B. MacWhinney Bauer, L. , , n, , , , , , , , , , Bauer, L. and R. Huddleston Bauer, L. and A. Renouf Bauer, L., et al. n, , , , n, –, , , , , , , , , , , –, Baumgärtner, K. n Bazerman, C. , , Beal, J. , , , Beard, R. , n, n Beaugrande, R.-A. de and W. U. Dressler Beaver, D. I. n Beaver, D. I. and B. Geurts n Bell, M. J. – Bell, M. J. and I. Plag , Bemis, D. K. and L. Pylkkänen Benczes, R. Bender, E. M. and I. A. Sag n Ben-Shachar, M. et al. Berg, T. , Berko, J. , Bermúdez-Otero, R. Bermúdez-Otero, R. and K. Börjars , ,
Bermúdez-Otero, R. and P. Honeybone , Berwick, R. C., P. Pietroski, B. Yankama, and N. Chomsky Bianchi, V. Biber, D. –, , , , , , , Biber, D. and S. Conrad , , , , , , , – Biber, D. and B. Gray , –, , , , Biber, D. and R. Reppen Biber, D., et al. , , , , n, , n, , , , , , , , , –, , Bieswanger, M. Billig, M. Binnick, R. , , Birner, B. J. n, , Birner, B. J. and G. Ward n, , , , , –, , Blevins, J. P. Blevins, J. P. and I. Sag Bloch, B. Bloomfield, L. , , , –, , –, , , , , , Blumenthal-Dramé, A. n Boas, H. C. , n Bobaljik, J. D. , n, n, Bodomo, A. n Boeckx, C. Boehm, B. n Boersma, P. and D. Weenink Bolinger, D. , , , , Bonami, O. Bonami, O. and G. Stump Booij, G. , , , n, , , , , , Borer, H. , Borik, O. n Bornkessel, I. and M. Schlesewsky Bornkessel, I., et al. Bornkessel-Schlesewsky, I. and M. Schlesewsky Bornkessel-Schlesewsky, I., et al. , Borsley, R. D. , , n, Bošković, Ž. n
Bouma, G., et al. Bowerman, S. Bowers, J. Bowie, J. and B. Aarts , Bowie, J. and S. Wallis , n Bowie, J., et al. , Boyé, G. and G. Schalchli Boye, K. and E. Engberg-Pedersen Boye, K. and P. Harder Brazil, D. , – Breban, T. and C. Gentens Breivik, L. E. , Brekle, H. E. Brems, L. n, Brems, L. and K. Davidse n Brennan, J. and L. Pylkkänen , Brennan, J., et al. Brentari, D. Bresnan, J. , n, , , , , Bresnan, J. and J. Grimshaw Bresnan, J. and J. Hay Bresnan, J., A. Asudeh, et al. , –, n, , Bresnan, J., A. Cueni, et al. Brezina, V. and M. Meyerhoff n Brinton, L. J. n Brisard, F. Briscoe, E. J. Broca, P. Broccias, C. and W. B. Hollmann Bromberger, S. and M. Halle Brown, D. and A. Hippisley n, Brown, Gillian and G. Yule n Brown, Goold Brown, R. , Bruening, B. Bullokar, W. xxi–xxii, , Büring, D. n, n, Burridge, K. , Burrows, J. F. Burton-Roberts, N. n Burton-Roberts, N. and G. Poole Butler, C. S. , Butler, C. S. and F. Gonzálvez-García , , n Bybee, J. L. , , , , , n, , , , , n, , ,
Bybee, J. L. and S. Fleischman Bybee, J. L. and P. J. Hopper –, Bybee, J. L. and W. Pagliuca – Bybee, J. L., et al. n, , n, , , , ,
C
Campbell, L. Cann, R., et al. Caplan, D., et al. Cappelle, B. and I. Depraetere Cardinaletti, A. Carnap, R. Carnie, A. Carpenter, B. Carr, P. and P. Honeybone Carstairs-McCarthy, A. Carston, R. , –, Carter, R. Carter, R. and M. McCarthy Carter, R. and W. Nash Carver Cassidy, K. and M. Kelly , Cattell, R. Cauthen, J. V. Chafe, W. L. n, , –, , , , , Chapin, P. n Chapman, S. , Chaucer, G. Chaves, R. P. n Chemla, E. Chemla, E. and B. Spector Chen, D. and C. D. Manning Chen, E., et al. Chen, R. Cheshire, J., V. Edwards, et al. Cheshire, J., P. Kerswill, et al. Chierchia, G. – Chierchia, G. and S. McConnell-Ginet n Chierchia, G., et al. Chilton, P. Chomsky, N. , –, –, , , , , , , , , n, , n, n, –, n, , , , –, , –, , n, , , n, , , , , , , , , , , ,
Chomsky, N. and M. Halle , , , Christian, D. Chung, S. Church, K. Churchill, W. Cinque, G. , Cinque, G. and L. Rizzi n Clark, H. H. and S. E. Haviland , , Clarke, S. , , , Close, J. and B. Aarts , Coates, J. , , , Coene, M. and Y. D’hulst Coffin, C., et al. Cohen, M. X. Collins, C. n Collins, P. , n, , , , , Collins, P. and P. Peters Comrie, B. , , , n, n, Condoravdi, C. and S. Kaufmann Conrad, J. – Cook, G. Coote, C. Copestake, A., et al. Corbett, G. G. , , Corver, N. n Coseriu, E. Coulson, S. and T. Oakley Coulson, S., et al. , Couper-Kuhlen, E. Couper-Kuhlen, E. and T. Ono , Couper-Kuhlen, E. and M. Selting , Cowart, W. , Craenenbroeck, J. van and J. Merchant n Craenenbroeck, J. van and T. Temmerman n Crain, S. and J. D. Fodor Cristofaro, S. n, – Crnič, L., et al. Croft, W. , n, n, , , , , , , –, n, , , , , Croft, W. and D. A. Cruse , Cruse, A. D. n, , n, – Cruttenden, A. , , n Crystal, D. , , , , , , Crystal, D. and D. Davy Culicover, P. W. ,
Culicover, P. W. and R. S. Jackendoff , , n, , –, n, , , , , , , Culicover, P. W. and M. Rochemont Culpeper, J. Culpeper, J. and M. Kytö Cumming, S. et al. Curme, G. O. – Curzan, A. , Cutler, A. and D. Norris
D
Dąbrowska, E. Dąbrowska, E. and D. Divjak Dahl, Ö. n, n, , n, n, Dahl, Ö. and V. Velupillai Dalrymple, M. , , , Dalrymple, M., R. M. Kaplan, et al. Dalrymple, M., J. Lamping, et al. Daneš, F. Davidse, K. Davidse, K., et al. Davidsen-Nielsen, N. Davidson, J. Davies, M. , , n, , , , Davis, S. n Davis, S. and B. S. Gillon n Davis, S. M. and M. H. Kelly Davydova, J. Davydova, J., et al. Déchaine, R.-M. Déchaine, R.-M. and M. Wiltschko n Declerck, R. , n, , , n, , Declerck, R. and S. Reed n Declerck, R., et al. n, , Degand, L. and J. Evers-Vermeul , Degand, L. and A.-M. SimonVandenbergen Dehé, N. Dehé, N. and Y. Kavalova Delahunty, G. P. Delais-Roussarie, E., et al. de Marneffe, M.-C. and C. D. Manning Den Dikken, M. n, n Denison, D. , , n Denison, D. and A. Cort , n
Depraetere, I. Depraetere, I. and C. Langford n, , n Depraetere, I. and S. Reed n Depraetere, I. and A. Verhulst , De Smet, H. Dickens, C. Diessel, H. Dik, S. C. n, , , , , , – Dikker, S., et al. Di Sciullo, A.-M. and E. Williams Divjak, D. n, n Dixon, R. M. W. and A. Y. Aikhenvald , , Dodgson, M. Don, J. Don, J. and M. Erkelens , Dons, U. Dorgeloh, H. , , , , Dorgeloh, H. and A. Wanner , , –, Douthwaite, J. Downing, A. , n, Dowty, D. R. , , , , Dowty, D. R., et al. , Doyle, A. C. Doyle, R. Dreschler, G. Dronkers, N. F., et al. Drubig, H. B. Dryer, M. S. , , n Du Bois, J. W. , , Duffley, P. J. , , , –, , Dunbar, P. L. Dunmore, H. Durieux, G. and S. Gillis , Dürscheid, C.
E
Edwards, V. , Egan, T. , , Ehrlich, S. Ehrlich, V. Einstein, A. Elbourne, P. n, , n Eliot, T. S.
Ellis, N. C. Elsness, J. , Embick, D. , Embick, D. and R. Noyer , , Emmott, C. Emonds, J. E. , , Emons, R. , n, – Engel, U. n, , n Engel, U. and H. Schumacher Engleberg, S., et al. n Englebretson, R. Epstein, S. D., et al. Erteschik-Shir, N. , Evans, N. Evans, N. and S. Levinson Evans, V. Evans, V. and M. Green ,
F
Fabb, N. Fairclough, N. , Faltz, L. M. Fanthorpe, U. A. , Farmer, T. A., et al. , Fauconnier, G. Fauconnier, G. and M. Turner , Faulkner, W. , Fawcett, R. P. n, n, n, , Featherston, S. , , Fellbaum, C. n Fens-de-Zeeuw, L. Ferguson, C. A. Ferreira, F. Ferris, D. C. Féry, C. and V. Samek-Lodovici Fiebach, C. J., et al. Fiengo, R. and R. May, Fillmore, C. , –, , , , n, Fillmore, C. J., et al. , Filppula, M. , , , , Findlay, J. Y. , Firbas, J. , Firth, J. R. , Fischer, K. and A. Stefanowitsch Fischer, O. , , ,
Fischer, O. and W. van der Wurff n, n, Fitting, M. n Fleischman, S. , Fligelstone, S., M. Pacey, et al. Fligelstone, S., P. Rayson, et al. Fodor, J. A. , Foley, W. A. and R. D. Van Valin Jr. n, Folli, R. and H. Harley n Foolen, A. Ford, C. E. Fowler, R. , Fowler, R., et al. , Fowles, J. Fox, B. A. Francis, G. Francis, W. N. Franke, M. Frawley, W. , Frazier, L. and G. B. Flores d’Arcais Freed, A. F. Frege, G. , , Freidin, R. Fried, M. , Friederici, A. D. , Friederici, A. D. and S. Frisch Friederici, A. D., J. Bahlmann, et al. Friederici, A. D., A. Hahne, et al. , Friederici, A. D., M. Meyer, et al. Friederici, A. D., E. Pfeifer, et al. , Fries, C. C. , Fries, P. H. n Frisch, S., et al. –
G
Gabelentz, G. von der Gabrielatos, C. and P. Baker Gagné, C. Gahl, S. and S. M. Garnsey Gamut, L. T. F. , Garside, R. Garside, R. and N. Smith Gast, V. , Gavins, J. Gazdar, G. ,
Gazdar, G., E. H. Klein, et al. , , , n, Gazdar, G., G. K. Pullum, et al. Geach, P. T. n Gee, J. P. Geeraerts, D. n, , Geeraerts, D. and H. Cuyckens Geis, M. Geluykens, R. , Gensler, O. Georgakopoulou, A. and D. Goutsos , Gerdes, K., et al. n, n Gerken, L., et al. Gerwin, J. Geurts, B. Geurts, B. and N. Pouscoulous Geurts, B. and B. van Tiel Giannakidou, A. Gibson, E. and E. Fedorenko , , Giegerich, H. J. , , , , , Ginzburg, J. Ginzburg, J. and R. Fernández Ginzburg, J. and I. A. Sag , – Giorgi, A. and F. Pianesi n Girard, J.-Y. Gisborne, N. n, n, , Givón, T. , n, , n, , –, , , , , , n, n, Godfrey, E. and S. Tagliamonte Goldberg, A. E. , , , , , –, , –, , , n, n, , , n, , –, , Goldberg, A. E. and F. Ackerman Goldberg, A. E. and R. Jackendoff Golding, W. – Gómez-González, M. Á. n Goossens, L. Gotti, M. , Grabe, E., et al. Grafmiller, J. Graustein, G. and G. Leitner Gray, B. and D. Biber – Greaves, P. Greenbaum, S. and R. Quirk , Greenberg, J. n,
Gregory, M. L. and L. A. Michaelis Grewe, T., et al. Grice, H. P. , , –, n, , Gries, S. Th. , n, , n, n Gries, S. Th. and A. Stefanowitsch , , Gries, S. Th. and S. Wulff Grimshaw, J. n, , n Grimshaw, J. and S. Vikner Grodzinsky, Y. – Grodzinsky, Y. and A. Santi Groenendijk, J. and M. Stokhof , Gropen, J., et al. Gross, T. and T. Osborne – Grossman, E. and S. Polis Grosu, A. Gruber, J. S. Guéron, J. n Gueron, J. and J. Lacarme Guevara, E. and S. Scalise Guion, S. G., et al. Gumperz, J. J. Gundel, J. K. n, , n, , Gundel, J. K., et al. n Gunter, T. C., et al. Günther, C. n, Gussenhoven, C. , Gwynne, N. M.
H
Haas, W. Haddon, M. Haegeman, L. n, n, n, Haegeman, L. and J. Guéron Haegeman, L. and T. Ihsane Hagège, C. Hagoort, P. , Hagoort, P., C. Brown, et al. Hagoort, P., M. Wassenaar, et al. Hahne, A. and A. D. Friederici , Haig, G. and S. Schnell n Hale, K. and S. J. Keyser Hall, A. , – Halle, M. Halle, M. and A. Marantz , , , Halle, M. and J.-R. Vergnaud
Halliday, M. A. K. , , n, –, , , , , , , , , , –, , , , , Halliday, M. A. K. and R. Hasan , , Halliday, M. A. K. and C. M. I. M. Matthiessen –, , , , , Hamawand, Z. Hamblin, C. L. Hamilton, W. L., et al. – Han, C.-H. Harley, H. , –, Harley, H. and R. Noyer , Harman, G. H. Harris, R. A. Harris, Z. S. Harrison, C., et al. , Harrison, T. Hart, C. Harvey, T. Hasan, R. Haspelmath, M. , , , –, n, , Hatcher, A. G. Hawkins, J. A. , , , , Hay, J. Hay, J. and I. Plag Hayes, B. and C. Wilson Hayes, B., et al. Hedberg, N. n Hedberg, N. and L. Fadden n, Heger, K. n Heim, I. , Heim, I. and A. Kratzer n, – Heine, B. –, , , , Heine, B. and T. Kuteva Heine, B. and H. Narrog n Heine, B., et al. Helbig, G. , n, n, n Helbig, G. and W. Schenkel , n Hemingway, E. Hengeveld, K. , , n, Hengeveld, K. and J. L. Mackenzie , , , , n, Herbst, T. , , n, n, , Herbst, T. and S. Schüller n, , n, , , n, ,
Herbst, T. and P. Uhrig n Herbst, T., D. Heath, et al. n, , Heringer, H. J. , , Hermann, J. G. J. Herring, S. C. Herrmann, T. Hewitt, G. Hewson, J. n, , Hewson, J. and V. Bubenik Heycock, C. and A. Kroch n, Hickey, R. , n Hill, A. A. Hilpert, M. , , , , , , , Hiltunen, R. Himmelmann, N. and E. Schultze-Berndt Hinojosa, J., et al. Hinrichs, L., et al. , Hinzen, W., et al. n Hirschberg, J. Hirtle, W. H. , Hjelmslev, L. , Ho, Y. Hockett, C. F. Hodson, J. Hoey, M. Hoffmann, T. and G. Trousdale , Hofmeister, P. and I. A. Sag Hogg, R. M. Höhle, B., et al. Hollmann, W. B. –, n, , , Hommerberg, Ch. and G. Tottie Hopper, P. J. , n, , , Hopper, P. J. and S. A. Thompson Hopper, P. J. and E. C. Traugott n, , , , Horn, L. R. , Horn, L. R. and G. Ward n Hornby, A. S. Hornstein, N. n Howells, W. Hoye, L. F. Huang, C.-T. J. , , Huang, Y. n Huddleston, R. , , n, , , , n
Huddleston, R. and G. K. Pullum , , , , , , , , , , , , n, , , , n, , , , , , , , , , , , , , , , , n, n, n, , , , n, , , –n, , , n, n, , n, , , , , n, , Huddleston, R., et al. xxii, , n, , n, , , n, n, –, –, Hudson, R. A. , , , n, –, , , n, n, , , , , , , n, n, n, , –, n, Hughes, A., et al. – Hughes, R. , Hume, D. Humphries, C., et al. Hundt, M. , n, , , , , , – Hundt, M. and A. Gardner , Hundt, M. and G. Leech Hundt, M. and C. Mair Hundt, M., et al. n Hunston, S. and G. Francis , n Hunter, J.
I
Isel, F., et al. Iverson, G. K. and S.-C. Ahn
J
Jackendoff, R. S. , , , , , , , , , , , , , , , , , , , n, Jacobs, J. n, n, Jacobson, P. , , James, F. James, H. Jansen, W. Janssen, T. M. V. Jarrett, G. A. Jary, M. and M. Kissine , , Jaszczolt, K. M. Jeffries, L. , , , Jeffries, L. and D. McIntyre
Jespersen, O. xxii, , –, , , , , , , , , n, , Ježek, E. n Johannessen, J. B. , – Johnson, D. E. and P. M. Postal Johnson, Keith n Johnson, Kyle , Johnson, S. , Jones, D. Jonson, B. Jurafsky, D. n Jusczyk, P. W., et al. Just, M. A., et al. ,
K
Kaan, E. , Kaan, E. and T. Y. Swaab , Kaan, E., et al. Kachru, B. , , Kahan, J. n Kahane, S. n, Kahane, S. and T. Osborne n Kaisse, E. M. Kaleta, A. n Kallel, A. n Kaltenböck, G. n, Kaltenböck, G., et al. – Kamp, H. Kamp, H. and U. Reyle n, Kaplan, D. Kaplan, R. M. and J. Bresnan , Karlsson, F. Karttunen, L. , Karttunen, L. and S. Peters Kasher, A. n Kastovsky, D. n Katamba, F. and J. Stonham Katz, J. and E. O. Selkirk Kay, P. , Kay, P. and C. Fillmore , Kaye, J. , Kayne, R. S. , , , n Kazanina, N., et al. Kearns, K. n, , , n Keenan, E. and B. Comrie Keenan-Ochs, E. and B. Schieffelin n Kehoe, A. and M. Gee
Keizer, E. , , , , n, n, , n, , Keller, F. , Kelly, M. H. , Kelly, M. H. and J. K. Bock , , Kelly, M. H., et al. Kelman, J. Kempson, R. M. , Kempson, R. M., et al. Kenesei, I. , Kennedy, C. n, , Kennedy, C. and L. McNally , Kenstowicz, M. Kho, K. H., et al. Kilgariff, A., et al. , Kim, A. and L. Osterhout Kim, J.-B. and I. A. Sag Kim, J.-B. and P. Sells n Kiparsky, P. , Kiparsky, P. and C. Kiparsky Kiss, K. É. Kissine, M. Kitagawa, Y. Klavans, J. L. Klein, D. and C. D. Manning Klein, E. Klein, E. and I. A. Sag , n Klotz, M. n Klotz, M. and T. Herbst n Kluender, R. and M. Kutas , , Koch, P. and W. Oesterreicher Kokkonidis, M. König, E. König, E. and P. Siemund , , König, E., et al. Koontz-Garboden, A. Koopman, H. and D. Sportiche Koopman, H., et al. , Korta, K. and J. Perry Kortmann, B. , , , , , n Kortmann, B. and K. Lunkenheimer , –, , n Kortmann, B. and V. Schröter n Kortmann, B. and B. Szmrecsanyi , , Kortmann, B. and C. Wolk – Kortmann, B., et al. –
Koster, J. Kratzer, A. n, n Kratzer, A. and E. O. Selkirk Kreidler, C. W. Kress, G. and R. Hodge Kreyer, R. , Krifka, M. , , , , n, n, n, Krug, M. G. , , , –, , , n, Kruisinga, E. , n Kučera, H. and W. N. Francis , Kuhn, T. S. Kuno, S. , Kunter, G. Kuperberg, G. R. Kuperberg, G. R., et al. Kutas, M. and K. D. Federmeier Kutas, M. and S. A. Hillyard Kutas, M., et al. Kuteva, T.
L
Labov, W. n, n, , Ladd, D. R. , , , Ladd, D. R. and R. Morton Lakatos, I. , Lakoff, G. , , n, , , – Lakoff, G. and M. Johnson , n, Lakoff, R. Lambek, J. Lambrecht, K. , , , –, –, , , , , Landau, I. n Langacker, R. W. , –, , , , , , , , n, , , , , , –, –, –, , n, , , –, , , , , , , , n, n, –, , , , Langacker, R. W., et al. n Langendoen, D. T. and H. Savin Lapointe, S. Lappin, S. and C. Fox n Larson, R. n, n Lasnik, H. n Lasnik, H. and T. Lohndal , n Lass, R. n,
Lau, E. F., et al. Lee, S.-A. n Leech, G. N. , –, , , , , , , , Leech, G. N. and R. Garside Leech, G. N. and M. Short , , , – Leech, G. N. and J. Svartvik Leech, G. N., et al. , , , , –, , Lees, R. B. , Lehmann, C. , Leino, J. Lerner, G. H. Lev, I. Levi, J. , Levin, B. Levin, B. and M. Rappaport Hovav Levine, R. D. , Levinson, S. , , , , n, –, , , Levy, Y. Lewis, D. , Lewis, M. Lewis, S. and C. Phillips n Li, J. Liberman, M. and J. Pierrehumbert Liberman, M. and A. Prince Lieber, R. n, , , n, Lieber, R. and S. Scalise Lieber, R. and P. Štekauer n Lieven, E. V. , n Lightfoot, D. , , , , , , Lindstrom, J. et al. Linell, P. Linn, A. xxi, , , Littlemore, J. and J. R. Taylor Lobeck, A. Löbel, E. Löbner, S. , n Lohndal, T. , , Longacre, R. E. Longobardi, G. n López-Couso, M. J. and B. Mendez-Naya Los, B. , n,
Lounsbury, F. Lowe, J. J. Lowth, R. –, , , , – Luck, S. Lunkenheimer, K. n, , n Lyons, J. , , , , ,
M
Mackenzie, J. L. , Macnamara, J. MacWhinney, B. , MacWhinney, B., et al. Maddieson, I. Madlener, K. Mahlberg, M. Mahlberg, M., et al. n Maiden, M. n Maienborn, C., et al. n Mair, C. n, n, , –, Mair, C. and G. Leech Malchukov, A. n Maling, J. n Mann, W. C. and S. A. Thompson Marantz, A. , , , , Marchand, H. , Marcus, M., et al. Martí, L. n Martin, J. Martínez Insúa, A. E. Mastop, R. J. Matthews, P. H. , n, , n, –, , , , , n, Mattys, S., et al. Mazoyer, B. M., et al. McArthur, T. McCall Smith, A. McCarthy, Cormac McCawley, J. D. xxii, , McCloskey, J. , n McCoard, R. W. McCormick, K. , McEnery, T. and A. Hardie McGilvray, J. McGinnis-Archibald, M. , McGregor, W. B. , n McIntyre, D.
McKinnon, R. and L. Osterhout McMahon, A. McWhorter, J. , Melchers, G. , Mel’čuk, I. n, n, –, n Melloni, C. n, Merchant, J. n, , , , – Miall, D. and D. Kuiken Michael, I. xxii, , , Michaelis, L. and K. Lambrecht Miestamo, M. , Millar, N. Miller, C. Miller, D. G. Miller, G. A. n Miller, G. A. and N. Chomsky Miller, J. , , , , , Miller, J. and R. Weinert , Miller, J. E. Minkova, D. and R. Stockwell n Moens, M. and M. Steedman Mompean, J. Monaghan, C. , Monaghan, P., et al. , – Mondorf, B. Montague, R. , , , n, , Montalbetti, M. Montgomery, M. , , , Moreton, E. Morgan, J. L. Morrill, G. V. Moss, L. – Mufwene, S. , Mukherjee, J. Mulder, J. and S. A. Thompson Mulder, J., et al. Müller, P. O., et al. n Müller, S. , n Müller, S. and S. Wechsler – Münte, T. F. and H. J. Heinze Münte, T. F., et al. Murphy, M. L. n Murray, L. , –, , Murray, T. and B. L. Simon , , , , Muysken, P.
N
Nagle, S. J. Narrog, H. , , Narrog, H. and B. Heine Nathan, G. S. Neale, S. Neidle, C. Nelson, G., et al. , , , , , n, Nesfield, J. C. – Nespor, M. and I. Vogel Nevalainen, T. n, Nevalainen, T. and E. C. Traugott , n Neville, H., et al. Nevins, A. and J. Parrott Newlyn, L. Newmeyer, F. J. , n, , , , Nikolaeva, I. Noonan, M. Nordlinger, R. and L. Sadler , Nordlinger, R. and E. C. Traugott , Nouwen, R., et al. – Noyes, A. – Nunberg, G., et al. Nuyts, J. , , , ,
O
Occam, William of Öhl, P. O’Keefe, A. and M. McCarthy Orasan, C. Orton, H. and E. Dieth Osborne, T. n, n Osborne, T. and T. Gross Osborne, T. and S. Kahane n Osborne, T., et al. Osterhout, L. and P. Hagoort Osterhout, L. and P. J. Holcomb Osterhout, L. and L. A. Mobley Osterhout, L., et al. Östman, J.-O. and J. Verschueren n Övergaard, G. –
P
Palmer, F. R. , –, , , , , –, , , n Paltridge, B. and S. Starfield Panagiotidis, P. , –,
Papafragou, A. Partee, B. H. Partee, B. H., et al. n, , , Pater, J. Patten, A. Pawley, A. , , , Pawley, A. and F. H. Snider Payne, J. Payne, J. and R. Huddleston , , , n, , , Payne, J., R. Huddleston, et al. n, –, , Payne, J., G. K. Pullum, et al. Penhallurick, R. , Penrose, A. and S. Katz Perek, F. Perek, F. and A. E. Goldberg Pérez-Guerra, J. and A. E. MartìnezInsua Perlmutter, D. M. Perry, J. Phillips, C. Phillips, C., et al. , Phillips, S. Pickering, M. J. and V. S. Ferreira Pierrehumbert, J. Pierrehumbert, J. and J. Hirschberg Pierrehumbert, J., et al. Pietsch, L. , – Pike, K. Pinker, S. , , Pinter, H. Pintzuk, S. –n Plag, I. , Plag, I. and H. Baayen Plag, I., et al. Plank, F. Pollard, C. and I. A. Sag , –, , , , , Popper, K. – Portner, P. and B. H. Partee n Postal, P. M. n Postal, P. M. and G. K. Pullum n Potts, C. , –, n, Poutsma, H. n Priestley, J. Prince, A. and P. Smolensky
Prince, E. F. , , –, , , –, – Pullum, G. K. , n, –, , –, , , , , , Pullum, G. K. and R. Huddleston , Pullum, G. K. and J. Rogers – Pullum, G. K. and D. Wilson xxii, , Pulvermüller, F., et al. Pustejovsky, J. , n, Putnam, H.
Q
Quaglio, P. and D. Biber Quine, W. van O. Quirk, R. , , – Quirk, R., et al. xxii, –, , n, , , –, , , , , , n, , , , , , , , , , , , , n, n, n, , , –, , –, , , ,
R
Radden, G. and R. Dirven Radford, A. , n, Rainer, F. Ramchand, G. n, Ramchand, G. and C. Reiss n Randall, B. Rappaport Hovav, M. and B. Levin Rauh, G. Recanati, F. , , Reichenbach, H. , n Reinhart, T. , , , Renner, V. , Rett, J. , –, Rialland, A. Richards, M. Riemsdijk, H. C. van Rijkhoff, J. Ritz, M.-E. Rizzi, L. n, , n, n, n Robenalt, C. and A. E. Goldberg n Roberts, I. n, Roberts, I. and A. Roussou n, Robins, R. H. , Rochemont, M.
Rochemont, M. S. and P. Culicover , Rodefer, S. Rogalsky, C. and G. Hickok Rohdenburg, G. Rooth, M. n Rosch, E. n Rosch, E. and C. B. Mervis n Rosenbach, A. , Ross, J. R. xxii, , , , , , , Rossi, S., et al. Round, E. n Rumens, C. , Ruppenhofer, J., et al. n Russell, B. , , ,
S
Sacks, H., et al. , Sadock, J. M. , Sadock, J. M. and A. M. Zwicky , Sag, I. A. , , , , Sag, I. A., G. Gazdar, et al. Sag, I. A., T. Wasow, et al. , Salkie, R. , n, , Sanders, T. and H. Pander Maat Sanders, T. and J. Sanders , Sankoff, G. and H. Blondeau Sansom, P. – Santi, A. and Y. Grodzinsky Sapir, E. Sasse, H.-J. Sauerland, U. Saussure, F. de , , , Scalapino, L. Scalise, S. and E. Guevara , Scheer, T. , , Schilk, M. and M. Hammel n Schleppegrell, M. Schlobinski, P. Schmid, H.-J. n, , Schneider, A. n, Schneider, E. W. , Schönefeld, D. n, , n, n, n Schreier, D. and M. Hundt Schröter, V. and B. Kortmann Schumacher, H., et al.
Schütze, C. T. , , Schütze, C. T. and J. Sprouse n, Schuyler, T. Schwarzschild, R. Scott, M. , Searle, J. R. n, , , , –, , Selkirk, E. , , , , , Sells, P. Semino, E. Seoane, E. Sereno, J. A. Sereno, J. A. and A. Jongman , Seuren, P. A. M. Sgall, P., et al. Shaer, B. and W. Frey Shakespeare, W. , –, – Shaw, G. B. Sherman, D. Shi, R., et al. , Sidnell, J. Sidnell, J. and T. Stivers Siemund, P. , , , , , , , , , , , , , , –, , n Siewierska, A. n Siewierska, A. and W. Hollmann Silverstein, M. Simons, M. n Simpson, P. Sims, A. Sinclair, J. , –, –, n, , , , , Sinclair, J. and A. Mauranen – Singh, R., et al. Slobin, D. I. Smith, A. D. M., et al. Smith, C. A. Smith, C. S. , n, n, , , , Smith, J. and S. Holmes-Elliott Smith, N. and N. Allott Smith, N. and G. N. Leech n Snider, N. and I. Arnon , Sobin, N. Somers, H. L. n Sopher, H. n
Sóskuthy, M. and J. Hay Spencer, A. , , , , , , , , n, – Spencer, A. and A. Luís Spencer, A. and G. Popova Spencer, N. J. Spender, D. Sperber, D. and D. Wilson , n, Sportiche, D. Sprouse, J. and D. Almeida Sprouse, J., I. Caponigro, et. al. Sprouse, J., C. T. Schütze, et al. –, Sprouse, J., M. Wagers, et al. Stalnaker, R. C. , Stanley, J. , , Stanley, J. and Z. G. Szabó , , n, Starke, M. Stassen, L. Steedman, M. , , Stefanowitsch, A. and S. Th. Gries , , Steinhauer, K. and J. Drury Štekauer, P. , Štekauer, P., et al. n, Stenius, E. Stevenson, R. L. Stewart, T. W. Stirling, L. and R. Huddleston n Stivers, T. n, Stivers, T., et al. Stockwell, P. Stockwell, R., et al. –n Storrer, A. n Stowe, L. A. , Stowe, L. A., et al. Stowell, T. , n Stowell, T. and E. Wehrli Strawson, P. F. , , Stroud, C. and C. Phillips Stubbs, M. Stubbs, M. and I. Barth – Stump, G. J. , , , –, , Sturt, P. Sturtevant, E. H. n Swales, J. , ,
Sweet, H. , – Sweetser, E. E. , , – Swerts, M. and S. Zerbian Swift, J. , Szabó, Z. G. Szabolsci, A. and T. Lohndal Szczepek, B. Szmrecsanyi, B. n Szmrecsanyi, B. and B. Kortmann , , Szymanek, B.
T
Taavitsainen, I. Tagliamonte, S. , Tagliamonte, S., et al. Talmy, L. , , , Tannen, D. Tanner, D. Tanner, D. and J. G. van Hell Tarasova, E. Tarkington, B. Taylor, J. R. , , , , , , , , , , , , – Taylor, J. R. and K.-Y. Pang ten Hacken, P. Ten Wolde, E. and E. Keizer n Tesnière, L. , –, –, –, n, , , , Theses Thieroff, R. Thompson, S. A. , , n, , , Thompson, S. A. and Y. Koide Tieken-Boon van Ostade, I. , , Tognini-Bonelli, T. Tomasello, M. , , , Tomlin, R. S. Tonhauser, J., D. I. Beaver, and J. Degen Tonhauser, J., D. I. Beaver, C. Roberts et al. Traugott, E. C. n, , , , , , –, , n, – Traugott, E. C. and R. Dasher , , , , , , –, Traugott, E. C. and B. Heine n, Traugott, E. C. and G. Trousdale , , , Truckenbrodt, H. Trudgill, P. , , , , , , , n Trudgill, P. and J. Chambers Trudgill, P. and J. Hannah Tubau Muntañá, S. n
U
Upton, C. and J. D. A. Widdowson
V
Välimaa-Blum, R. Vallduví, E. , , n, Vallduví, E. and E. Engdahl van Benthem, J. and A. ter Meulen n van Bergen, L. Vandenberghe, R., et al. van der Auwera, J. , , van der Auwera, J. and A. Malchukov n van der Auwera, J. and V. Plungian , –, van der Auwera, J., et al. – van der Hulst, H. van der Sandt, R. Van de Velde, F. van Dijk, T. A. Van Eynde, F. van Gompel, R. P. G. and S. P. Liversedge van Herten, M., et al. van Kemenade, A. Van Langendonck, W. n Van linden, A., et al. – van Marle, J. n Van Peer, W. van Tiel, B., et al. Van Trijp, R. Van Valin, R. D. Jr. , Van Valin, R. D. Jr. and R. J. LaPolla n, , , n, n, Vater, H. n Velupillai, V. n Venditti, J., et al. Vendler, Z. , , , Verstraete, J.-C. , Villata, S., et al.
Virtanen, T. Visser, F. Th. n, von Fintel, K. Vorlat, E. , Vos, R. n, n
W
Wade, K. Wade, T. Wagers, M. and C. Phillips Wagner, M. Wagner, S. , , –, , n Wald, B. and L. Besserman Walker, T. Waller, T. n Wallis, J. Wallis, S. , , , , , n, , , , , n Wallis, S. and G. Nelson –, Waltereit, R. Wanner, A. , – Ward, G. L. , , Ward, G. L. and B. J. Birner n, Ward, G. L. and E. F. Prince Ward, G. L., et al. Warner, A. R. , , , –, , Wasow, T. , Wattam, S. Webelhuth, G. n Webster Wechsler, S. n, Weir, A. Welke, K. Wells, H. G. Wells, J. C. , Wellwood, A., et al. Welsh, I. , Werth, P. Westerhoff, J. – Whorf, B. L. –
Wichmann, A. Wiechmann, D. and E. Kerz Wierzbicka, A. , – Williams, E. Williamson, T. n Wilson, D. and D. Sperber n Wilson, G. Wiltshire, C. and J. Harnsberger Winkler, S. Wolff, S., et al. Wolfram, W. , Wood, M. M. n Woolf, V. Wright, J. Wurmbrand, S. n, n
X
Xu, Y. and C. X. Xu
Y
Yoshida, M., et al.
Z
Zanuttini, R. and P. Portner Zappavigna, M. Zec, D. and S. Inkelas Zeijlstra, H. – Ziegeler, D. P. , , , , , , , , , , Ziem, A. and A. Lasch Zifonun, G., et al. n Zimmermann, I. Zirkel, L. Ziv, Y. and B. Grosz Zubizaretta, M.-L. Zupitza, J. Zwicky, A. M. , , , n, , , Zwicky, A. M. and G. K. Pullum n,
SUBJECT INDEX
factorial designs – 3A (annotation, abstraction, analysis) model –
A
Abkhaz language aboutness see also topic; topicalization absolute universals – abstraction in corpora – abuses of performative speech acts academic writing –, , – vs. news writing – accent placement for focus – acceptability judgements – accessibility of discourse referents accessibility hierarchy Accomplishments , – Achievements , – acoustic phonetic cues actants action words see also verbs activation of information units – Active Filler Strategy – Activities , – actor and undergoer adherent adjectives adjacency condition adjectival modifiers, as noun phrases adjective clauses adjective phrases , – topicalization adjectives –, –, adherent antonyms – attributive , vs. predicative comparison –
as compounds – definitions denominal descriptive, vs. limiting – lumping – modal – ornative (X Y-ed) taking clauses as complement type-specifying function as valency carriers – in written language adjunct clauses adjunctizers, as subordinators or prepositions – adjuncts , vs. complements – mobility vs. stackability obligatory vs. optional – in valency theory – in verb phrases – in X-bar theory see also modifiers adverbial clauses, in discourse adverbials conveying stance in narrative language – in valency theory – adverbs – comparison – homophonous with prepositions modal – advertising language, parallelism affixation , , auxiliary clitics and clitics , – comparative – negative prefixation –
affixation (cont.) ordering phonological alternation – subject–verb agreement – African American English after-perfect afterthoughts (repairs) , agency, deletion – agent-oriented actions – Agentive (A) case agentivity, genre variation – agglutination Agree operations agreement and linear vs. hierarchical structure –, number subject–verb, regional varieties – violations , ain’t construction Aktionsart (situation types) –, , – allophones allosentences alternation and baselines – alternative double object construction – Alternative Semantics n ambiguity , , , n avoided by focus marking in modals structural –, American English , –, – American structuralism , amplitude, as phonotactic cue analysis of corpus data – analysis, different approaches xxii anaphora, for cohesion – anchoring see grounding angloversals , – animacy hierarchy annotation of corpora , AntConc tool , anterior temporal lobe – antonymy – aphasia Appalachian English , , , apposition
ARCHER (A Representative Corpus of Historical English Registers) areoversals argument-focus – argument reversal – see also inversion; passive construction argument structure –, – alternations argumental compounds –, argumentation , see also syntactic argumentation asemantic derivation – aspect and Aktionsart – effect on sentence properties – marking, regional varieties – perfect – progressive , – Aspects model (Chomsky) , assertions, vs. presuppositions – associative interaction evidence assumed familiarity model assumptions in theories attestation rate (AR) , –, – attributive adjectives , vs. predicative adjectives attrition of inflection in English Australian English , , , , , automatic parsing autonomy of language – vs. groundedness auxiliary assumptions auxiliary clitics – auxiliary verbs –, – in complementizer position , – different analyses xxii as indication of clause type – in verb phrases – Aviation English
B
back-formation vs. compounding background cognition backward slash rule Bahamian English
bahuvrihi compounds –, Bank of English™ corpus , bare argument ellipsis – bare existentials Bare Phrase Structure Grammar n baselines and alternation – be, as auxiliary – being to V construction –, big mess construction binarity of compounds – binary branching of phrase structure –, binary Merge – binding binding argument Binding Constraints Binding Theory n binominal syntagms – Black South African English, de-accenting bleaching blending theory blends blocking blood oxygen-level dependent (BOLD) signals – BNC see British National Corpus bootstrapping hypothesis – bottom-up generalization algorithms – boundedness and unboundedness branching –, bridging contexts – British Component of the International Corpus of English see ICE-GB British National Corpus (BNC) collocation analysis complements – exclamatives negation , , n-grams non-prototypical coordination , passive constructions n phonological analysis semi-modals , sentence structure , , size ,
spoken vs. written English , subordinating conjunctions valency –, word classes word order – broad focus – Broca’s area – Brown Corpus , , , , , Brown family corpora , , , –, B-Brown – B-LOB – Brown , , , , , FLOB , , Frown , n, , LOB –, , busy-progressive
C Cambridge Grammar of the English Language, The (CGEL; Huddleston and Pullum) – Cape Flats English , n Cardiff Grammar n Cardiff School n Caribbean English cartographic approach n case – distinctions, nouns vs. pronouns , loss in English pronominal – violations case languages , – Case Theory catastrophic change , Categorial Grammar , – categorial particularism categories – catena n catenative constructions n simple vs. complex , catenative-auxiliary analysis xxii causal relations c-command – Centre for Lexical Information (CELEX) corpus CG see Cognitive Grammar
CGEL see Comprehensive Grammar of the English Language, A; Cambridge Grammar of the English Language, The chains chunking circumstants classical morphemics clauses –, – comment n complements vs. adjuncts – coordination – vs. subordination – elliptical –, – embedding form types fragments – grounding headedness , – main clauses in subordinate form n Merge – phonological boundaries radicals –, , , relative regional varieties – separation by punctuation in spoken discourse –, – structure, regional varieties – subordinate – finite – modifiers vs. complements – non-finite – types , –, – mood as , unbounded dependency – cleft sentences – clefts , , – clippings – clitics –, – auxiliary – clusters closed interrogatives –, , , – clusters, lexical CM see Construction Morphology COBUILD project (Collins COBUILD English Grammar) , , , –, n
COCA (Corpus of Contemporary American English) agentivity complements , , compounds , modals , , non-prototypical coordination – pronouns pseudopartitives size spoken vs. written English , valency – co-construction in discourse – code, in modals coercion , – Cognitive Construction Grammar cognitive-functional linguistics , – Cognitive Grammar (CG; Langacker) xxii, –, , –, , –, n, – phonology Cognitive Linguistics –, , , , , n, on word classes – cognitive relations cognitive-semantic criteria for headedness COHA (Corpus of Historical American English) , , – coherence cohesion explicit linguistic devices – collection-noun (CollN) constructions Collins COBUILD English Grammar see COBUILD project collocations , , , collostructions – Combinatory Categorial Grammar ‘Comfortable box’ (Newlyn) comment clauses n commissive speech acts n common ground – see also presuppositions Communicative Grammar of English, A (Leech and Svartvik) comparatives – clauses
comparison categories – vs. descriptive categories compartmentalization of grammar and lexis competence vs. performance, in generative grammar Complement function complementation complementizer phrases (CPs) , – complementizers , – as subordinators or prepositions – complements vs. adjuncts – in chains clauses hierarchy – internal vs. external vs. modifiers –, – non-finite and prepositional phrases – relationship with modifiers subcategorization , – as subordinate clauses in valency theory , –, – in X-bar theory complex catenative constructions , n Complex Noun Phrase Constraint complexity – Complexity Based Ordering (CBO) n, compositional semantics , – compositionality –, – compounds argumental vs. non-argumental – binarity – definitions –, –, – endocentric vs. exocentric – headedness – models – morphology vs. syntax –, , – neo-classical semantics – subordinative vs. coordinative , syntax – types – word classes –
Comprehensive Grammar of the English Language, A (CGEL; Quirk et al.) –, , , , , – compression computational system (C) computer-mediated communication (CMC) , – conceptual metaphor theory conceptual overlap n conceptual vs. medial approach to written/ spoken distinction – conceptualization – concession concordancing tools – conditional coordination constructions conditional/implicational universals – conditionals – auxiliaries congruence conjunctions , – atemporal relations coordinating –, –, connectedness , – constative speech acts constituency –, , – constituency grammar , , –, , vs. dependency view constituent order, identifying clause types –, – constraints, in constructions Construction Grammar (Goldberg) xxii, –, , , –, n, , –, dependency and valency – in language learning – and other approaches – see also constructions Construction Morphology (CM) , , n, n construction types, compounds as – constructional/phrasal theories – constructionalization, of modal verbs – constructions , argument structure – definition – information packaging –
constructions (cont.) modal auxiliary –, morphological – container-noun (CN) constructions content, and function content-driven approach to derivational morphology context/code of auxiliaries contextual deletion contextual frame theory contextual inflection contextual relevance – contrast –, contrastive focus control vs. raising n control verbs convenience thresholds n conventional implicature , – conversation – see also spoken language Conversation Analysis (CA) , – conversational implicature (CI) – Cooperative Principle (CP; Grice) , –, n coordination –, , – clause level – of compounds , , –, non-prototypical – phrase level – in spoken language vs. written language vs. subordination –, – syndetic vs. asyndetic – copula ellipsis – pseudopartitives n Core Syntax (Adger) Coronal Lenition rule corpora American English (AmE) see Brown family corpora approaches to research – ARCHER corpus Bank of English™ corpus , baselines and alternation – bottom-up generalization algorithms – British Component of the International Corpus of English, see ICE-GB
British English (BrE) see Brown family corpora British National Corpus (BNC) collocation analysis complements – exclamatives negation , , n-grams non-prototypical coordination , passive constructions n phonological analysis semi-modals , sentence structure , , size , spoken vs. written English , subordinating conjunctions valency –, word classes word order – Brown family corpora , , , –, B-Brown – B-LOB – Brown , , , , , FLOB , , Frown , n, , LOB –, , Centre for Lexical Information (CELEX) COBUILD project (Collins COBUILD English Grammar) , , , –, n COCA (Corpus of Contemporary American English) agentivity complements , , compounds , modals , , non-prototypical coordination – pronouns pseudopartitives size spoken vs. written English , valency – COHA (Corpus of Historical American English) , , – Collins Corpus, see COBUILD project
corpora (cont.) collocations collostructions – concordancing tools – Corpus of Early English Correspondence (CEEC) Corpus of Late Modern English Texts (CLMET) , Corpus of Online Registers of English (CORE) DCPSE (Diachronic Corpus of Present-Day Spoken English) n, , , n, disadvantages evidence gained from – experimental corpus linguistics –, factual evidence , – frequency evidence – Helsinki Corpus of English Texts (HC) ICE (International Corpus of English) n, n, n, ICE-AUS ICECUP , , ICE-GB anaphora – clefting collocations –, , distributional analyses , , , – exclamatives modals , postponement – spoken language , – transitivity interaction evidence , –, , –, lexicons London-Lund Corpus (LLC) –, , see also Survey of English Usage Longman Grammar of Spoken and Written English (LGSWE) , n, , , , , , n-grams – natural language – NOW Corpus , Old Bailey Proceedings (OBP) , – Oxford Children’s Corpus parsed (treebanks) –, , , –, Quirk Corpus see Survey of English Usage sampling –
spoken discourse , – Standard Corpus of Present-day Edited American English, see Brown Corpus studies of genre Survey of English Usage (Quirk Corpus) , , , , tagged , Time Magazine tools and algorithms – Wellington Corpus of Written New Zealand English (WWC) n word sketches Corpus of American English (AmE) see Brown family corpora corpus-based linguistics , vs. usage-based approach – Corpus of British English (BrE) see Brown family corpora Corpus of Contemporary American English see COCA corpus-driven linguistics –, – Corpus of Early English Correspondence Corpus of Historical American English see COHA Corpus of Late Modern English Texts (CLMET) , corpus linguistics, role in grammar and lexis – Corpus of Online Registers of English (CORE) CorpusSearch tool –n, corpus stylistics – corrected treebanks see also treebanks Correspondence Principle n counterfactuality cranberry morphs creoloids n Critical Discourse Analysis (CDA) –, Critical Linguistics – cross-linguistic studies negative concord typology – vs. variation studies c-structure Curious Incident of the Dog in the Night-time, The (Haddon) cyclic methodology
D
data collection data sources –, , see also acceptability judgements; corpora; electrophysiology data; haemodynamic responses; reading time data Dative (D) case dative movement – DCPSE see Diachronic Corpus of Present-Day Spoken English de-accenting de-adjectival derivation nouns verbs declarative clauses , , – identifying declarative speech acts/performatives , n, , , , , – declaratives deduction vs. induction – Deep-Syntactic Dependency n defamiliarization defectiveness definite descriptions, and pronouns definite determiners degree words – deictic shift theory deixis, in tense , deletion, indefinite and contextual Delta Π demonstrations demonstrative adjectives – denominal derivation deontic modality , , –, , –, , , dependent-auxiliary analysis xxii dependency clauses, unbounded – dependency grammar –, –, , –, , , and constituency analysis vs. constituency view in constructionist frameworks – long distance dependencies – types of dependency according to Mel’čuk –
dependency relations in Word Grammar – dependents, nouns vs. pronouns , deranking – derivational approaches , derivational morphology , , , , – vs. inflectional morphology – theoretical approaches – descriptive adjectives, vs. limiting adjectives – descriptive categories vs. comparative categories determinatives , – Determiner function determiner phrases (DPs) n determiners functionalist approach as head of noun phrase xxii indefiniteness constraint deverbal derivation deviation Devon English Diachronic Corpus of Present-Day Spoken English (DCPSE) n, , , n, diachronic studies diachrony of modal verbs –, – diagraphs dialects , double negatives in literature – plural inflections on pronouns pronunciation of reflexives – see also regional varieties of English Dialogic Syntax (Du Bois) dialogue systems digital communication – digital genres dimonotransitivity , direct interpretation, vs. indirect interpretation – directional interaction evidence directive speech acts , , Discontinuous clefts discourse definitions effect on grammar –
role of grammar – vs. grammar – discourse contexts – discourse markers , – regional varieties discourse-pragmatic criteria for headedness – discourse referents mental representations – discourse structure model disjunctions disjunctive coordination Distributed Morphology (DM) , , n, , , –, , – distribution of word classes distributional analysis , –, distributional fragmentation distributionalism , , ditransitive constructions , , –, , , , n regional varieties – without subordinators divergent grammar , , – DM see Distributed Morphology do-support Dominion Post, The donkey pronouns n double dependency n double-head analysis – double negatives Chomsky’s generative grammar cross-linguistic comparison Jespersen’s Modern English Grammar – Murray’s English Grammar Quirk et al.’s CGEL Sweet’s New English Grammar treatment of see also multiple negation double-object constructions DP-hypothesis – ‘Droplets’ (Rumens) D-structure – dual-system theories , , dummy subjects duration – dynamic modality –
Dynamic Model of the development of Postcolonial Englishes dynamic semantics n, – dynamic syntax theory dynamicity –
E
Early Immediate Constituents principle early left anterior negativity (ELAN) – Early Modern English modality – multiple negation East African Englishes East Anglia English , –, , echo questions – economy (ontological simplicity) – electroencephalography (EEG) – electronic World Atlas of Varieties of English (eWAVE; Kortmann and Lunkenheimer) electrophysiology data – elegance (syntactic simplicity) , , – Éléments de syntaxe structurale (Tesnière) , n, , eliciting conditions of ERPs – ellipsis in coordinate structures n and corpus annotation of head nouns in noun phrases –, in spoken language –, elliptical clauses –, – embedded inversion, regional varieties emergent grammar emphasis of auxiliaries of modals end-focus – end-weight , – endocentric compounds –, endocentric structures, vs. exocentric structures endonormative stabilization English Grammar (Givón) English Grammar (Murray) – English Reader (Murray) entailments n, entrenchment ,
epistemic modality –, , , , , , , Erlangen Valency Patternbank n event-denoting nominalizations event-related fields (ERFs) , event-related potentials (ERPs) – evidence, from corpora – evoked activity and induced activity – exceptional case marking clauses exclamative clauses , –, – exclamatory adjectives exemplar-based model – Exemplar Theory exhaustification operator – existential modality existential there construction –, exocentric compounds –, , exocentric structures, vs. endocentric structures exonormative stabilization experimental corpus linguistics –, explanatory grammars – expletive insertion n, explicature/free enrichment –, explicitness, loss of exploration cycle expressive speech acts Extended Projection Principle (EPP) external deviation vs. internal deviation External Merge extraordinary balanced coordination (EBC) extraposition , – eye-tracking data –
F
factual evidence , – ‘false grammar’ see also grammatical errors FDG see Functional Discourse Grammar features of input felicity conditions feminism fictional genres , filler-gap dependencies – fillers and gaps – finite subordinate clauses – Finnish, inflection Fisher’s exact test
flapping FLOB corpus Fluid Construction Grammar focus –, – end-focus – movement , prosodic foregrounding – form–meaning pairings (schemas) , –, , , form of noun phrases formal-generative grammars, phonology , – formal judgement experiments – formal semantic approaches to speech acts – formalist vs. functionalist approach , , forward slash rule Foundations of Cognitive Grammar (Langacker) FrameNet project , n frames – frameworks, different approaches xxii free enrichment/explicature –, free indirect speech free morphemes, as markers of tense free relatives – French adjectives reflexivity stress patterns subjunctive frequency effects , frequency evidence –, fronting operations –, , – FROWN corpus FTFs see Fuzzy Tree Fragments function and content vs. form , , and structure function fusion – function words, as clitics – functional approaches to discourse to language change
functional-cognitive grammars, phonology Functional Discourse Grammar (FDG; Hengeveld and Mackenzie) , , –, n, information structure – Functional Grammar (FG; Dik) , functional heads – functional interpretation of ERPs – functional linguistics –, , – corpus-based vs. usage-based approach – grammar–discourse interface – hierarchical structure – information structure – language processing – and language typology – noun phrases –, – functional magnetic resonance imaging (fMRI) – Functional Sentence Perspective , functional structure functional typology, on word classes – future-projecting categories of modals future tense in English – Fuzzy Tree Fragments (FTFs) –, , ,
G G&B see Government and Binding (G&B) theory gapping , gaps and fillers – garden-path sentences , gender, pronominal – gender distinctions, nouns vs. pronouns , gender mismatch effect generalized conversational implicatures (GCIs) , , Generalized Phrase Structure Grammar (GPSG) , , , , , , generative approaches , , – to language change – generative grammar xxii, –, , , , –, n, n, – binary Merge –
c-command – hierarchical structure –, reflexives – VP ellipsis (VPE) – on word classes – Generative Lexicon model Generative Linguistics Generative Semantics xxii, , Generative Syntax (GS) –, , , generative transformational grammar generic modality genitive case –, – nouns vs. pronouns role of genre genre – agile vs. uptight – spoken language – vs. written language –, – studies of variation –, – written language – German , – interrogatives reflexivity Germanic languages de-accenting philology gerund – vs. present participle – in subordinate clauses gerund-participles , , – get-passive constructions – given-before-new principle , givenness, vs. newness –, n, – Givenness Hierarchy n glottal replacement rule Glue Semantics , – go and V construction – government Government and Binding (G&B) theory , , –, , governors – GPSG see Generalized Phrase Structure Grammar gradient grammar gradient phonetic effects grammar–discourse interface –, –, –, –
grammar and lexis –, – continuum in Construction Grammar – functional models – generative models – role of corpus linguistics – usage-based theories –, Grammar of Speech (Brazil) – grammarians – definitions of word classes – grammars, traditional – grammatical argumentation see syntactic argumentation grammatical complexity – grammatical errors exercises in identifying in literature – grammatical metaphor grammatical priming grammaticality acceptability judgements – and sentence processing speed grammaticalization n, , , –, , and modality –, – see also language change grammemes n grounding (anchoring) –, , , , , –, , GS see Generative Syntax Gullah variety
H
habitual aspects, regional varieties (had) better modal construction – haemodynamic responses – hashtags head inflection , Head-driven Phrase Structure Grammar (HPSG; Pollard and Sag) , n, –, , , , , , , , –, –, , , , , , headedness of clauses , – of compounds – in noun phrases – relational vs. non-relational nouns –
hearer-oriented vs. speaker-oriented approach Heart of Darkness (Conrad) – Heavy Noun Phrase Shift Helsinki Corpus of English Texts (HC) hierarchical structure –, , – hierarchy of individuation , ‘Highwayman, The’ (Noyes) – holistic views of language hortatives – host words How do you do? construction n HPSG see Head-driven Phrase Structure Grammar human language processing hyponyms , hypotheses , n testing with experimental corpus linguistics hypothesis-falsification approach
I
ICE (International Corpus of English) n, n, n, ICE-AUS ICECUP , , ICE-GB anaphora – clefting collocations –, , distributional analyses , , , – exclamatives modals , postponement – spoken language , – transitivity iconicity identifiability Identity Function Default idiomatic constructions – defining nouns in modal (had) better – idiomatic expressions , , – as constructions – idiosyncratic constructions illocutionary acts –
illocutionary force , –, – immediate observer effect imperative mood in Aviation English clauses , , –, –, let implicational/conditional universals – implicature theory , – impoverishment inalienable possession increments indefinite adjectives indefinite deletion indefinite determiners independent reference, nouns vs. pronouns , indeterminacy , n indexicals indexing of corpora Indian English, de-accenting indicative mood (factual modality) , indirect interpretation, vs. direct interpretation – indirect licensing indirect object shift – indirect observations indirect speech acts , induced activity and evoked activity – induction vs. deduction – inferential-realizational models infinitives contraction of to marker –, perfect forms vs. V-ing infixes n, Inflection Phrases (IPs) inflectional attrition inflectional category n inflectional morphology , – vs. derivational morphology – inflectional paradigms – informatics information given and new –, n new, accenting information structure –, –, –, –
approaches – definition – focus – ordering principles – syntactic form – topic – informational asymmetry Informative-presupposition it-clefts , inherent endpoints inheritance hierarchies Inheritors, The (Golding) – inner circle Englishes , Instrumental (I) case intensifying adjectives , n interaction evidence , –, , –, interactional linguistics (IL) – interjections internal deviation vs. external deviation Internal Merge – International Corpus of English see ICE International Corpus of English Corpus Utility Program see ICECUP interpretation, direct vs. indirect – Interpretive Composition , – Interpretive Progressive n interrogative clauses , , – identifying – interrogatives , –, adjectives and constituent noun phrases ellipsis – functional approach identifying open vs. closed –, – types see also questions intersubjectivity intonation inversion of auxiliaries embedded, regional varieties modals inverted T-model Invited Inferences Theory of Semantic Change (IITSC) Irish English , , –, , , reflexives
irrealis n, , see also subjunctive mood island constraint violations , island effects – islands it-clefts –, – it-extraposition – Italian, multiple negation
J
junction –
K
Key Word In Context (KWIC) –, KISS (Keep It Short and Simple) principle knowledge of language , , identifying idiomatic language – KWIC see Key Word in Context
L
L Englishes vs. L Englishes – Lancaster-Oslo-Bergen Corpus (LOB) , landmarks and trajectors –, , language acquisition , according to Chomsky – Construction Grammar in – distributional cues for identifying word class lexical categorization , , written language language change , – according to Otto Jespersen according to Henry Sweet approaches – catastrophic , incremental , mandative subjunctive – modality , , –, – mood , –, – see also grammaticalization language contact , , , n Language Faculty –, language processing, in functional linguistics –
language typology –, n, – language universals – Latin , learnability left anterior negativity (LAN) , left dislocation , – in spoken language , legal texts lesion mapping let-imperative Level Ordering (Stratal Ordering) lexeme-based models of morphology , , lexeme formation, vs. word creation –, lexeme individuation problem lexeme-and-paradigm models , lexemes lexemic relatedness – lexical categories xxii, – assignment – changed in literature in Cognitive Grammar (CG) – of compounds – definitions – implicational connections language acquisition , , morphology – phonological cues – phonology – reconceptualization – structuralist analysis – tagging tools – theories – as valency carriers – lexical decision experiment , Lexical-Functional Grammar (LFG; Bresnan et al.) , n, –, , n, , , , , , , n, , lexical-grammatical searches lexical integrity lexical opposition – Lexical Phonology (Kiparsky) , – lexical relatedness , , , , – lexical retention lexical semantics , –
lexicalist hypothesis lexicalist theories – lexicalization, of compounds – lexicon , , – models without lexicon–grammar interface – in Construction Grammar – lexis and grammar –, – functional models – generative models – role of corpus linguistics – usage-based theories –, lexis, simple vs. complex LFG see Lexical-Functional Grammar LGSWE see Longman Grammar of Spoken and Written English limiting adjectives vs. descriptive adjectives – linear order vs. hierarchical structure –, vs. structural order – linear structure Linear Unit Grammar (Sinclair and Mauranen) – linearization process Linguistic Inquiry (LI) journal , literal force hypothesis literary language , – approaches – corpus stylistics – difference from everyday language – nominal vs. verbal grammar – non-standard forms – spoken forms – ungrammaticality – verbal delay – live reports LOB see Lancaster-Oslo-Bergen Corpus localization of grammatical operations locutionary acts – Logical Form Semantics , , n logogenesis London-Lund Corpus –, , see also Survey of English Usage long distance dependencies –
long distance dependency clauses see unbounded dependencies Longman Grammar of Spoken and Written English (LGSWE) , n, , , , , , lumping adjective classes – vs. splitting –
M
metaphor magnetoencephalography (MEG) –, Mainstream Generative Grammar Major Phonological Phrase (MaP) Mandarin Chinese, interrogatives mandative subjunctive – manipulation verbs – manner maxim , Manx language mapping , – Mapping Theory n ‘Margaret in the Garden’ (Davidson) marking future tense – progressive aspect – tense – matching conditions , maxims, conversational meaning – argument structure – boundary issues – compositional semantics – lexical semantics , – pragmatics – measure-noun (MN) constructions –, medial vs. conceptual approach to written/ spoken distinction – medial object perfect medical language , , spoken mental lexicon mental representations of discourse referents – Merge operation –, –
metaphor, and modality – metonymy and modality Middle English double negatives modality , mood marking negative prefixation mimesis mind-style – minimal type theory Minimalist framework , , –, –, , , , –, n Minimalist Program (MP) n, –, , Minimalist syntax, and Distributed Morphology misfires of performative speech acts Mismatch Negativity (MMN) mobility vs. stackability of adjuncts modal auxiliary constructions –, , – modal source – modal verbs – diachrony –, – frequency data – future tense – performativity – in tag questions modality – adverbs and adjectives – aspect conceptualization – conditional as – deontic , , –, , – in discourse analysis , – dynamic – epistemic –, , , , existential/generic and grammaticalization –, , , –, – and constructionalization – grounding – movement – non-epistemic –, root subjectivity –
model theory – Modern English Grammar on Historical Principles (Jespersen) – modification modifiers vs. complements , –, , – in compounds – of nouns – subordinate clauses as , see also adjuncts modularity , modularity hypothesis –, moment of speech (MoS) , , –, Montague Grammar Montague Semantics –n, mood , , – grammatical change , –, – Morpheme Structure Conditions (MSCs) morphological constructions – morphological dependency (Morph-D) – morphology –, alternative models to the classical approach – in Cognitive Grammar – definitions – derivational vs. inflectional – different types – Distributed Morphology (DM) – lexical vs. syntactical vs. syntax in compounding –, , – theoretical approaches – word classes – word structure –, – morphomes morphomic stems – morphosemantic mismatches , – morphosyntactic criteria for headedness – morphotactics MoS see moment of speech Move operations – movement – movement test for constituency – moving window self-paced reading tasks
multiple negation – see also double negatives multiword expressions (MWEs) –, n, , mutual dependency n mutual entailment
N
N effect , – naïve falsification narrative language , , , , –, – layers – Narrative Mode (Smith) narrow focus National Curriculum Glossary natural language corpora – natural language processing (NLP) negation of auxiliaries , contractions double see double negatives historical context – of imperative clauses – of modals regional varieties – of tag questions –, Negation in English and Other Languages (Jespersen) negative concord – see also double negatives NeighborNet , , neo-classical compounding , neologisms – ne-prefixation – Network Morphology n, neuroscientific approaches never as negator New English Dictionary on Historical Principles New English Grammar, Logical and Historical, A (Sweet) – New Zealand English , , Newfoundland English , , , , newness vs. givenness –, n, – news writing –
vs. academic prose – genre vs. personal letters n-grams –, NICE (Negation, Inversion, Code/Context, Emphasis) properties of auxiliaries –, –, , – nominal phrases see noun phrases nominalization , –, in literary language – suffix -er – non-argumental compounds – non-compositional meaning non-finite subordinate clauses – non-relational nouns vs. relational nouns – nonsense words – non-sentential units (NSUs) – North of England variety , , , alternative double object construction – Northern Subject Rule – noun clauses noun phrases –, appositive complexity – as constituents – different analyses xxii ellipsis in functionalist approach –, – gaps – headedness – internal structure – in spoken discourse substituted by pronouns nouns classed alongside pronouns – as compounds – de-adjectival definitions –, –, –, – denominalization gerunds vs. present participles – grounding inflection – mass vs. count as modifiers –, plural morphology ,
nouns (cont.) pronouns as subclass relational vs. non-relational headedness – as valency carriers – novel grammatical events, in corpora novel words, in corpora NOW Corpus , null hypotheses n number agreement and linear vs. hierarchical structure –, numbers ordinal numeral adjectives
O
object control vs. subject control n object words see also nouns objectification and subjectification – Objective (O) case obligatoriness vs. optionality of adjuncts – obligatory bound roots observations, indirect Occam’s Razor, Principle of , , Old Bailey Proceedings (OBP) corpus , – Old English development into Modern English double negatives gerunds modality –, , , –, – mood marking – negative prefixation – reflexivity word order onomasiological approach ontological simplicity (economy) – open propositions , Optimality Theory (OT; Prince and Smolensky) n, n, , – optionality , – vs. obligatoriness of adjuncts – ordering principles –
ornamental rule complexity orthography, of compounds outer circle Englishes , overdifferentiation , overgeneration n Oxford Advanced Learner’s Dictionary (Hornby) Oxford Children’s Corpus Oxford English Dictionary (OED) n Oxford University Ozark English
P
P effect (Pb) P effect (syntactic positive shift) , , – Pakistani English Pamphlet for Grammar (Bullokar) xxi Pāṇinian Determinism paradigm-based models of morphology Paradigm Function Morphology (PFM; Stump) , , – paradigmatic atrophy paralinguistic communication Parallel Composition , – parallelism – parsability parsed corpora (treebanks) –, , , –, parsimony principle parsing stage parsing units part-noun (PN) constructions partially schematic constructions participant roles particles n particularized conversational implicatures (PCIs) , – partitives parts of speech see word classes passive construction –, – in academic writing – get – imperative clauses valency , passivization of constituents
past tense, regional variation of to be Pattern Grammar (Hunston and Francis) , n pedagogy, written language perfect periphrasis perfect tense/aspect – regional varieties – performance vs. competence, in generative grammar performative hypothesis – performatives/declarative speech acts , n, , , , , – performativity in modal verbs – periphrasis , – perlocutionary acts , person distinctions, nouns vs. pronouns , personal letters, vs. news writing Personal Nouns – personification perspective, in narrative text PFM philology Philosophical Transactions of the Royal Society Phonetic Implementation Rules phonetics, contributions of Henry Sweet phonological dependency phonological representations – phonology – compositionality hierarchical structure – in lexical categorization – as part of a grammar – sentence-level – and word classes – word-level – phonotactic cues – phrasal categories – phrasal/constructional theories – phrasal verbs, as compounds – phrase structure –, adjective phrases – branching –, components – coordination –, –
headedness – nominal phrases –, – prepositional phrases – rules (PSRs) , syntax verb phrases – violations –, phrases, phonological boundaries Phrases in English website pidgins and creoles , – Pincher Martin (Golding) pitch contours plain text corpora plural inflections, nouns vs. pronouns , pluralization , poetry , , , – polar questions –, , , – polarity emphasis of auxiliaries polarity tags – polyfunctional properties of modal verbs polysemy n Position-of-Subject Constraint possessive adjectives – possessive inflection – possessives prenominal postponement – ‘Potatoes’ (Sansom) – poverty of the stimulus – pragmatic approaches to speech acts – pragmatics , – pragmatics–semantics interface – Prague Linguistic Circle , Prague School , , predicate-focus – predicative adjectives vs. attributive adjectives prefixation, negatives – prenominal possessives Prenucleus function , prepositional complements – prepositional phrases – as adjuncts as clause fragments as complements and modifiers as modifiers and pseudopartitive constructions –
prepositional verbs, constraints prepositions ambiguity use in apposition atemporal relations classified separately from subordinating conjunctions – as compounds definitions – on – subordinate clauses as – taking clauses as complement as valency carriers prescriptivism , present participles vs. gerunds – present perfective paradox n presuppositions –, –, vs. assertions – preterite present verbs , , n, priming –, Principle of Compositionality Principle of End Focus , Principle of End Weight – Principle of Unmarked Temporal Interpretation (Declerck) Principles and Parameters (P&P) theory , , problem of induction problem of quantifier domain restriction –, pro-forms interrogative – in substitution tests progressive aspect , , , – and Aktionsart –, regional varieties – Progressive of Affect n, prohibitives – Projection Principle projection problem promiscuous attachment , – pronominal systems, regional varieties – pronominalization test – pronouns case – classed alongside nouns –
coreference and definite descriptions dropping exchange – gender – inflection – resumptive – as subclass of nouns substituting constituent noun phrases weak and strong proper adjectives property words see also adjectives; modifiers propositional acts propositional synonymy propositions , Prosodic Bootstrapping hypothesis – prosodic constraints –, – Prosodic Hierarchy Prosodic Word (PWd) prosody, word class alternations prototypes n pseudo-clefts see also wh-clefts pseudo-coordinative constructions pseudopartitives , – pseudo-passive constructions psycholinguistics, identifying word classes – psychological realism punctuation, separation of clauses
Q
quality maxim quantifier domain restriction –, quantifier-noun (QN) constructions –, quantity maxim questions echo – intonation polar –, , , – speech acts as – tag –, , – see also interrogatives ‘Queue’s Essentially, The’ (Dunmore) Quirk Corpus see Survey of English Usage
R
raising vs. control n rank scales reading time data – Readjustment Rules reconceptualization of word classes – recursion n reference point, cognitive – Reference point, tense –, referential dependencies –, referential index reflexives –, , in regional varieties – role of genre regional varieties of English n, , – aspect marking – clause structure – distance between areas double negatives in literature – negation – plural inflections on pronouns pronominal systems – relative clauses – subject–verb agreement – tense marking – typological approach – World Englishes –, – angloversals , – distinctive and diagnostic features – morphosyntactic variation – variety types – register see also genre relation/relevance maxim relational grammar relational nouns vs. non-relational nouns – relative adjectives relative clauses , regional varieties – Relevance Theory n, repairs (afterthoughts) repetition – representative speech acts , , research bias
restrictive and non-restrictive relative clauses result nominalizations resultative constructions Resultative Imperfective n resumptive pronouns – Rhetorical Structure Theory Richness of the Base , right dislocation , in spoken language Rising Principle – Role and Reference Grammar , , –, n, n, , Romance languages de-accenting subjunctive mood root modality , , roots rule-to-rule hypothesis Russian irregularities in inflection segmentation Russian formalism ,
S
sampling – sampling frame Sapir–Whorf hypothesis scalar implicature – scale structure – schema-instance relations – schemas (form–meaning pairings) , , schematic constructions –, nominal constructions – Science journal scientific language –, , – scope Scottish English , , , , , scrambling , S-curve model of language change second language acquisition, Construction Grammar – Segmentation Problem –, self-paced reading data – Semantic Coherence Principle n semantic dependency (Sem-D) ,
Semantic Function Hierarchy semantic maps semantic primitives – semantic representations – semantic violations – semantics, of compounds – semantics–pragmatics interface – semantics–syntax interface – semi-auxiliaries , semi-modals , , , language change –, , , – Semiotic Grammar n sentence-focus – sentence hierarchy – sentence phonology – sentence processing Broca’s area – and grammaticality sentence radicals n sentences in spoken language , sentential meaning, effect of tense and aspect – sentential negation – Separation Hypothesis/Separationism (Beard) , Sequence of Tense sequential scanning vs. summary scanning – serial verb construction , SFG see Systemic Functional Grammar Shetland English , Short Introduction to English Grammar (Lowth) Sign-Based Construction Grammar (SBCG) , n similarity relations simple catenative constructions , Simpler Syntax Hypothesis (Culicover and Jackendoff ) simplicity – Single Competence Hypothesis (Marantz) single-system models , –, Singlish n situation types (Aktionsart) –, , – situation-type shift
SketchEngine – SLASH feature small clauses social class social varieties, in literature – Southeast of England variety , , Southwest of England variety , –, , Spanish inalienable possession multiple negation speaker-oriented vs. hearer-oriented approach specifiers , in X-bar theory speech acts , – illocutionary force , –, – meaning – speech-making, parallelism Spell-Out function splitting vs. lumping – spoken language –, – data for spontaneous speech represented in literature – vs. written language –, –, – spontaneous speech S-structure – stackability vs. mobility of adjuncts stance – Standard Average European – Standard Corpus of Present-Day Edited American English, see Brown Corpus Standard English n definition – Stanford Parser , States , – statistical inference statistical universals stative verbs storage – storytelling – stranding Stratal Optimality Theory Stratal Ordering (Level Ordering) stratified samples
stress in compounds patterns –, – and word classes – stress shift, triggered by affixes Stressed-focus it-clefts strict ellipsis account – strong lexicalism structural order vs. linear order – structuralist approach to word classes – structure, and function structured inventory of units – stylistics approaches – corpus approach – subcategorization , , – subject – complements control vs. object control n gapping semantic roles Subject Island Constraint subject–verb agreement regional varieties – test , subjectification and objectification – subjectivity , absent in dynamic modality in modality – subjectless clauses – subjunctive mood (non-factual modality) , , –, grammatical change – mandative – subordinating conjunctions, classified separately from prepositions – subordination , – vs. coordination –, – finite subordinate clauses – genre variation modification vs. complementation – non-finite subordinate clauses – subordinative compounds , substitution of head nouns in noun phrases – test for constituency –
summary scanning vs. sequential scanning – superadditive effects superiority effect superlatives supplementation suppletive derivation – suprasegmental constraints –, – Surface-Syntactic dependency (SSynt-D) n surface syntactic dominance criteria Survey of English Dialects (Orton) Survey of English Usage –, , Quirk Corpus , suspended affixation suspensions Sydney School (Systemic Functional Grammar) syllable onset requirement symbolic units – syncretism , , syndetic vs. asyndetic coordination – synonymy – syntactic argumentation , constituency – general principles – syntactic dependency (Synt-D) –, , syntactic island constraints syntactic positive shift (P) , , – syntactic representations – syntactic simplicity (elegance) , , – syntactic tense and aspect – syntactic tests, modifiers vs. complements – syntactic violations –, syntax according to Henry Sweet structured inventory of units – symbolic units – vs. morphology in compounding –, , – syntax-driven models of morphology syntax–semantics interface – synthetic approach to word classes –
Systemic Functional Grammar (SFG; Halliday, Sydney School) , , –, , , n, –, Systemic Functional Linguistics (SFL)
T
tabular function tag questions –, , – tagged corpora , tagmemics Tasmanian English taxonomy building telicity , – templates Temporal Discourse Interpretation Principle (Dowty) tense –, and Aktionsart – conditional as – deictic (absolute) vs. anaphoric (relative) n effect on sentence properties – in finite subordinate clauses future in English – grounding marking, regional varieties – movement – past vs. non-past – perfect – Text World Theory texture TG see Transformational Grammar that-clauses –, , thematic roles theme–rheme structure , theory-driven linguistics – theta-roles n theticals TigerSearch timeless truths – to-contraction –, tone topic –, – it-clefts and wh-clefts Topic-Familiarity Condition n topicalization , , –, , – of constituents
traditional grammar – trajectors and landmarks –, , transfer – Transfer-Caused-Motion construction n transformational approaches , transformational component transformational framework n transformational-generative grammar Transformational Grammar –, , , transitive phrasal verbs n transitivity framework , transpositional lexemes transpositions , –, –n, n tree fragments – treebanks (parsed corpora) –, , , –, triangulation – truncations truth conditions, synonymy try and V construction – turn-taking analysis , two-component theory of aspect type-driven translation Type-Logical Grammar type-specifying functions Type-of-Subject Constraint type theory – typological approach, regional varieties –
U
Unaccented anaphoric focus clefts unarticulated constituents unbounded dependencies –, , – undergoer and actor Universal Grammar universal statements, truth of universals – ‘Unprofessionals, The’ (Fanthorpe) usage-based approach vs. corpus-based approach – to language change phonology theories –,
usage events utterance-type meanings
V
vagueness n valency alternations Valency Dictionary of English (VDE; Herbst et al.) , , n valency grammar –, Valency Realization Principle n valency theory , –, –, – in constructionist frameworks – valency carriers – variation studies vs. cross-linguistic studies variationist approach varioversals velar softening verb phrases – as constituents substituted by pro-forms verbs argument structure – auxiliary –, – categorization difficulties – complements vs. adjuncts – hierarchy of – as compounds , , – de-adjectival , definitions – delayed in literary language – dependency relation with subjects in finite subordinate clauses gerunds vs. present participles – inflection –, in literary language – modal – diachrony – grammatical change , , –, – performativity – morphology , nominalization suffix -er – present progressive preterite present n,
progressive aspect , – raising semantic links serial construction , temporal relations tense –, to go –, transitives without objects transitivity valency , –, –, – view from the periphery viewing arrangements – visibility voice alternations VP ellipsis (VPE) –,
W
was-levelling WAVE see World Atlas of Varieties of English Webster’s Third New International Dictionary , weight distribution – well-formedness –, Wellington Corpus of Written New Zealand English n Welsh English , were-levelling West Coast Functionalism , n, wh-clefts –, – in spoken language wh-dependencies wh-exclamatives , wh-forms, introducing subordinate clauses wh-islands wh-movement , , , wh-phrases free relatives – in reflexives wh-questions –, – answering in discourse – ellipsis – information structure whether-island violation – White South African English , n whole-part relations
word classes xxii, – assignment – changed in literature in Cognitive Grammar (CG) – of compounds – definitions – implicational connections language acquisition , , morphology – phonological cues – phonology – reconceptualization – structuralist analysis – tagging tools – theories – as valency carriers – word creation, vs. lexeme formation –, word formation processes Word Grammar (Hudson) –, n, , –, n dependency and valency word order canonical vs. non-canonical declarative clauses in discourse contexts – in English genre variation – identifying clause types –, –
implicational universals – information packaging constructions – language typology and modality regional varieties – statistical universals word-and-paradigm models word phonology – word sketches word structure –, – WordNet n WordSmith tool , World Atlas of Varieties of English (WAVE; Kortmann and Lunkenheimer) , – World Atlas of Varieties of English, electronic (eWAVE; Kortmann and Lunkenheimer) World System of (Standard and NonStandard) Englishes , – written language – vs. spoken language –, –, – wug study (Berko)
X
X-bar theory , , –, , n, , –
OXFORD HANDBOOKS IN LINGUISTICS
THE OXFORD HANDBOOK OF AFRICAN AMERICAN LANGUAGE Edited by Sonja Lanehart
THE OXFORD HANDBOOK OF APPLIED LINGUISTICS Second edition Edited by Robert B. Kaplan
THE OXFORD HANDBOOK OF ARABIC LINGUISTICS Edited by Jonathan Owens
THE OXFORD HANDBOOK OF CASE Edited by Andrej Malchukov and Andrew Spencer
THE OXFORD HANDBOOK OF CHINESE LINGUISTICS Edited by William S-Y. Wang and Chaofen Sun
THE OXFORD HANDBOOK OF COGNITIVE LINGUISTICS Edited by Dirk Geeraerts and Hubert Cuyckens
THE OXFORD HANDBOOK OF COMPARATIVE SYNTAX Edited by Guglielmo Cinque and Richard S. Kayne
THE OXFORD HANDBOOK OF COMPOSITIONALITY Edited by Markus Werning, Wolfram Hinzen, and Edouard Machery
THE OXFORD HANDBOOK OF COMPOUNDING Edited by Rochelle Lieber and Pavol Štekauer
THE OXFORD HANDBOOK OF COMPUTATIONAL LINGUISTICS Edited by Ruslan Mitkov
THE OXFORD HANDBOOK OF CONSTRUCTION GRAMMAR Edited by Thomas Hoffmann and Graeme Trousdale
THE OXFORD HANDBOOK OF CORPUS PHONOLOGY Edited by Jacques Durand, Ulrike Gut, and Gjert Kristoffersen
THE OXFORD HANDBOOK OF DERIVATIONAL MORPHOLOGY Edited by Rochelle Lieber and Pavol Štekauer
THE OXFORD HANDBOOK OF DEVELOPMENTAL LINGUISTICS Edited by Jeffrey Lidz, William Snyder, and Joe Pater
THE OXFORD HANDBOOK OF ELLIPSIS Edited by Jeroen van Craenenbroeck and Tanja Temmerman
THE OXFORD HANDBOOK OF ENDANGERED LANGUAGES Edited by Kenneth L. Rehg and Lyle Campbell
THE OXFORD HANDBOOK OF ENGLISH GRAMMAR Edited by Bas Aarts, Jill Bowie, and Gergana Popova
THE OXFORD HANDBOOK OF ERGATIVITY Edited by Jessica Coon, Diane Massam, and Lisa deMena Travis
THE OXFORD HANDBOOK OF EVENT STRUCTURE Edited by Robert Truswell
THE OXFORD HANDBOOK OF EVIDENTIALITY Edited by Alexandra Y. Aikhenvald
THE OXFORD HANDBOOK OF EXPERIMENTAL SEMANTICS AND PRAGMATICS Edited by Chris Cummins and Napoleon Katsos
THE OXFORD HANDBOOK OF GRAMMATICALIZATION Edited by Heiko Narrog and Bernd Heine
THE OXFORD HANDBOOK OF HISTORICAL PHONOLOGY Edited by Patrick Honeybone and Joseph Salmons
THE OXFORD HANDBOOK OF THE HISTORY OF ENGLISH Edited by Terttu Nevalainen and Elizabeth Closs Traugott
THE OXFORD HANDBOOK OF THE HISTORY OF LINGUISTICS Edited by Keith Allan
THE OXFORD HANDBOOK OF INFLECTION Edited by Matthew Baerman
THE OXFORD HANDBOOK OF INFORMATION STRUCTURE Edited by Caroline Féry and Shinichiro Ishihara
THE OXFORD HANDBOOK OF JAPANESE LINGUISTICS Edited by Shigeru Miyagawa and Mamoru Saito
THE OXFORD HANDBOOK OF LABORATORY PHONOLOGY Edited by Abigail C. Cohn, Cécile Fougeron, and Marie K. Huffman
THE OXFORD HANDBOOK OF LANGUAGE AND LAW Edited by Peter Tiersma and Lawrence M. Solan
THE OXFORD HANDBOOK OF LANGUAGE AND SOCIETY Edited by Ofelia García, Nelson Flores, and Massimiliano Spotti
THE OXFORD HANDBOOK OF LANGUAGE ATTRITION Edited by Monika S. Schmid and Barbara Köpke
THE OXFORD HANDBOOK OF LANGUAGE CONTACT Edited by Anthony P. Grant
THE OXFORD HANDBOOK OF LANGUAGE EVOLUTION Edited by Maggie Tallerman and Kathleen Gibson
THE OXFORD HANDBOOK OF LANGUAGE POLICY AND PLANNING Edited by James W. Tollefson and Miguel Pérez-Milans
THE OXFORD HANDBOOK OF LEXICOGRAPHY Edited by Philip Durkin
THE OXFORD HANDBOOK OF LINGUISTIC ANALYSIS Second edition Edited by Bernd Heine and Heiko Narrog
THE OXFORD HANDBOOK OF LINGUISTIC FIELDWORK Edited by Nicholas Thieberger
THE OXFORD HANDBOOK OF LINGUISTIC INTERFACES Edited by Gillian Ramchand and Charles Reiss
THE OXFORD HANDBOOK OF LINGUISTIC MINIMALISM Edited by Cedric Boeckx
THE OXFORD HANDBOOK OF LINGUISTIC TYPOLOGY Edited by Jae Jung Song
THE OXFORD HANDBOOK OF LYING Edited by Jörg Meibauer
THE OXFORD HANDBOOK OF MODALITY AND MOOD Edited by Jan Nuyts and Johan van der Auwera
THE OXFORD HANDBOOK OF MORPHOLOGICAL THEORY Edited by Jenny Audring and Francesca Masini
THE OXFORD HANDBOOK OF NAMES AND NAMING Edited by Carole Hough
THE OXFORD HANDBOOK OF PERSIAN LINGUISTICS Edited by Anousha Sedighi and Pouneh Shabani-Jadidi
THE OXFORD HANDBOOK OF POLYSYNTHESIS Edited by Michael Fortescue, Marianne Mithun, and Nicholas Evans
THE OXFORD HANDBOOK OF PRAGMATICS Edited by Yan Huang
THE OXFORD HANDBOOK OF REFERENCE Edited by Jeanette Gundel and Barbara Abbott
THE OXFORD HANDBOOK OF SOCIOLINGUISTICS Second edition Edited by Robert Bayley, Richard Cameron, and Ceil Lucas
THE OXFORD HANDBOOK OF TABOO WORDS AND LANGUAGE Edited by Keith Allan
THE OXFORD HANDBOOK OF TENSE AND ASPECT Edited by Robert I. Binnick
THE OXFORD HANDBOOK OF THE WORD Edited by John R. Taylor
THE OXFORD HANDBOOK OF TRANSLATION STUDIES Edited by Kirsten Malmkjær and Kevin Windle
THE OXFORD HANDBOOK OF UNIVERSAL GRAMMAR Edited by Ian Roberts
THE OXFORD HANDBOOK OF WORLD ENGLISHES Edited by Markku Filppula, Juhani Klemola, and Devyani Sharma