Linguistics wars

The linguistics wars were extended disputes among American theoretical linguists that took place mostly during the 1960s and 1970s, stemming from a disagreement between Noam Chomsky and several of his associates and students. The debates began in 1967 when the linguists Paul Postal, John R. Ross, George Lakoff, and James D. McCawley, self-dubbed the "Four Horsemen of the Apocalypse", [1] (p70) proposed an alternative approach in which the relation between semantics and syntax is viewed differently: deep structures are treated as representations of meaning rather than as syntactic objects. While Chomsky and other generative grammarians argued that meaning is driven by an underlying syntax, generative semanticists posited that syntax is shaped by an underlying meaning. This intellectual divergence led to two competing frameworks, generative semantics and interpretive semantics.

Eventually, generative semantics spawned a different linguistic paradigm, known as cognitive linguistics, a theory that relates the learning of languages to other cognitive abilities such as memorization, perception, and categorization, while descendants of interpretive semantics continue in the guise of formal semantics.

Background

In 1957, Noam Chomsky (b. 1928) published Syntactic Structures, his first influential work. The ideas in Syntactic Structures were a significant departure from the dominant paradigm among linguists at the time, championed by Leonard Bloomfield (1887–1949). [1] The Bloomfieldian approach focused on smaller linguistic units such as morphemes and phones, and had little to say about how these units were organized into larger structures such as phrases and sentences. [1] (p20) By contrast, syntax was the central empirical concern of Syntactic Structures, which modeled grammar as a set of rules that procedurally generate all and only the sentences of a given language. This approach is referred to as transformational grammar. [1] (pp22–24) Moreover, Chomsky criticized the Bloomfieldians as "[t]axonomic linguists", mere collectors and cataloguers of language. [1] (p16) Early work in generative grammar attempted to go beyond mere description of the data and identify the fundamental underlying principles of language. [2] (pp12–13) In Chomsky's model, semantic components were associated with the underlying structure of a given linguistic sequence, whereas phonological components were associated with its surface-level structure. This left the problem of 'meaning' in linguistic analysis unanswered. [2]

Chomsky's Aspects of the Theory of Syntax (1965) developed his theory further by introducing the concepts of deep structure and surface structure, which were influenced by earlier scholarship. First, Chomsky drew on Ferdinand de Saussure (1857–1913), specifically his dichotomy of langue (the native knowledge of a language) versus parole (the actual use of language). Second, Louis Hjelmslev (1899–1965) later argued that parole is observable and can be defined as the arrangement of speech, whereas langue comprises the systems within actual speech that underpin its lexicon and grammar. Aspects of the Theory of Syntax also addressed the issue of meaning by endorsing the Katz–Postal hypothesis, which holds that transformations do not affect meaning and are therefore "semantically transparent". This attempted to introduce notions of semantics into descriptions of syntax. [1] [3] Chomsky's endorsement prompted further exploration of the relation between syntax and semantics, creating the environment for the emergence of generative semantics. [2]

Dispute

Figure: The divergence between the generative semantics and Aspects paradigms (adapted from Harris, 2022)

The point of disagreement between generative semantics, known at the time as Abstract Syntax, and interpretive semantics was the degree of abstractness of deep structure, that is, the distance between deep structures and the surface structure. [4] Generative semantics viewed deep structure and transformations as necessary for connecting the surface structure with meaning, whereas Chomsky's paradigm considered the deep structure, and the transformations that link it to the surface structure, essential for describing the structural composition of linguistic items (syntactic description) without explicitly addressing meaning. [2] Notably, generative semanticists eventually abandoned deep structure altogether in favor of the semantic representation. [1]

Generative semantics approach

Generative semantics was inspired by two notions that Chomsky highlighted in Aspects: deep structures determine the semantic representation, and selectional restrictions (rules that govern what may precede and follow a word in a sentence) are stated at deep structure. For example, the 'semantic' nature of the verb eat requires that it be followed by something edible. [5] Generative semanticists initially misinterpreted Chomsky's ideas about the relation between semantic representation and deep structure, and used the arguments from selectional restrictions to draw a direct, bilateral relation between meaning and surface structure, in which semantic representations are mapped onto surface structures, thereby conflating the two levels of semantic representation and deep structure. [1]

Generative semantic analysis evolved to favor an approach in which deep structures reflect meaning directly through semantic features and relations, that is, semantic representations. On this view, the formal characteristics of deep structures are considered insufficient, and meaning drives the surface structures. The formal features of deep structures include context-free phrase-structure grammar and the lexical insertion point, the point at which words enter the derivation. [6] The generative semantics view of transformations and deep structures contrasted sharply with Chomsky's. Generative semanticists believed that deep structures are meaning representations and that transformations apply to deep structures to create different surface structures while preserving meaning. Chomsky's model holds that deep structure pertains to the organization of linguistic items, while transformations apply to and manipulate deep structure and may sometimes alter meaning. [2]

Generative semantics' model:

Deep structure: [AGENT] the boy, [ACTION] hitting, [PATIENT] the ball

Transformation (active): The boy hit the ball.

Chomsky's model:

Deep structure: S ((NP the boy) (VP [hit]) (NP the ball))

Transformation (passive): The ball was hit by the boy.

Generative semanticists used arguments such as category-changing transformations, in which simple paraphrases alter syntactic categories yet leave meaning unchanged, thereby reinforcing the Katz–Postal hypothesis, which postulates that transformations are semantically transparent. Such category-changing transformations appear in inchoative and causative clauses, which share the same underlying structure as their corresponding stative clause, as in the sentences below.

Inchoative: The door opened.

Causative: He opened the door.

The underlying structure is similar to the stative clause: The door is open.

Generative semanticists used this argument, first suggested by George Lakoff in his dissertation, to cement the idea that the underlying meaning (The door is OPEN) drives the two different surface structures (inchoative and causative), not the other way around. [1]

Generative semantics and logical form

The level of semantic representation in generative semantic analyses resembled logical form; the derivation of a sentence was therefore a direct mapping of semantics, meaning, and logic onto the surface structure, so that all aspects of meaning are represented in the phrase marker. Generative semanticists claimed that the semantic deep structure is generated in a universal manner similar to that of predicate logic, thereby reducing the syntactic categories to just three: S (= proposition), NP (= argument), and V (= predicate). In this analysis, adjectives, negatives, and auxiliaries are reduced to a single category, Verb, and the other forms are derived transformationally. [5]
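As a rough illustration of this reduction (a sketch in the spirit of the analyses described above, using this article's own examples rather than notation drawn from the primary literature), a clause is a predicate applied to its arguments, and a negative is simply a higher predicate taking a whole proposition as its argument:

S ((V HIT) (NP the boy) (NP the ball)) ≈ HIT(the boy, the ball): "The boy hit the ball."

S ((V NOT) (S (V OPEN) (NP the door))) ≈ NOT(OPEN(the door)): "The door is not open."

Adjectives and auxiliaries receive the same treatment, as predicates over arguments or propositions, with the familiar surface categories produced transformationally.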

Lexical decomposition

Lexical decomposition was used to draw out, in the syntactic structure of sentences, the semantic components inherent in words. For the word kill, the analysis reveals atomic components such as CAUSE, BECOME, NOT, and ALIVE, and works out the semantic and syntactic relations between lexical items and their atomic parts. [1] In the generative semantics case for lexical decomposition, semantically related words share a lexical base but differ in their lexical extensions. For example, the lexical base of dead is NOT ALIVE; die comes from the same base NOT ALIVE, which the inchoative transformation turns into (BECOME NOT ALIVE), while kill comes from the same lexical base NOT ALIVE, which the causative transformation turns into (CAUSE TO BECOME NOT ALIVE). This simplified the projection rules necessary for transformations: rather than entering the word kill directly into the deep structure, and thereby creating a new 'syntactic' deep structure, kill is treated as sharing the same 'semantic' deep structure as dead, namely NOT ALIVE. [4] Using this case of lexical decomposition, McCawley proposed a new rule, predicate raising, under which lexical items can enter at any point in the derivation rather than only at deep structure. [1] This argument undermined deep structure as the lexical insertion point: as the generative semantics analysis shows, some transformations, such as predicate raising, must apply before the lexical items are inserted into the derivation, because predicate raising collects the predicate parts (abstract verbs) into the meaning complexes that correspond to words. [1] [5]
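The derivation of kill can be sketched schematically as follows (an illustrative reconstruction of the kind of analysis described above, not a quotation of McCawley's own derivations):

Semantic representation: (CAUSE x (BECOME (NOT (ALIVE y))))

Predicate raising (applied before lexical insertion): the abstract predicates are collected into a single complex, ((CAUSE BECOME NOT ALIVE) x y)

Lexical insertion: the complex CAUSE BECOME NOT ALIVE is replaced by kill, yielding "x killed y"; the bare base NOT ALIVE surfaces as dead, and (BECOME NOT ALIVE) as die.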

These arguments were used to conclude that it made no theoretical sense to posit syntactic deep structure as a separate level and that semantic representations, features, and relations should be mapped directly onto the surface structure. [4] Additionally, generative semanticists proclaimed that any level of structure intervening between the semantic representation and the surface structure requires empirical justification. [1]

Interpretivist critique of generative semantics

Chomsky and others advanced a number of arguments designed to demonstrate that generative semantics not only offered nothing new but was misconceived and misguided. [2] In response to the generative semanticists' challenges, Chomsky delivered a series of lectures and papers, later known as Remarks, which culminated in what became known as the "interpretivist program". This program aimed to establish syntax as an independent level of linguistic analysis (autonomous syntax) with its own rules, while the meaning of a syntactic structure follows from 'interpretive' rules applied to the syntactic structures. [7] This approach retains the formal characteristics of deep structure as context-free phrase-structure grammar. [6] Chomsky also criticized McCawley's predicate-raising rule as an upside-down interpretive rule. [1]

Lexicalism and deverbal nouns

The generative semanticists' analysis (lexical decomposition) holds that the words refuse and refusal belong to the same category REFUSE, but in Remarks Chomsky argued for limiting transformations and for separate lexical entries for semantically related words, since some nominalizations have distinct meanings. Chomsky argued that nouns such as marriage and revolution should not be treated as derived from their verb forms (marry, revolve), since revolution has a broader meaning than the verb, as does marriage. These nouns, known as deverbal nouns, should exist separately in the lexicon. This approach was later known as lexicalism. It also posited that nominalization should take place in the lexicon rather than in the deep structure, thereby limiting the power of transformations. [1]

For example:

a. John is eager to please.

b. John's eagerness to please.

c. John is easy to please.

d. *John's easiness to please.

Sentence (d) shows distributional differences that are not accounted for if deverbal nouns are derived transformationally. [2] Another point Chomsky made against generative semantics was the structural similarity that deverbal nouns have with noun phrases, which suggests that they have their own independent internal structure. [2] In the example below, proofs functions like portraits, a regular noun phrase.

a. Several of John's proofs of the theorem.

b. Several of John's portraits of the dean.

Remarks contributed to what Chomsky termed the Extended Standard Theory, which he regarded as an extension of Aspects. To many linguists, however, the relation between transformations and semantics in generative semantics was the natural progression of Aspects. [4] [1]

Lexical decomposition

The interpretive semanticist Jerry Fodor also criticized the generative semanticists' approach to lexical decomposition, in which the word kill is derived from CAUSE TO BECOME NOT ALIVE, using sentences such as:

a. Putin caused Litvinenko to die on Wednesday by having him poisoned three weeks earlier.

b. *Putin killed Litvinenko on Wednesday by having him poisoned three weeks earlier.

In these sentences, the kill of (b) would be derived from the caused ... to die of (a); however, (a) is acceptable and creates no discrepancy, whereas (b), which suggests a direct act of killing, contradicts the temporal qualifier "on Wednesday by having him poisoned three weeks earlier". This suggests that lexical decomposition could fail to account for the causal and temporal intricacies required for accurate semantic interpretation. [1]

Cases for formalism in underlying structures

Coreference

Under the generative semantics approach, coreference relations in a sentence such as "Harry thinks he should win the prize" are analyzed in the deep structure as "Harry thinks Harry should win the prize", with transformations then replacing the second Harry with he in the surface structure. This approach was criticized for creating an infinite loop of embedding, with he, in the deep structure of a sentence such as "The man who shows he deserves it will get the prize he desires." The interpretivists therefore treated he as a base component, with the correct antecedents found through interpretive rules, [7] further solidifying the existence of formal structures, independent of semantics, to which transformations apply. [7]

Transformations and meaning

The Katz–Postal hypothesis, which underlies the generative semantics paradigm, does not fully account for all transformations. [1] The interpretivists argued that passive transformations do alter meaning in sentences with quantifiers such as every. [1] [7] Consider the sentences:

Everyone in the room knows two languages.

Two languages are known by everyone in the room.

Chomsky analyzed these two sentences as semantically different despite their being a derivational pair; he observed that the first sentence may imply that everyone knows two different languages, while the second implies that everyone in the room knows the same two languages. [2] This argument was used to retain the formal characteristics of deep structures, since transformational movements are accounted for by formal rather than semantic relations. The existence of an independent level of syntactic structure to which transformations apply is evidence for formalism. [4]
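The contrast can be sketched as a difference in quantifier scope (an illustrative first-order rendering; the predicate names and set notation are not taken from the sources):

∀x [ person(x) → ∃Y ( languages(Y) ∧ |Y| = 2 ∧ knows(x, Y) ) ]: the first sentence's reading, in which each person knows some pair of languages, possibly a different pair for each person.

∃Y [ languages(Y) ∧ |Y| = 2 ∧ ∀x ( person(x) → knows(x, Y) ) ]: the second sentence's reading, in which there is one particular pair of languages that every person knows.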

Global rules of generative semantics

Generative semanticists accounted for this discrepancy resulting from passive transformations by claiming that the two sentences do not share the same underlying structure but have two different ones: the first sentence has an underlying structure beginning with "Everyone", while the other begins with "Two", the higher quantifier determining the scope of the meaning. Additionally, generative semanticists provided a "quantifier lowering" rule, by which quantifiers are lowered from the underlying structure into their positions in the surface structure. In the sentence whose underlying structure begins with "Two", everyone is lowered, highlighting that the same two languages are known by everyone; in the sentence whose underlying structure begins with "Everyone", the quantifier two is lowered, maintaining that everyone knows two (possibly different) languages. [2] Thus the generative semanticist Lakoff held that the two sentences are not semantically equivalent. [1] Lakoff also proposed a rule which he termed the global derivational constraint, under which a sentence such as "Two languages..." could not be derived from an underlying structure in which the quantifier "Everyone" encompasses "Two". [2]
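Schematically, the two underlying structures and the effect of quantifier lowering can be pictured as follows (an illustrative sketch of the analysis described above, not Lakoff's own notation):

(EVERYONE x (TWO y (x knows y))) → quantifier lowering → Everyone in the room knows two languages.

(TWO y (EVERYONE x (x knows y))) → quantifier lowering → Two languages are known by everyone in the room.

The global derivational constraint then rules out any derivation in which the passive surface form is produced from the first structure, where EVERYONE takes scope over TWO.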

Challenges in the paradigm

Generative semantics faced challenges in its empirical confirmation. Analyses in interpretive semantics involve phrase-structure rules and transformations that are innately codified, according to Aspects, [2] drawing on Chomsky's idea of an innate faculty in the human brain that processes language. [8] By contrast, generative analyses contained hypotheses concerning factors such as the intent of speakers and the denotation and entailment of sentences. The framework's lack of explicit rules, formulas, and underlying structures made its predictions difficult to compare with and evaluate against those of interpretive semantics. Additionally, the generative framework was criticized for introducing irregularities without justification: the attempt to bridge syntax and semantics blurred the lines between the two domains, and some argued that the approach created more problems than it solved. These limitations led to the decline of generative semantics. [1]

Aftermath

After the protracted debates and with the decline of generative semantics, its key figures pursued various paths. George Lakoff moved on to cognitive linguistics, which explores the cognitive domain and the relation between language and mental processes. Meanwhile, in the 1990s Chomsky turned his attention to a more universal program of generative grammar, the minimalist program, which does not claim to offer a comprehensive theory of language acquisition and use. [9] Postal rejected generative semantics and, discarding aspects of cognition altogether and emphasizing grammaticality, adopted a mathematical and logical approach to the study of 'natural' languages. John R. Ross turned to more literary-oriented endeavors such as poetry, though he retained his transformationalist outlook, and his name appears in many Chomskyan works. McCawley continued working in the generative semantics tradition until his death in 1999; he was known for his flexible approach to linguistic theory, employing elements of both the Extended Standard Theory and generative semantics. [1]

Books

A first systematic description of the linguistics wars is the chapter with this title in Frederick Newmeyer's book Linguistic Theory in America, which appeared in 1980. [10]

The Linguistics Wars is the title of a 1993 book by Randy A. Harris that closely chronicles the dispute between Chomsky and other significant figures (George Lakoff and Paul Postal, among others) and also highlights how certain theories evolved and which of their important features have influenced modern-day linguistic theories. [11] A second edition was published in 2022, in which Harris traces several important 21st-century linguistic developments, such as construction grammar, cognitive linguistics, and frame semantics, all emerging out of generative semantics. [1] The second edition also argues that Chomsky's minimalist program has significant homologies with early generative semantics.

Ideology and Linguistic Theory, by John A. Goldsmith and Geoffrey J. Huck, [2] also explores that history, combining detailed theoretical discussion with an account of the period as its participants observed it, including memoirs and interviews with Ray Jackendoff, Lakoff, Postal, and Ross. The chapter "What Happened to Generative Semantics" examines the aftermath of the dispute and the schools of thought or practice that can be seen as successors to generative semantics.


References

  1. Harris, Randy A. (2022) [1993]. The Linguistics Wars (2nd ed.). New York: Oxford University Press. ISBN 978-0-199-74033-8.
  2. Goldsmith, John A.; Huck, Geoffrey J. (1996). Ideology and Linguistic Theory: Noam Chomsky and the Deep Structure Debates. Routledge. ISBN 978-1-315-00408-2.
  3. Koerner, E. F. K. (2002). Toward a History of American Linguistics. Routledge. ISBN 978-1-134-49507-8.
  4. Newmeyer, F. J. (1996). Generative Linguistics: A Historical Perspective. Routledge.
  5. Bagha, K. (2011). "Generative Semantics". English Language Teaching. 4 (3): 223–231. https://doi.org/10.5539/elt.v4n3p223
  6. Bedell, George (1974). "The Arguments about Deep Structure". Language. 50 (3): 423–445. doi:10.2307/412216. ISSN 0097-8507. JSTOR 412216.
  7. Verschueren, J.; Östman, J.-O. (2022). "Interpretive semantics". In Handbook of Pragmatics (Vol. M2). John Benjamins Publishing Company.
  8. Cowie, F. (2017). "Innateness and Language". The Stanford Encyclopedia of Philosophy.
  9. Pullum, Geoffrey (2022). "Chomsky's Forever War". National Review.
  10. Newmeyer, Frederick J. (1980). Linguistic Theory in America: The First Quarter-Century of Transformational Generative Grammar. Academic Press. ISBN 978-0-125-17150-2.
  11. Harris, Randy A. (1993). The Linguistics Wars. Oxford University Press. ISBN 978-0-199-83906-3.

Further reading