5.1 Introduction

An important cluster of closely related early Chomsky papers2 had two major consequences. First, they defined a new branch of mathematics, formal language theory, which has flourished in its own right. But second, and more importantly for our purposes, this new branch of mathematics provided the formal grounding for a new conception of linguistics in which grammars, rather than sentences or collections of sentences, were the scientifically central objects: instead of being derived from collections of sentences as compact summaries of observed regularities, grammars are seen as (ultimately mental) systems that determine the status of sentences. The “observed regularities” come to be seen as consequences of the structure of the underlying system, the grammar. The classification of grammars that became known as the Chomsky hierarchy was an exploration of what kinds of regularities could arise from grammars that had various conditions imposed on their structure.

Rather than laying out the mathematical theory in complete detail – numerous sources already provide this3 – my aim in this chapter is to bring out some key intuitions that emerge from the theory and to highlight their applicability to theoretical linguistics. Looking at a completely formal treatment makes it easy to overestimate the degree to which the important concepts are bound to certain idealizations, such as the restriction to strings as the objects being generated and a binary grammaticality/ungrammaticality distinction.4 While those idealizations are there in the theory, I hope to make the case that certain intuitions that emerge from the theory are meaningful and useful in ways that transcend those idealizations.5 To the extent that I succeed in making this case, the reader will be able to turn to the formal literature with some motivating ideas in mind about the important concepts to watch out for.

One idea that plays a major role is the intersubstitutability of subexpressions. This is familiar from the distributional approach to discovering syntactic categories that is sometimes presented in introductory textbooks.6 We reach the conclusion that cat and dog belong to the same category, for example, by noting that substituting one for the other does not change a sentence's grammaticality. While we might introduce the term “noun” or the book‐keeping symbol N as a label for the class that cat and dog both belong to, there is nothing to being a noun other than being intersubstitutable with other nouns; the two‐place predicate “belongs to the same category as” is more fundamental than the one‐place predicate “is a noun.” (This diverges from the view where a noun, for example, would be defined as a word that describes a person, place or thing.)
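
To make the substitution test concrete, here is a minimal sketch in Python, assuming a toy "grammaticality oracle" that simply checks membership in a small invented sentence set; the sentences and the helper names (substitute, intersubstitutable) are illustrative assumptions, not anything defined in the chapter.

```python
# Toy stand-in for grammaticality judgements: membership in a finite set.
GRAMMATICAL = {
    "the cat sleeps",
    "the dog sleeps",
    "the cat chased the dog",
    "the dog chased the cat",
    "the cat chased the cat",
    "the dog chased the dog",
}

def substitute(sentence: str, old: str, new: str) -> str:
    """Replace every occurrence of the word `old` with the word `new`."""
    return " ".join(new if word == old else word for word in sentence.split())

def intersubstitutable(a: str, b: str, sentences) -> bool:
    """True if swapping `a` and `b` never turns a grammatical sentence ungrammatical."""
    for s in sentences:
        if substitute(s, a, b) not in GRAMMATICAL:
            return False
        if substitute(s, b, a) not in GRAMMATICAL:
            return False
    return True

print(intersubstitutable("cat", "dog", GRAMMATICAL))     # True: same category
print(intersubstitutable("cat", "sleeps", GRAMMATICAL))  # False: "the sleeps sleeps"
```

On this picture, a label like "noun" is just a convenient name for the equivalence class that such a test carves out.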

Intersubstitutability is closely related to the way different levels on the Chomsky hierarchy correspond to different kinds of memory. A grammar that will give rise to the intersubstitutability of cat and dog is one that ignores, or forgets, all the ways that they differ, collapsing all distinctions between them. Similarly for larger expressions: the distinctions between wash the clothes and go to a bar, such as the fact that they differ in number of words and the fact that only one of the two contains the word the, can be ignored. The flip side of the irrelevant information that a grammar ignores is the relevant information that it tracks – this remembered, relevant information is essentially the idea of a category. Different kinds of grammars correspond to different kinds of memory in the sense that they differ in how these categories, this remembered information, are used to guide or constrain subsequent generative steps.
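
The sense in which a category is "remembered information" can be sketched with a toy context-free grammar, again an illustrative assumption rather than a grammar from the chapter: once wash the clothes and go to a bar are both filed under the symbol VP, every later generative step sees only VP and ignores every other difference between the two phrases.

```python
import itertools

# Illustrative toy rules; nonterminals are dictionary keys, words are not.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "N":  [["children"], ["students"]],
    "VP": [["wash", "the", "clothes"], ["go", "to", "a", "bar"]],
}

def expand(symbol):
    """Yield every word sequence derivable from `symbol`."""
    if symbol not in RULES:            # terminal word
        yield [symbol]
        return
    for rhs in RULES[symbol]:
        # Combine one expansion of each right-hand-side symbol.
        for parts in itertools.product(*(list(expand(s)) for s in rhs)):
            yield [word for part in parts for word in part]

for words in expand("S"):
    print(" ".join(words))
# Both VP expansions appear after every NP: the grammar tracks only the
# label VP and collapses all remaining distinctions between the two phrases.
```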

Much of the discussion below aims to show that this idea of intersubstitutability gets at the core of how any sort of grammar differs from a mere collection of sentences, and how any sort of grammar might finitely characterize an infinite collection of expressions. A mechanism that never collapsed distinctions between expressions would be forced to specify all combinatorial possibilities explicitly, leaving no room for any sort of productivity; a mechanism that collapsed all distinctions would treat all expressions as intersubstitutable and impose no restrictions on how expressions combine to form others. An “interesting” grammar is one that sits somewhere in between these two extremes, collapsing some but not all distinctions, thereby giving rise to constrained productivity – productivity stems from the distinctions that are ignored, while constraints stem from those that are tracked. The task of designing a grammar to generate some desired pattern amounts to choosing which distinctions to ignore and which distinctions to track.
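
The two degenerate extremes can be sketched in the same spirit; the word list and the attested-sentence set below are illustrative assumptions. A grammar that collapses no distinctions can only re-list the sentences it was given, while one that collapses all distinctions accepts any string of known words.

```python
# Toy lexicon and attested sentences (illustrative assumptions).
WORDS = {"the", "cat", "dog", "sleeps", "chased"}
ATTESTED = {"the cat sleeps", "the dog chased the cat"}

def accepts_if_nothing_collapsed(sentence: str) -> bool:
    """Every sentence is its own category: nothing beyond the given list is generated."""
    return sentence in ATTESTED

def accepts_if_everything_collapsed(sentence: str) -> bool:
    """All words share one category: any combination of known words is accepted."""
    return all(word in WORDS for word in sentence.split())

print(accepts_if_nothing_collapsed("the dog sleeps"))          # False: no productivity
print(accepts_if_everything_collapsed("cat the the sleeps"))   # True: no constraints
```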
