OLL 336 : SYNTAX
INTRODUCTION:
In addition to referring to the discipline, the term syntax is also used to refer directly to the rules and principles that govern the sentence structure of any individual language. Modern research in syntax attempts to describe languages in terms of such rules. Many professionals in this discipline attempt to find general rules that apply to all natural languages.
The term syntax is also used to refer to the rules governing the behavior of mathematical systems, such as formal languages used in logic. (See logical syntax.)
Early history
Works on grammar were written long before modern syntax came about; the Aṣṭādhyāyī of Pāṇini (c. 4th century BC) is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory.[2] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.

For centuries, work in syntax was dominated by a framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld in a book of the same title. This system took as its basic premise the assumption that language is a direct reflection of thought processes and therefore there is a single, most natural way to express a thought. (That natural way, coincidentally, was exactly the way it was expressed in French.)
The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale.[3]) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "Subject – Copula – Predicate." Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.

However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).)[4]
Modern theories
There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton,[5] sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g. Gerald Gazdar) take a more Platonistic view, since they regard syntax as the study of an abstract formal system.[6] Yet others (e.g. Joseph Greenberg) consider syntax a taxonomical device to reach broad generalizations across languages.

Generative grammar
Main article: Generative grammar
The hypothesis of generative grammar is that language is a structure of the human mind. The goal of generative grammar is to make a complete model of this inner language (known as i-language). This model could be used to describe all human language and to predict the grammaticality of any given utterance (that is, to predict whether the utterance would sound correct to native speakers of the language). This approach to language was pioneered by Noam Chomsky.

Most generative theories (although not all of them) assume that syntax is based upon the constituent structure of sentences. Generative grammars are among the theories that focus primarily on the form of a sentence, rather than its communicative function.

Among the many generative theories of linguistics, the Chomskyan theories are:
- Transformational grammar (TG) (the original theory of generative syntax, laid out by Chomsky in Syntactic Structures in 1957)[9]
- Government and binding theory (GB) (a revised theory in the tradition of TG, developed mainly by Chomsky in the 1970s and 1980s)[7]
- Minimalist program (MP) (a reworking of the theory out of the GB framework, published by Chomsky in 1995)[8]
Other theories that find their origin in the generative paradigm are:
- Arc pair grammar
- Generalized phrase structure grammar (GPSG; now largely out of date)
- Generative semantics (now largely out of date)
- Head-driven phrase structure grammar (HPSG)
- Lexical functional grammar (LFG)
- Nanosyntax
- Relational grammar (RG) (now largely out of date)
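The generative idea that a grammar should predict the grammaticality of any utterance from the constituent structure of sentences can be illustrated with a toy phrase-structure grammar. The sketch below is only an illustration: the rules and lexicon are invented, not taken from any particular theory. It uses the CKY chart-recognition algorithm to decide whether a string can be derived as a sentence (S):

```python
from itertools import product

# A toy phrase-structure grammar in Chomsky normal form.
# The rules and lexicon are illustrative, not from the article.
RULES = {
    ("NP", "VP"): "S",    # S  -> NP VP
    ("Det", "N"): "NP",   # NP -> Det N
    ("V", "NP"): "VP",    # VP -> V NP
}
LEXICON = {
    "the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "chased": {"V"},
}

def grammatical(sentence):
    """CKY recognition: True if the grammar derives the sentence as an S."""
    words = sentence.split()
    n = len(words)
    # chart[i][j] holds the categories that span words[i:j+1]
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        chart[i][i] = set(LEXICON.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # try every split point
                for left, right in product(chart[i][k], chart[k + 1][j]):
                    if (left, right) in RULES:
                        chart[i][j].add(RULES[(left, right)])
    return "S" in chart[0][n - 1]

print(grammatical("the dog chased the cat"))  # True
print(grammatical("chased the the dog"))      # False
```

CKY requires the grammar to be in Chomsky normal form (every rule rewrites to exactly two categories or one word), which is why the toy rules above all have two-element right-hand sides.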
Categorial grammar
Main article: Categorial grammar
Categorial grammar is an approach that attributes the syntactic structure not to rules of grammar, but to the properties of the syntactic categories themselves. For example, rather than asserting that sentences are constructed by a rule that combines a noun phrase (NP) and a verb phrase (VP) (e.g. the phrase structure rule S → NP VP), in categorial grammar such principles are embedded in the category of the head word itself. So the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a function requiring an NP as an input and producing a sentence-level structure as an output. This complex category is notated as (NP\S) instead of V. NP\S is read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)." The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as ((NP\S)/NP), which means "a category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) which is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence."

Tree-adjoining grammar is a categorial grammar that adds in partial tree structures to the categories.
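The function-application reading of these categories can be sketched directly in code. The sketch below uses the standard notation in which an intransitive verb is NP\S and a transitive verb is (NP\S)/NP; the words in the lexicon are hypothetical:

```python
from dataclasses import dataclass
from typing import Union

Cat = Union[str, "Slash"]  # a category is an atom ("NP", "S") or complex

@dataclass(frozen=True)
class Slash:
    """A complex category: direction '/' seeks its argument to the right,
    '\\' seeks it to the left; applying it yields `result`."""
    direction: str   # "/" or "\\"
    arg: Cat
    result: Cat

    def __str__(self):
        if self.direction == "\\":
            return f"({self.arg}\\{self.result})"
        return f"({self.result}/{self.arg})"

# Illustrative lexicon: an intransitive verb is NP\S,
# a transitive verb is (NP\S)/NP.
IV = Slash("\\", "NP", "S")    # NP\S
TV = Slash("/", "NP", IV)      # (NP\S)/NP
LEXICON = {"Mary": "NP", "John": "NP", "sleeps": IV, "sees": TV}

def combine(left, right):
    """Forward application (A/B  B -> A) or backward (B  B\\A -> A)."""
    if isinstance(left, Slash) and left.direction == "/" and left.arg == right:
        return left.result
    if isinstance(right, Slash) and right.direction == "\\" and right.arg == left:
        return right.result
    return None

# Deriving "Mary sees John": the transitive verb consumes the object NP
# on its right, yielding NP\S, which consumes the subject NP on its left.
vp = combine(LEXICON["sees"], LEXICON["John"])   # NP\S
s = combine(LEXICON["Mary"], vp)
print(s)  # S
```

Note that no phrase structure rule like S → NP VP appears anywhere: the entire combinatory potential of the sentence is read off the verb's category, which is the point the text above makes.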
Dependency grammar
Main article: Dependency grammar
Dependency grammar is an approach to sentence structure where syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure, and all the other words in the clause are either directly or indirectly dependent on this root. Some prominent dependency-based theories of syntax are:
- Algebraic syntax
- Functional generative description
- Meaning–text theory
- Operator grammar
- Word grammar
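A dependency analysis can be represented directly as directed head → dependent links. The sketch below hand-codes one such analysis (the sentence and the links are invented for illustration) and checks the defining property stated above: every word other than the root depends, directly or indirectly, on the root verb.

```python
# A hand-built dependency analysis of "the dog chased the cat".
# The finite verb "chased" is the root of the clause; the links
# chosen here are illustrative, not from any particular theory.
WORDS = ["the", "dog", "chased", "the", "cat"]
ROOT = 2  # index of the finite verb
# (head index, dependent index) directed links
DEPS = [
    (2, 1),  # chased -> dog   (subject)
    (1, 0),  # dog    -> the   (determiner)
    (2, 4),  # chased -> cat   (object)
    (4, 3),  # cat    -> the   (determiner)
]

def dependents_of(head):
    return [d for h, d in DEPS if h == head]

def covers_all_words():
    """True if every word is reachable from the root verb by
    following the directed dependency links."""
    seen, stack = {ROOT}, [ROOT]
    while stack:
        for d in dependents_of(stack.pop()):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen == set(range(len(WORDS)))

print(covers_all_words())  # True
```

Unlike the CKY example in the generative section, there is no phrase node like NP or VP here: the structure is exhausted by word-to-word links, which is exactly the contrast between dependency and constituency the text draws.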
Stochastic/probabilistic grammars/network theories
Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.

Functionalist grammars
Main article: Functional theories of grammar
Functionalist theories, although focused upon form, are driven by explanation based upon the function of a sentence (i.e. its communicative function). Some typical functionalist theories include:
- Cognitive grammar
- Construction grammar (CxG)
- Emergent grammar
- Functional discourse grammar (Dik)
- Prague linguistic circle
- Role and reference grammar (RRG)
- Systemic functional grammar
See also
Syntactic terms
- Adjective
- Adjective phrase
- Adjunct
- Adpositional phrase
- Adverb
- Anaphora
- Answer ellipsis
- Antecedent
- Antecedent-contained deletion
- Appositive
- Argument
- Article
- Aspect
- Attributive adjective and predicative adjective
- Auxiliary verb
- Binding
- Branching
- c-command
- Case
- Category
- Catena
- Clause
- Closed class word
- Comparative
- Complement
- Compound noun and adjective
- Conjugation
- Conjunction
- Constituent
- Coordination
- Coreference
- Crossover
- Dangling modifier
- Declension
- Dependency grammar
- Dependent marking
- Determiner
- Discontinuity
- Do-support
- Dual (form for two)
- Ellipsis
- Endocentric
- Exceptional case-marking
- Expletive
- Extraposition
- Finite verb
- Function word
- Gapping
- Gender
- Gerund
- Government
- Head
- Head marking
- Infinitive
- Inverse copular construction
- Inversion
- Lexical item
- m-command
- Measure word (classifier)
- Merge
- Modal particle
- Modal verb
- Modifier
- Mood
- Movement
- Movement paradox
- Nanosyntax
- Negative inversion
- Non-configurational language
- Non-finite verb
- Noun
- Noun ellipsis
- Noun phrase
- Number
- Object
- Open class word
- Parasitic gap
- Part of speech
- Particle
- Periphrasis
- Person
- Personal pronoun
- Pied-piping
- Phrasal verb
- Phrase
- Phrase structure grammar
- Plural
- Predicate
- Predicative expression
- Preposition and postposition
- Pronoun
- Pseudogapping
- Raising
- Relation (Grammatical relation)
- Restrictiveness
- Right node raising
- Sandhi
- Scrambling
- Selection
- Sentence
- Separable verb
- Shifting
- Singular
- Sluicing
- Small clause
- Stripping
- Subcategorization
- Subject
- Subject-auxiliary inversion
- Subject-verb inversion
- Subordination
- Superlative
- Tense
- Topicalization
- Tough movement
- Uninflected word
- V2 word order
- Valency
- Verb
- Verb phrase
- Verb phrase ellipsis
- Voice
- Wh-movement
- Word order
- X-bar theory
External links
- The syntax of natural language: An online introduction using the Trees program—Beatrice Santorini & Anthony Kroch, University of Pennsylvania, 2007