
And who first proposed this linguistics theory?

2006-07-11 18:02:48 · 12 answers · asked by Anonymous in Society & Culture Languages

12 answers

In linguistics, generative grammar generally refers to a proof-theoretic framework for the study of syntax partially inspired by formal grammar theory and pioneered by Noam Chomsky. A generative grammar is a set of rules that recursively "specify" or "generate" the well-formed expressions of a natural language. This encompasses a large set of different approaches to grammar. The term generative grammar is also broadly used to refer to the school of linguistics where this type of formal grammar plays a major part, including:

The Standard Theory (ST) (also widely known as Transformational grammar (TG))
The Extended Standard Theory (EST) (also widely known as Transformational grammar (TG))
Principles and Parameters Theory (P&P) which includes both Government and Binding Theory (GB) and the Minimalist Program (MP)
Relational Grammar (RG)
Lexical-functional Grammar (LFG)
Generalized Phrase Structure Grammar (GPSG)
Head-Driven Phrase Structure Grammar (HPSG)
Generative grammar should be distinguished from traditional grammar, which is often strongly prescriptive rather than purely descriptive, is not mathematically explicit, and has historically investigated a relatively narrow set of syntactic phenomena. In the "school of linguistics" sense it should be distinguished from other linguistically descriptive approaches to grammar, such as various functional theories.

The term generative grammar can also refer to a particular set of formal rules for a particular language; for example, one may speak of a generative grammar of English. A generative grammar in this sense is a formal device that can enumerate ("generate") all and only the grammatical sentences of a language. In an even narrower sense, a generative grammar is a formal device (or, equivalently, an algorithm) that can be used to decide whether any given sentence is grammatical or not.

In most cases, a generative grammar is capable of generating an infinite number of strings from a finite set of rules. These properties are desirable for a model of natural language, since human brains are of finite capacity, yet humans can generate and understand a very large number of distinct sentences. Some linguists go so far as to claim that the set of grammatical sentences of any natural language is indeed infinite.
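The "infinite strings from a finite set of rules" point can be sketched in a few lines of Python. The grammar and vocabulary below are my own illustration (not from the article): the recursive PP rule inside NP is what lets finitely many rules license unboundedly many distinct sentences.

```python
import random

# A tiny illustrative grammar. NP can contain a PP, and PP contains an NP,
# so the rule set is finite but the set of generable sentences is not.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"], ["D", "N", "PP"]],
    "PP": [["P", "NP"]],
    "VP": [["V", "NP"]],
    "D":  [["the"]],
    "N":  [["dog"], ["bone"], ["yard"]],
    "V":  [["ate"]],
    "P":  [["in"]],
}

def generate(symbol="S"):
    """Expand a symbol by picking one of its rules at random."""
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    rule = random.choice(GRAMMAR[symbol])
    return [word for part in rule for word in generate(part)]

print(" ".join(generate()))  # e.g. "the dog ate the bone in the yard"
```

Because the PP branch is chosen only half the time, the recursion terminates with probability 1, but no fixed sentence length bounds the output.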

Generative grammars can be described and compared with the aid of the Chomsky hierarchy proposed by Noam Chomsky in the 1950s. This sets out a series of types of formal grammars with increasing expressive power. Among the simplest types are the regular grammars (type 3); Chomsky claims that regular languages are not adequate as models for human language, because all human languages allow the embedding of strings within strings in a hierarchical way.
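The inadequacy of regular grammars is usually illustrated with the schematic language aⁿbⁿ (n a's followed by exactly n b's), a stand-in for hierarchical embedding. This sketch is my own illustration: a plain regular expression has no memory of n, so the closest regex either over- or under-accepts, while a counting (context-free-level) check is trivial.

```python
import re

def is_anbn(s):
    """Accept exactly the strings a^n b^n (equal counts, a's first)."""
    n = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * n + "b" * n

print(is_anbn("aaabbb"))  # True
print(is_anbn("aabbb"))   # False
# The obvious regular pattern cannot enforce equal counts:
print(bool(re.fullmatch(r"a+b+", "aabbb")))  # True: the regex over-accepts
```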

At a higher level of complexity are the context-free grammars (type 2). The derivation of a sentence by a context-free grammar can be depicted as a derivation tree. Linguists working in generative grammar often view such derivation trees as a primary object of study. According to this view, a sentence is not merely a string of words, but rather a tree with subordinate and superordinate branches connected at nodes.

Essentially, the tree model works something like this example, in which S is a sentence, D is a determiner, N a noun, V a verb, NP a noun phrase and VP a verb phrase:

[tree diagram not reproduced: S branches into NP and VP; the subject NP into D and N; VP into V and an object NP, which again branches into D and N]
The resulting sentence could be "The dog ate the bone". Such a tree diagram is also called a phrase marker. Phrase markers can be represented more compactly in text form, though the result is less easy to read.
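As a sketch of the tree model (representing nodes with plain Python tuples, a choice of mine rather than anything standard), here is the phrase marker for "The dog ate the bone" together with a helper that prints it in the usual labelled-bracket text form:

```python
# The phrase marker as nested (label, child, ...) tuples.
tree = ("S",
        ("NP", ("D", "The"), ("N", "dog")),
        ("VP", ("V", "ate"),
               ("NP", ("D", "the"), ("N", "bone"))))

def brackets(node):
    """Render a tuple tree in labelled-bracket notation."""
    label, *children = node
    if len(children) == 1 and isinstance(children[0], str):
        return f"[{label} {children[0]}]"   # preterminal directly over a word
    return f"[{label} " + " ".join(brackets(c) for c in children) + "]"

print(brackets(tree))
# [S [NP [D The] [N dog]] [VP [V ate] [NP [D the] [N bone]]]]
```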

However, Chomsky later argued that phrase structure grammars are also inadequate for describing natural languages. To address this, he formulated the more complex system of transformational grammar.

When generative grammar was first proposed, it was widely hailed as a way of formalizing the implicit set of rules a person "knows" when they know their native language and produce grammatical utterances in it. However, Chomsky has repeatedly rejected that interpretation; according to him, the grammar of a language is a statement of what a person has to know in order to recognise an utterance as grammatical, not a hypothesis about the processes involved in either understanding or producing language. In any case, the reality is that most native speakers would reject many sentences produced even by a phrase structure grammar. For example, although very deep embeddings are allowed by the grammar, sentences with deep embeddings are not accepted by listeners; the limit of acceptability is an empirical matter that varies between individuals, not something easily captured in a formal grammar. Consequently, the influence of generative grammar in empirical psycholinguistics has declined considerably.

2006-07-11 18:06:16 · answer #1 · answered by tilaboo 3 · 1 1

Generative Grammar Definition

2016-11-03 03:14:44 · answer #2 · answered by Anonymous · 0 0

In linguistics, generative grammar generally refers to a proof-theoretic framework for the study of syntax partially inspired by formal grammar theory and pioneered by Noam Chomsky. A generative grammar is a set of rules that recursively "specify" or "generate" the well-formed expressions of a natural language. This encompasses a large set of different approaches to grammar.


LMAO....geez, and I thought I'd save the people the need to read the entire article by just clipping the first main part....

2006-07-11 18:06:34 · answer #3 · answered by Tygirljojo 4 · 0 0

Well, for the two people who just cut and pasted from Wikipedia, pffft.

Generative Grammar was developed by Noam Chomsky. The essence is that people do not build surface sentences as-is. If two sentences are nearly identical in meaning, they both derive from a single "deep structure". This deep structure is generated by phrase structure rules that take the grammatical elements and lexical items and string them together in particular ways. A second set of rules, the transformational rules, then manipulates the deep structure in particular ways to derive one or more different surface sentences from the one deep structure. For example, a simplified deep structure might be "John-SBJ PAST-kiss Marta-OBJ". The transformational rules can then yield a variety of surface forms depending on context and other factors, all of which basically mean the same thing: "John kissed Marta," "Marta was kissed," "Marta was kissed by John," "Marta is who John kissed," "It is Marta that John kissed," etc. Together, the two components of the grammar are called Transformational Grammar, Transformational-Generative Grammar, Generative-Transformational Grammar, or Generative Grammar.
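The deep-structure example above can be sketched in Python. The dict encoding and the toy "transformations" below are my own drastic simplification for illustration, not Chomsky's actual formalism: one deep structure, several rules, several surface strings.

```python
# A deep structure as a flat record (my own simplification).
deep = {"subject": "John", "tense": "PAST", "verb": "kiss", "object": "Marta"}

def past(verb):
    return verb + "ed"  # naive regular past tense, enough for "kiss"

def active(d):
    return f"{d['subject']} {past(d['verb'])} {d['object']}."

def passive(d, by_phrase=True):
    s = f"{d['object']} was {past(d['verb'])}"
    return s + (f" by {d['subject']}." if by_phrase else ".")

def cleft(d):
    return f"It is {d['object']} that {d['subject']} {past(d['verb'])}."

# One deep structure, several surface sentences:
for rule in (active, passive, cleft):
    print(rule(deep))
```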

2006-07-12 06:22:23 · answer #4 · answered by Taivo 7 · 1 0

People are really good at cutting and pasting, aren't they?!

Generative grammar refers broadly to what is also called "Chomskyan grammar" (because it was first proposed by Noam Chomsky): a theory of how sentences are put together (generated) by a series of rules of one or more types.

And you won't find that anywhere on the Internet...I just composed the sentence all on my own...

2006-07-12 11:09:01 · answer #5 · answered by Anonymous · 1 0

Generative grammar, pioneered by Noam Chomsky, refers to a proof-theoretic framework for the study of syntax partially inspired by formal grammar theory. It is a set of rules that recursively "specify" or "generate" the well-formed expressions of a natural language.

2006-07-11 18:10:27 · answer #6 · answered by Katasstrophe 1 · 0 0

In linguistics, generative grammar generally refers to a proof-theoretic framework for the study of syntax partially inspired by formal grammar theory and pioneered by Noam Chomsky. A generative grammar is a set of rules that recursively "specify" or "generate" the well-formed expressions of a natural language. This encompasses a large set of different approaches to grammar. The term generative grammar is also broadly used to refer to the school of linguistics where this type of formal grammar plays a major part, including:

The Standard Theory (ST) (also widely known as Transformational grammar (TG))
The Extended Standard Theory (EST) (also widely known as Transformational grammar (TG))
Principles and Parameters Theory (P&P) which includes both Government and Binding Theory (GB) and the Minimalist Program (MP)
Relational Grammar (RG)
Lexical-functional Grammar (LFG)
Generalized Phrase Structure Grammar (GPSG)
Head-Driven Phrase Structure Grammar (HPSG)
Generative grammar should be distinguished from traditional grammar, which is often strongly prescriptive rather than purely descriptive, is not mathematically explicit, and has historically investigated a relatively narrow set of syntactic phenomena. In the "school of linguistics" sense it should be distinguished from other linguistically descriptive approaches to grammar, such as various functional theories.

The term generative grammar can also refer to a particular set of formal rules for a particular language; for example, one may speak of a generative grammar of English. A generative grammar in this sense is a formal device that can enumerate ("generate") all and only the grammatical sentences of a language. In an even narrower sense, a generative grammar is a formal device (or, equivalently, an algorithm) that can be used to decide whether any given sentence is grammatical or not.

In most cases, a generative grammar is capable of generating an infinite number of strings from a finite set of rules. These properties are desirable for a model of natural language, since human brains are of finite capacity, yet humans can generate and understand a very large number of distinct sentences. Some linguists go so far as to claim that the set of grammatical sentences of any natural language is indeed infinite.

Generative grammars can be described and compared with the aid of the Chomsky hierarchy proposed by Noam Chomsky in the 1950s. This sets out a series of types of formal grammars with increasing expressive power. Among the simplest types are the regular grammars (type 3); Chomsky claims that regular languages are not adequate as models for human language, because all human languages allow the embedding of strings within strings in a hierarchical way.

At a higher level of complexity are the context-free grammars (type 2). The derivation of a sentence by a context-free grammar can be depicted as a derivation tree. Linguists working in generative grammar often view such derivation trees as a primary object of study. According to this view, a sentence is not merely a string of words, but rather a tree with subordinate and superordinate branches connected at nodes.

Essentially, the tree model works something like this example, in which S is a sentence, D is a determiner, N a noun, V a verb, NP a noun phrase and VP a verb phrase:

[tree diagram not reproduced: S branches into NP and VP; the subject NP into D and N; VP into V and an object NP, which again branches into D and N]
The resulting sentence could be "The dog ate the bone". Such a tree diagram is also called a phrase marker. Phrase markers can be represented more compactly in text form, though the result is less easy to read; in this format the above sentence would be rendered as:

[S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]
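A labelled-bracket string like the one above can also be read back into a tree mechanically. The small reader below is my own sketch (not part of the pasted article), turning the bracket notation into nested Python lists:

```python
import re

def read_brackets(s):
    """Parse '[LABEL child ...]' bracket notation into nested lists."""
    tokens = re.findall(r"\[|\]|[^\[\]\s]+", s)
    pos = 0

    def node():
        nonlocal pos
        pos += 1                       # consume "["
        out = [tokens[pos]]; pos += 1  # the node label
        while tokens[pos] != "]":
            if tokens[pos] == "[":
                out.append(node())     # recurse into a subtree
            else:
                out.append(tokens[pos]); pos += 1  # a terminal word
        pos += 1                       # consume "]"
        return out

    return node()

print(read_brackets("[S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]"))
```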
However, Chomsky later argued that phrase structure grammars are also inadequate for describing natural languages. To address this, he formulated the more complex system of transformational grammar.

When generative grammar was first proposed, it was widely hailed as a way of formalizing the implicit set of rules a person "knows" when they know their native language and produce grammatical utterances in it. However, Chomsky has repeatedly rejected that interpretation; according to him, the grammar of a language is a statement of what a person has to know in order to recognise an utterance as grammatical, not a hypothesis about the processes involved in either understanding or producing language. In any case, the reality is that most native speakers would reject many sentences produced even by a phrase structure grammar. For example, although very deep embeddings are allowed by the grammar, sentences with deep embeddings are not accepted by listeners; the limit of acceptability is an empirical matter that varies between individuals, not something easily captured in a formal grammar. Consequently, the influence of generative grammar in empirical psycholinguistics has declined considerably.

Generative grammar has also been used in music theory and analysis, for example by Fred Lerdahl and in Schenkerian analysis (see the rewrite rules used for chord progressions).

Automata theory: formal languages and formal grammars

Chomsky hierarchy | Grammars          | Languages              | Minimal automaton
------------------|-------------------|------------------------|------------------
Type-0            | Unrestricted      | Recursively enumerable | Turing machine
n/a               | (no common name)  | Recursive              | Decider
Type-1            | Context-sensitive | Context-sensitive      | Linear-bounded
Type-2            | Context-free      | Context-free           | Pushdown
Type-3            | Regular           | Regular                | Finite

Each category of languages or grammars is a proper subset of the category directly above it.
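To make the Type-3 row concrete (the particular language is my own example, not from the article): a regular language is exactly one a finite automaton can recognise with a fixed, finite set of states. Here is a two-state automaton for "strings over {a, b} with an even number of a's":

```python
# Transition table of a deterministic finite automaton: two states track
# the parity of the a's seen so far; "even" is both start and accept state.
DELTA = {("even", "a"): "odd",  ("even", "b"): "even",
         ("odd",  "a"): "even", ("odd",  "b"): "odd"}

def accepts(s, state="even"):
    for ch in s:
        state = DELTA[(state, ch)]
    return state == "even"

print(accepts("abba"))  # True: two a's
print(accepts("ab"))    # False: one a
```

The automaton never needs unbounded memory, which is precisely why (per the discussion above) this machine class cannot handle arbitrarily deep hierarchical embedding.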




See also
Phrase structure rules
Cognitive linguistics
Parsing

2006-07-11 18:06:41 · answer #7 · answered by Jennifer B 5 · 0 0

Technically data is the plural of datum, so data are is correct. Common usage treats data as singular, so data is is more commonly used and treated as acceptable.

2016-03-27 02:03:01 · answer #8 · answered by Anonymous · 0 0

Does the first answer also have to be right?

2006-07-11 18:05:44 · answer #9 · answered by bentheredonethat122 2 · 0 0

The link below will answer your question. Good luck.

2006-07-11 18:06:00 · answer #10 · answered by Anonymous · 0 0
