What started as purely linguistic research ... has led, through involvement in political causes and an identification with an older philosophic tradition, to no less than an attempt to formulate an overall theory of man. The roots of this are manifest in the linguistic theory ... The discovery of cognitive structures common to the human race but only to humans (species specific), leads quite easily to thinking of unalienable human attributes.

Edward Marcotte on the significance of Chomsky's linguistic theory[1]

The basis of Noam Chomsky's linguistic theory lies in biolinguistics, the linguistic school that holds that the principles underpinning the structure of language are biologically preset in the human mind and hence genetically inherited.[2] He argues that all humans share the same underlying linguistic structure, irrespective of sociocultural differences.[3] In adopting this position Chomsky rejects the radical behaviorist psychology of B. F. Skinner, who viewed speech, thought, and all behavior as a completely learned product of the interactions between organisms and their environments. Accordingly, Chomsky argues that language is a unique evolutionary development of the human species and is distinct from the modes of communication used by any other animal species.[4][5] Chomsky's nativist, internalist view of language is consistent with the philosophical school of "rationalism" and contrasts with the anti-nativist, externalist view of language consistent with the philosophical school of "empiricism",[6] which contends that all knowledge, including language, comes from external stimuli.[1]

Universal grammar

Since the 1960s, Chomsky has maintained that syntactic knowledge is at least partially inborn, implying that children need only learn certain language-specific features of their native languages. He bases his argument on observations about human language acquisition and describes a "poverty of the stimulus": an enormous gap between the linguistic stimuli to which children are exposed and the rich linguistic competence they attain. For example, although children are exposed to only a very small and finite subset of the allowable syntactic variants within their first language, they somehow acquire the highly organized and systematic ability to understand and produce an infinite number of sentences, including ones that have never before been uttered, in that language.[7] To explain this, Chomsky reasoned that the primary linguistic data must be supplemented by an innate linguistic capacity. Furthermore, while a human baby and a kitten are both capable of inductive reasoning, if they are exposed to exactly the same linguistic data, the human will always acquire the ability to understand and produce language, while the kitten will never acquire either ability. Chomsky referred to this difference in capacity as the language acquisition device, and suggested that linguists needed to determine both what that device is and what constraints it imposes on the range of possible human languages. The universal features that result from these constraints would constitute "universal grammar".[8][9][10] Multiple scholars have challenged universal grammar on the grounds of the evolutionary infeasibility of its genetic basis for language,[11] the lack of universal characteristics between languages,[12] and the unproven link between innate/universal structures and the structures of specific languages.[13] Scholar Michael Tomasello has challenged Chomsky's theory of innate syntactic knowledge as based on theory and not behavioral observation.[14]

Although it was influential from the 1960s through the 1990s, Chomsky's nativist theory was ultimately rejected by the mainstream child language acquisition research community owing to its inconsistency with research evidence.[15][16] Linguists including Geoffrey Sampson, Geoffrey K. Pullum, and Barbara Scholz have also argued that Chomsky's linguistic evidence for it was false.[17]

Transformational-generative grammar

Transformational-generative grammar is a broad theory used to model, encode, and deduce a native speaker's linguistic capabilities.[18] These models, or "formal grammars", show the abstract structures of a specific language as they may relate to structures in other languages.[19] Chomsky developed transformational grammar in the mid-1950s, whereupon it became the dominant syntactic theory in linguistics for two decades.[18] "Transformations" refers to syntactic relationships within language, e.g., being able to infer that the subject of two sentences is the same person.[20] Chomsky's theory posits that language consists of both deep structures and surface structures: outward-facing surface structures are mapped onto sound by phonetic rules, while inward-facing deep structures relate words to conceptual meaning. Transformational-generative grammar uses mathematical notation to express the rules that govern the connection between meaning and sound (deep and surface structures, respectively). By this theory, linguistic principles can mathematically generate potential sentence structures in a language.[1]
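The idea that a finite set of rules can "mathematically generate" the sentences of a language can be illustrated with a toy phrase-structure grammar. The rule set below is a hypothetical fragment invented for illustration, not one of Chomsky's own grammars: each rule rewrites a nonterminal category (S, NP, VP, ...) into a sequence of categories or words, and recursively expanding the start symbol S enumerates every sentence the grammar licenses.

```python
import itertools

# Hypothetical toy grammar: each nonterminal maps to its possible expansions.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["cat"], ["dog"]],
    "V":   [["sees"], ["chases"]],
}

def generate(symbol):
    """Yield every terminal word sequence derivable from `symbol`."""
    if symbol not in RULES:          # a terminal word: yield it as-is
        yield [symbol]
        return
    for expansion in RULES[symbol]:
        # Combine the derivations of each symbol in the expansion.
        for parts in itertools.product(*(generate(s) for s in expansion)):
            yield [word for part in parts for word in part]

sentences = [" ".join(words) for words in generate("S")]
# Eight sentences: 2 subjects x 2 verbs x 2 objects.
```

Because the rules are recursive in general (an NP may contain another NP), even a small grammar of this kind can generate an unbounded set of sentences, which is the formal sense in which finite rules yield infinite linguistic capacity.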

Set inclusions described by the Chomsky hierarchy

Chomsky is commonly credited with inventing transformational-generative grammar, but his original contribution was considered modest when he first published his theory. In his 1955 dissertation and his 1957 textbook Syntactic Structures, he presented recent developments in the analysis formulated by Zellig Harris, who was Chomsky's PhD supervisor, and by Charles F. Hockett.[lower-alpha 1] Their method derives from the work of the Danish structural linguist Louis Hjelmslev, who introduced algorithmic grammar to general linguistics.[lower-alpha 2] Based on this rule-based notation of grammars, Chomsky grouped logically possible phrase-structure grammar types into a series of four nested subsets of increasing complexity, together known as the Chomsky hierarchy. This classification remains relevant to formal language theory[21] and theoretical computer science, especially programming language theory,[22] compiler construction, and automata theory.[23]
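The nesting of the hierarchy can be made concrete with a standard textbook example (a sketch, not drawn from Chomsky's texts): the language a*b* is regular and is recognized by a finite automaton (or a regular expression), whereas the language {aⁿbⁿ : n ≥ 0} is context-free but not regular, because matching the counts requires unbounded memory, modeled here by a counter standing in for a pushdown automaton's stack.

```python
import re

# Regular level: a*b* (any run of a's followed by any run of b's)
# is recognized by a regular expression, i.e., a finite automaton.
regular = re.compile(r"^a*b*$")

def a_n_b_n(s):
    """Recognize {a^n b^n}: context-free, provably not regular."""
    depth = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:
                return False  # an 'a' after any 'b' is ill-formed
            depth += 1
        elif ch == "b":
            seen_b = True
            depth -= 1        # each 'b' must cancel a preceding 'a'
            if depth < 0:
                return False
        else:
            return False
    return depth == 0         # counts must balance exactly
```

The finite automaton accepts "aabbb" (it cannot count), while the counter-based recognizer correctly rejects it; each higher level of the hierarchy admits languages the level below cannot recognize.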

Transformational grammar was the dominant research paradigm through the mid-1970s. The derivative[18] government and binding theory replaced it and remained influential through the early 1990s,[18] when linguists turned to a "minimalist" approach to grammar. This research focused on the principles and parameters framework, which explained children's ability to learn any language by filling open parameters (a set of universal grammar principles) that adapt as the child encounters linguistic data.[24] The minimalist program, initiated by Chomsky,[25] asks which minimal theory of principles and parameters fits most elegantly, naturally, and simply.[24] In an attempt to simplify language into a system that relates meaning and sound using the minimum possible faculties, Chomsky dispenses with concepts such as "deep structure" and "surface structure", instead emphasizing the plasticity of the brain's neural circuits, with which come an infinite number of concepts, or "logical forms".[5] When exposed to linguistic data, a hearer-speaker's brain proceeds to associate sound and meaning, and the rules of grammar we observe are in fact only the consequences, or side effects, of the way language works. Thus, while much of Chomsky's prior research focused on the rules of language, he now focuses on the mechanisms the brain uses to generate these rules and regulate speech.[5][26]

Notes

    • Smith 2004, p. 107: "Chomsky's early work was renowned for its mathematical rigor and he made some contribution to the nascent discipline of mathematical linguistics, in particular the analysis of (formal) languages in terms of what is now known as the Chomsky hierarchy."
    • Koerner 1983, p. 159: "Characteristically, Harris proposes a transfer of sentences from English to Modern Hebrew ... Chomsky's approach to syntax in Syntactic Structures and several years thereafter was not much different from Harris's approach, since the concept of 'deep' or 'underlying structure' had not yet been introduced. The main difference between Harris (1954) and Chomsky (1957) appears to be that the latter is dealing with transfers within one single language only"
    • Koerner 1978, pp. 41f: "it is worth noting that Chomsky cites Hjelmslev's Prolegomena, which had been translated into English in 1953, since the authors' theoretical argument, derived largely from logic and mathematics, exhibits noticeable similarities."
    • Seuren 1998, p. 166: "Both Hjelmslev and Harris were inspired by the mathematical notion of an algorithm as a purely formal production system for a set of strings of symbols. ... it is probably accurate to say that Hjelmslev was the first to try and apply it to the generation of strings of symbols in natural language"
    • Hjelmslev 1969, Prolegomena to a Theory of Language. Danish original 1943; first English translation 1954.
