Sociolinguistics and pragmatics



There is considerable overlap between pragmatics and sociolinguistics, since both share an interest in linguistic meaning as determined by usage in a speech community. However, sociolinguists tend to be more oriented towards variation within such communities: they describe how gender, race, and identity interact with individual speech acts. For example, the study of code-switching directly relates to pragmatics, since a switch in code effects a shift in pragmatic force.

Code-switching

Code-switching is a term in linguistics referring to the use of more than one language or variety in conversation. Bilinguals, speakers of at least two languages, can use elements of both languages when conversing with another bilingual. Code-switching is the syntactically and phonologically appropriate use of multiple varieties.

Code-switching can occur between sentences (intersentential) or within a single sentence (intrasentential), as in the often-cited example "Sometimes I'll start a sentence in English y termino en español."

Although some commentators have seen code-switching as reflecting a lack of language ability, most contemporary scholars consider code-switching to be a normal and natural product of interaction between the bilingual (or multilingual) speaker's languages.

Code-switching can be distinguished from other language contact phenomena such as loan translation (calques), borrowing, and pidgins.

Why do people code-switch?

· Code-switching a word or phrase from language B into language A can be more convenient than waiting for the mind to retrieve the appropriate word in language A.

· Code-switching can help an ethnic minority community retain a sense of cultural identity, in much the same way that slang is used to give a group of people a sense of identity and belonging, and to differentiate themselves from society at large.


Generative linguistics

During the twentieth century, following the work of Noam Chomsky, linguistics came to be dominated by the Generativist school, which is chiefly concerned with explaining how human beings acquire language and with the biological constraints on this acquisition. Generative theory is modularist in character. The term language module refers to a hypothesized structure in the human brain (an anatomical module) or cognitive system (a functional module) that some psycholinguists (e.g., Steven Pinker) claim contains innate capacities for language. According to Jerry Fodor, the module is immune to information from other sources not directly associated with language processing (Fodor, 2005). There is ongoing debate about this in the fields of cognitive science (psycholinguistics) and neuroscience (neurolinguistics).

Generative linguistics is a school of thought within linguistics that makes use of the concept of a generative grammar. Formally, a generative grammar is a finite set of rules that can be applied to generate all and only those sentences (often, but not necessarily, infinite in number) that are grammatical in a given language. This is the definition offered by Noam Chomsky, who invented the term. It is important to note that generate is being used as a technical term with a particular sense: to say that a grammar generates a sentence means that the grammar "assigns a structural description" to the sentence.
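To make the notion of "a finite set of rules generating sentences" concrete, here is a minimal sketch of a toy generative grammar in Python. The phrase structure rules and lexicon are invented for illustration and cover only a tiny fragment of English; they are not Chomsky's own rules.

```python
import random

# Toy phrase structure rules: each symbol rewrites to one of the
# listed sequences of symbols.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}

# Toy lexicon: terminal symbols rewrite to words.
LEXICON = {
    "Det": ["the", "a"],
    "N":   ["linguist", "sentence"],
    "V":   ["analyzes", "sleeps"],
}

def generate(symbol="S"):
    """Rewrite a symbol until only words remain."""
    if symbol in LEXICON:                      # terminal: pick a word
        return [random.choice(LEXICON[symbol])]
    expansion = random.choice(RULES[symbol])   # pick one rewrite rule
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))  # e.g. "the linguist analyzes a sentence"
```

Every string this grammar can produce counts as grammatical in the toy language, and every other string is ungrammatical; in the technical sense above, the grammar "generates" each such sentence by assigning it a structural description (here, implicitly, the derivation tree).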

The term generative grammar is also used to label the approach to linguistics taken by Chomsky and his followers. Chomsky's approach is characterised by the use of transformational grammar – a theory that has changed greatly since it was first promulgated by Chomsky in his 1957 book Syntactic Structures – and by the assertion of a strong linguistic nativism (and therefore an assertion that some set of fundamental characteristics of all human languages must be the same). A transformational grammar, or transformational-generative grammar (TGG), is a generative grammar, especially of a natural language.

Noam Chomsky, in his work Syntactic Structures, developed the idea that each sentence in a language has two levels of representation: a deep structure and a surface structure. The deep structure represented the core semantic relations of a sentence and was mapped onto the surface structure (which followed the phonological form of the sentence very closely) via transformations. Chomsky believed that there would be considerable similarities between languages' deep structures, and that these structures would reveal properties, common to all languages, which were concealed by their surface structures. Though transformations continue to be important in Chomsky's current theories, he has now abandoned the original notions of deep structure and surface structure. Initially, two additional levels of representation were introduced (LF, Logical Form, and PF, Phonetic Form); then, in the 1990s, Chomsky sketched out a new program of research known as Minimalism, in which deep structure and surface structure no longer featured and LF and PF remained as the only levels of representation.

In TGG, deep structures were generated by a set of phrase structure rules, and transformations mapped them to surface structures. For example, a typical transformation in TGG is the operation of subject-auxiliary inversion (SAI). This rule takes as its input a declarative sentence with an auxiliary, such as "John has eaten all the tomatoes," and transforms it into "Has John eaten all the tomatoes?" Another transformation turned active sentences into passive ones, and yet another raised embedded subjects into main-clause subject position in sentences such as "John seems to have gone." Terms such as "transformation" can give the impression that theories of transformational generative grammar are intended as a model for the processes through which the human mind constructs and understands sentences. Chomsky is clear that this is not in fact the case: a generative grammar models only the knowledge that underlies the human ability to speak and understand.
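The input-output behavior of a rule like SAI can be sketched in a few lines. The version below operates on a flat word list with an invented, incomplete set of auxiliaries; real transformational rules operate on phrase structure trees, so this only illustrates the mapping from declarative to interrogative.

```python
# Auxiliaries this toy rule recognizes (an illustrative, partial list).
AUXILIARIES = {"has", "have", "is", "are", "will", "can"}

def subject_aux_inversion(sentence):
    """Turn 'John has eaten ...' into 'Has John eaten ...?'"""
    words = sentence.rstrip(".").split()
    for i, word in enumerate(words):
        if word.lower() in AUXILIARIES:
            # Front the auxiliary; everything else keeps its order.
            inverted = [words[i].capitalize()] + words[:i] + words[i + 1:]
            return " ".join(inverted) + "?"
    return sentence  # no auxiliary found; the rule does not apply

print(subject_aux_inversion("John has eaten all the tomatoes."))
# -> "Has John eaten all the tomatoes?"
```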

One of the most important of Chomsky's ideas is that most of this knowledge is innate, with the result that a baby can have a large body of prior knowledge about the structure of language in general, and need only actually learn the idiosyncratic features of the language(s) it is exposed to. Chomsky originally theorized that children were born with a hard-wired language acquisition device (LAD) in their brains. He later expanded this idea into Universal Grammar, a set of innate principles and adjustable parameters that are common to all human languages. In generativist theory, the collection of fundamental properties all languages share is referred to as universal grammar (UG).

Universal grammar is a theory of linguistics postulating principles of grammar shared by all languages and thought to be innate to humans (linguistic nativism). It attempts to explain language acquisition in general, not to describe specific languages, proposing a set of rules intended to account for language acquisition in child development.

According to Chomsky, the presence of Universal Grammar in the brains of children allows them to deduce the structure of their native languages from "mere exposure".        

Much of the nativist position is based on the early age at which children show competency in their native grammars, as well as the ways in which they do (and do not) make errors. Infants are born able to distinguish between phonemes in minimal pairs, for example between bah and pah. Young children (under the age of three) do not speak in fully formed sentences, instead saying things like 'want cookie' or 'my coat.' They do not, however, say things like 'want my' or 'I cookie,' strings that would break the syntactic structure of the phrase, a component of universal grammar. Children also seem remarkably immune to error correction by adults, which nativists say would not be the case if children were learning from their parents.

Grammatical theories

Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. The first was the distinction between competence and performance. Chomsky noted the obvious fact that people, when speaking in the real world, often make linguistic errors (e.g. starting a sentence and then abandoning it midway through). He argued that these errors in linguistic performance were irrelevant to the study of linguistic competence (the knowledge that allows people to construct and understand grammatical sentences). Consequently, the linguist can study an idealised version of language, greatly simplifying linguistic analysis.

 The second idea related directly to the evaluation of theories of grammar. Chomsky made a distinction between grammars which achieved descriptive adequacy and those which went further and achieved explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar which achieves explanatory adequacy has the additional property that it gives an insight into the underlying linguistic structures in the human mind; that is, it does not merely describe the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, so if a grammatical theory has explanatory adequacy it must be able to explain the various grammatical nuances of the languages of the world as relatively minor variations in the universal pattern of human language. Chomsky argued that, even though linguists were still a long way from constructing descriptively adequate grammars, progress in terms of descriptive adequacy would only come if linguists held explanatory adequacy as their goal. In other words, real insight into the structure of individual languages could only be gained through the comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.

According to Chomsky, the notions "grammatical" and "ungrammatical" can be defined in a meaningful and useful way. He argued that the intuition of a native speaker is enough to define the grammaticality of a sentence; that is, if a particular string of English words elicits a double take, or a feeling of wrongness, in a native English speaker (when various extraneous factors affecting intuitions are controlled for), it can be said that the string of words is ungrammatical. This (according to Chomsky) is entirely distinct from the question of whether a sentence is meaningful or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example "colorless green ideas sleep furiously". But such sentences manifest a linguistic problem distinct from that posed by meaningful but ungrammatical (non-)sentences such as "man the bit sandwich the", the meaning of which is fairly clear, but which no native speaker would accept as well formed.

The use of such intuitive judgments permitted generative syntacticians to base their research on a methodology in which the study of language through a corpus of observed speech was downplayed, since the grammatical properties of constructed sentences were considered appropriate data on which to build a grammatical model.


Cognitive linguistics

Cognitive linguistics refers to the school of linguistics that understands language creation, learning, and usage as best explained by reference to human cognition in general. It is characterized by adherence to three central positions:

1. it denies that there is an autonomous linguistic faculty in the mind;

2. it understands grammar in terms of conceptualization;

3. it claims that knowledge of language arises out of language use.

Cognitive linguists deny that the mind has any module for language acquisition that is unique and autonomous. This stands in contrast to the work done in the field of generative grammar. Although cognitive linguists do not necessarily deny that part of the human linguistic ability is innate, they deny that it is separate from the rest of cognition. Thus, they argue that knowledge of linguistic phenomena — i.e., phonemes, morphemes, and syntax — is essentially conceptual in nature. Moreover, they argue that the storage and retrieval of linguistic data is not significantly different from the storage and retrieval of other knowledge, and that the use of language in understanding employs cognitive abilities similar to those used in other, non-linguistic tasks. (A concept is a cognitive unit of meaning: an abstract idea or a mental symbol.)

Concepts are bearers of meaning, as opposed to agents of meaning. A single concept can be expressed in any number of languages: the concept of DOG can be expressed as dog in English, as Hund in German, as chien in French, and as perro in Spanish. The fact that concepts are in some sense independent of language makes translation possible: words in different languages can have identical meanings because they express one and the same concept.
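This independence of concepts from their lexicalizations can be pictured as a simple lookup structure. The sketch below, with an invented concept table and translate helper, routes translation through the shared concept rather than from word to word.

```python
# One concept, several lexicalizations; the table is illustrative.
CONCEPT_LEXICALIZATIONS = {
    "DOG": {"en": "dog", "de": "Hund", "fr": "chien", "es": "perro"},
}

def translate(word, source_lang, target_lang):
    """Translate by finding the shared concept, then reading off
    the target language's word for it."""
    for concept, forms in CONCEPT_LEXICALIZATIONS.items():
        if forms.get(source_lang) == word:
            return forms[target_lang]
    raise KeyError(f"no concept found for {word!r} in {source_lang!r}")

print(translate("Hund", "de", "fr"))  # -> "chien"
```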

In cognitive linguistics, abstract concepts are transformations of concrete concepts derived from embodied experience.

Departing from the tradition of truth-conditional semantics, cognitive linguists view meaning in terms of conceptualization. Instead of viewing meaning in terms of models of the world, they view it in terms of mental spaces.

The main area of study in cognitive linguistics is cognitive semantics, which deals mainly with lexical semantics.

Cognitive semantics is a part of the cognitive linguistics movement. The main tenets of cognitive semantics are, first, that grammar is conceptualization; second, that conceptual structure is embodied and motivated by usage; and third, that the ability to use language draws upon general cognitive resources and not a special language module.

The cognitive semantics approach rejects the traditional separation of linguistics into phonology, syntax, pragmatics, etc. Instead, it divides semantics (meaning) into meaning-construction and knowledge representation. Therefore, cognitive semantics studies much of the area traditionally devoted to pragmatics.

The techniques native to cognitive semantics are typically used in lexical studies by Leonard Talmy, George Lakoff, Dirk Geeraerts and Bruce Wayne Hawkins.

Leonard Talmy is a professor of linguistics and philosophy at the University at Buffalo in New York. He is most famous for his pioneering work in cognitive linguistics, more specifically, in the relationship between semantic and formal linguistic structures and the connections between semantic typologies and universals. He also specializes in the study of Yiddish and Native American linguistics.

George P. Lakoff is a professor of cognitive linguistics at the University of California, Berkeley. He is most famous for his ideas about the centrality of metaphor to human thinking, political behavior, and society, and in particular for his concept of the "embodied mind," which he has written about in relation to mathematics. Lakoff's original thesis on conceptual metaphor was set out in Metaphors We Live By (1980), co-authored with Mark Johnson. Within the Western scientific tradition, metaphor had been seen as a purely linguistic construction. The essential thrust of Lakoff's work has been the argument that metaphors are primarily a conceptual construction, and indeed are central to the development of thought. He says, "Our ordinary conceptual system, in terms of which we think and act, is fundamentally metaphorical in nature." For Lakoff, non-metaphorical thought is possible only when we talk about purely physical reality, and the greater the level of abstraction, the more layers of metaphor are required to express it. People do not notice these metaphors for various reasons: some metaphors become 'dead' and we no longer recognize their origin, or we simply don't "see" what is "going on".

For instance, in intellectual debate the underlying metaphor is usually that argument is war:


He won the argument.

Your claims are indefensible.

He shot down all my arguments.

His criticisms were right on target.

If you use that strategy, he'll wipe you out.

For Lakoff, the development of thought has been the process of developing better metaphors. The application of one domain of knowledge to another domain of knowledge offers new perceptions and understandings.

A key point to understand about conceptual metaphor is that such metaphors are not static idiomatic structures tied to particular linguistic expressions. Rather, it is the entirety of the source domain that is utilized as a means for understanding the target domain, and many different elements of the source domain can be mapped to corresponding elements of the target domain. Thus, in mapping the source domain of “heat of a fluid” to the target domain of “anger”, various expressions associated with the source domain can be used to talk and think about the target domain:

I reached my boiling point.
She blew up at me.
His anger boiled over.
Their anger simmered during the presentation.
He erupted in anger.
I could feel my anger building up inside.
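Such a mapping can be made explicit as a table of correspondences between elements of the source and target domains. The particular correspondences below are an illustrative reconstruction, not a list taken from Lakoff and Johnson.

```python
# Conceptual metaphor as an explicit source-to-target mapping.
METAPHOR = {
    "source": "heat of a fluid",
    "target": "anger",
    "mapping": {
        "container":     "the angry person's body",
        "fluid":         "the anger itself",
        "heat":          "intensity of the anger",
        "boiling point": "limit of self-control",
        "explosion":     "loss of control",
    },
}

for src, tgt in METAPHOR["mapping"].items():
    print(f"{src!r} in {METAPHOR['source']!r} "
          f"corresponds to {tgt!r} in {METAPHOR['target']!r}")
```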


Frame semantics

Charles J. Fillmore was an American linguist and an Emeritus Professor of Linguistics at the University of California, Berkeley. He received his Ph.D. in Linguistics from the University of Michigan in 1961 and spent ten years at The Ohio State University before joining Berkeley's Department of Linguistics in 1971. He was a Fellow at the Center for Advanced Study in the Behavioral Sciences. He was extremely influential in the areas of syntax and lexical semantics, and was one of the founders of cognitive linguistics and of frame semantics (1976). Frame semantics attempts to explain the meanings of words in terms of their relation to general understanding, not just in the terms laid out by truth-conditional semantics. Fillmore explains meaning in general in terms of "frames". A "frame" is any concept that can only be understood if a larger system of concepts is also understood: a subconscious schematic representation of a particular type of situation, together with a mental list of the different participants, props, and other conceptual roles that are seen as components of such situations.

Fillmore: framing

Many pieces of linguistic evidence motivate the frame-semantic project. First, it has been noted that word meaning is an extension of our bodily and cultural experiences. For example, the notion of restaurant is associated with a series of concepts, like food, service, waiters, tables, and eating. These rich-but-contingent associations cannot be captured by an analysis in terms of necessary and sufficient conditions, yet they still seem to be intimately related to our understanding of "restaurant".
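One way to picture a frame is as a structured record of a situation type. The sketch below encodes the restaurant example with invented field names and fillers; actual frame-semantic analyses are far richer.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """A situation type with its conceptual roles."""
    name: str
    participants: list = field(default_factory=list)
    props: list = field(default_factory=list)
    events: list = field(default_factory=list)

RESTAURANT = Frame(
    name="restaurant",
    participants=["customer", "waiter", "cook"],
    props=["table", "menu", "food", "bill"],
    events=["ordering", "eating", "paying"],
)

# Understanding one word ("menu") presupposes the whole frame:
others = [p for p in RESTAURANT.props if p != "menu"]
print(f"'menu' is a prop in the {RESTAURANT.name} frame, "
      f"alongside {', '.join(others)}")
```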

Second, and more seriously, these conditions are not enough to account for asymmetries in the ways that words are used. According to a semantic feature analysis, there is nothing more to the meanings of "boy" and "girl" than: BOY [+MALE], [+YOUNG]; GIRL [+FEMALE], [+YOUNG]. And there is surely some truth to this proposal; indeed, cognitive semanticists allow that the instances of the concept associated with a word may stand in a schematic relation to the concept itself. However, linguists have found that language users regularly apply the terms "boy" and "girl" in ways that go beyond mere semantic features. For instance, people are more likely to call a borderline-young female a "girl" (as opposed to a "woman") than they are to call a borderline-young male a "boy" (as opposed to a "man").
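The feature analysis itself is easy to state formally, which is exactly why its limits are instructive. The sketch below encodes the four words as binary feature sets (a toy rendering of the [+MALE], [+YOUNG] notation); nothing in it can represent the usage asymmetry just described.

```python
# Words as bundles of binary semantic features.
FEATURES = {
    "boy":   {"MALE": True,  "YOUNG": True},
    "girl":  {"MALE": False, "YOUNG": True},
    "man":   {"MALE": True,  "YOUNG": False},
    "woman": {"MALE": False, "YOUNG": False},
}

def shared_features(a, b):
    """Features on which two words agree."""
    return {f for f in FEATURES[a] if FEATURES[a][f] == FEATURES[b][f]}

print(shared_features("boy", "girl"))  # -> {'YOUNG'}
print(shared_features("boy", "man"))   # -> {'MALE'}
```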

This fact suggests that there is a latent frame, made up of cultural attitudes, expectations, and background assumptions, which is part of word meaning. These background assumptions go above and beyond the necessary and sufficient conditions that correspond to a semantic feature account. Frame semantics, then, seeks to account for these puzzling features of lexical items in a systematic way. With the frame-semantic paradigm's analytical tools, the linguist is able to explain a wider range of semantic phenomena than would be possible with necessary and sufficient conditions alone. Some words have the same definitions or intensions, and the same extensions, but subtly different domains. For example, the lexemes land and ground are synonyms, yet they naturally contrast with different things: sea and air, respectively.

Fillmore's major later project, FrameNet, is a wide-ranging online description of the English lexicon. In this project, words are described in terms of the frames they evoke. Data is gathered from the British National Corpus, annotated for semantic and syntactic relations, and stored in a database organized by both lexical items and frames. The project has been influential: Issue 16 of the International Journal of Lexicography was devoted entirely to it. It has also inspired parallel projects for other languages, including Spanish, German, and Japanese.
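FrameNet data can be explored programmatically. The sketch below assumes NLTK's FrameNet corpus reader (nltk.corpus.framenet) and the FrameNet 1.7 release; attribute names follow NLTK 3.x and may differ in other versions or tools.

```python
import nltk

nltk.download("framenet_v17")  # fetch the FrameNet 1.7 data once
from nltk.corpus import framenet as fn

# Look up frames whose names match a pattern, then inspect each one.
for frame in fn.frames(r"(?i)revenge"):
    print(frame.name)
    print(frame.definition[:80], "...")
    print("Frame elements:", sorted(frame.FE.keys())[:5])
```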


 

