Semantics

From Wikipedia, the free encyclopedia

Semantics is the study of meaning. The word "semantics" itself denotes a range of ideas, from the popular to the highly technical. In ordinary language it is often used to describe a problem of understanding that comes down to word selection or connotation, a problem that has been the subject of many formal inquiries over a long period of time. The word derives from the Greek σημαντικός (semantikos), "significant",[1] from σημαίνω (semaino), "to signify, to indicate", itself from σήμα (sema), "sign, mark, token".[2] In linguistics, semantics is the study of the interpretation of signs or symbols as used by agents or communities within particular circumstances and contexts.[3] Within this view, sounds, facial expressions, body language, and proxemics all have semantic (meaningful) content, and each has several branches of study. In written language, such things as paragraph structure and punctuation carry semantic content; other forms of language carry other semantic content.[4]

The formal study of semantics has many subfields, including proxemics, lexicology, syntax, pragmatics, etymology, and others, although semantics in and of itself is a well-defined field in its own right, often with synthetic properties.[5] In the philosophy of language, semantics and reference are closely related fields. Further related fields include philology, communication studies, and semiotics. The formal study of semantics is therefore complex.

As a result, those who study meaning differ on what constitutes meaning. For example, in the sentence "John loves a bagel", the word bagel may refer to the object itself, which is its literal meaning or denotation, but it may also refer to many other figurative associations, such as how it satisfies John's hunger, which may be its connotation. Traditionally, the formal semantic view restricts semantics to its literal meaning and relegates all figurative associations to pragmatics, but many find this distinction difficult to defend.[6] The degree to which a theorist subscribes to the literal-figurative distinction decreases as one moves from the formal semantic, through the semiotic and pragmatic, to the cognitive semantic traditions.

The word semantic in its modern sense is considered to have first appeared in French as sémantique in Michel Bréal's 1897 book, Essai de sémantique. In International Scientific Vocabulary semantics is also called semasiology. The discipline of semantics is distinct from Alfred Korzybski's General Semantics, which is a system for looking at the semantic reactions of the whole human organism in its environment to some event, symbolic or otherwise.

Linguistics

In linguistics, semantics is the subfield that is devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (referred to as texts). The basic areas of study are the meaning of signs and the relations between different linguistic units: homonymy, synonymy, antonymy, polysemy, paronymy, hypernymy, hyponymy, meronymy, metonymy, holonymy, exocentricity/endocentricity, and linguistic compounds. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition from smaller units of meaning. Traditionally, semantics has included the study of connotative sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.

Formal semanticists are concerned with the modeling of meaning in terms of the semantics of logic. Thus the sentence John loves a bagel above can be broken down into its constituents (signs), of which the unit loves may serve as both syntactic and semantic head.

In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of lambda calculus. Thus, the syntactic parse of the sentence above would now indicate loves as the head, and its entry in the lexicon would point to the arguments as the agent, John, and the object, bagel, with a special role for the article "a" (which Montague called a quantifier). This resulted in the sentence being associated with the logical predicate loves(John, bagel), thus linking semantics to categorial grammar models of syntax. The logical predicate thus obtained would be elaborated further, e.g. using truth theory models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis of the 1970s.
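
As a rough sketch of how such an analysis proceeds (a simplified, extensional fragment; full Montague grammar tracks intensional types that are omitted here):

  loves               ⇒  λy.λx. loves(x, y)
  a bagel             ⇒  λP. ∃z (bagel(z) ∧ P(z))
  loves a bagel       ⇒  λx. ∃z (bagel(z) ∧ loves(x, z))
  John loves a bagel  ⇒  ∃z (bagel(z) ∧ loves(John, z))

The quantifier "a" contributes the existential ∃z, while the proper name John saturates the verb's remaining argument position.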

Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, and led to several attempts at incorporating context, such as:

  • Situation semantics (1980s): truth-values are incomplete; they get assigned based on context.
  • Generative lexicon (1990s): categories (types) are incomplete and get assigned based on context.

The dynamic turn in semantics

In the Chomskyan tradition in linguistics there was no mechanism for the learning of semantic relations, and the nativist view considered all semantic notions as inborn. Thus, even novel concepts were proposed to have been dormant in some sense. This traditional view was also unable to address many issues, such as metaphor and associative meanings; semantic change, where meanings within a linguistic community change over time; and qualia, or subjective experience. Another issue not addressed by the nativist model was how perceptual cues are combined in thought, e.g. in mental rotation.[7]

This traditional view of semantics, as an innate finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is now being fiercely debated in the emerging domain of cognitive linguistics[8] and also in the non-Fodorian camp in the philosophy of language.[9] The challenge is motivated by:

  • factors internal to language, such as the problem of resolving indexicals and anaphora (e.g. this x, him, last week). In these situations "context" serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic, and the meaning of sentences is viewed as context-change potentials rather than propositions (a toy sketch of this idea follows this list).
  • factors external to language, i.e. language is not a set of labels stuck on things, but "a toolbox, the importance of whose elements lie in the way they function rather than their attachments to things."[9] This view reflects the position of the later Wittgenstein and his famous game example, and is related to the positions of Quine, Davidson, and others.
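
The first of these factors can be made concrete with a toy model. In the sketch below (the list-of-referents representation of a context and the function names are illustrative assumptions, not a standard formalism), each sentence meaning is a function from contexts to contexts, so interpreting a discourse threads the context through each sentence in turn:

  # A sentence meaning as a context-change potential: context -> context.

  def a_man_walked_in(context):
      # "A man walked in": introduces a fresh discourse referent.
      return context + [{"referent": "x1", "kind": "man", "walked_in": True}]

  def he_sat_down(context):
      # "He sat down": the pronoun is resolved against the current context.
      antecedent = next(r for r in reversed(context) if r["kind"] == "man")
      antecedent["sat_down"] = True
      return context

  discourse = [a_man_walked_in, he_sat_down]
  context = []                     # context is the input ...
  for sentence in discourse:
      context = sentence(context)  # ... and the output of interpretation

  print(context)
  # [{'referent': 'x1', 'kind': 'man', 'walked_in': True, 'sat_down': True}]

The pronoun in the second sentence is interpreted against whatever the first sentence added to the context, which is the sense in which an utterance both consumes and modifies its context.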

A concrete example of the latter phenomenon is semantic underspecification: meanings are not complete without some elements of context. To take an example of a single word, "red", its meaning in a phrase such as red book is similar to many other usages and can be viewed as compositional.[10] However, the colours implied in phrases such as "red wine" (very dark), "red hair" (coppery), "red soil", or "red skin" are very different. Indeed, these colours by themselves would not be called "red" by native speakers. These instances are contrastive, so "red wine" is so called only in comparison with the other kind of wine (which is also not "white" for the same reasons). This view goes back to de Saussure:

Each of a set of synonyms like redouter ('to dread'), craindre ('to fear'), avoir peur ('to be afraid') has its particular value only because they stand in contrast with one another. No word has a value that can be identified independently of what else is in its vicinity.[11]

and may go back to earlier Indian views on language, especially the Nyaya view of words as indicators and not carriers of meaning.[12]

An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the Generative Lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated on the fly based on finite context.
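
A minimal sketch of the flavour of this approach (the qualia fields and the single coercion rule below are drastic simplifications invented for illustration, not Pustejovsky's full formalism):

  # A simplified Generative Lexicon entry with qualia roles.
  LEXICON = {
      "book": {
          "type": "physical_object",
          "telic": "read",      # what the object is for
          "agentive": "write",  # how the object comes into being
      }
  }

  def coerce(verb, noun):
      # Type coercion: "begin" selects for an event; if the noun denotes
      # an object instead, its telic quale supplies the event reading.
      entry = LEXICON[noun]
      if verb == "begin" and entry["type"] != "event":
          return "begin " + entry["telic"] + "ing the " + noun
      return verb + " the " + noun

  print(coerce("begin", "book"))  # -> "begin reading the book"

The point is that "begin the book" receives an event reading not from a separate lexical sense of book, but from a contextual operation applied to one underspecified entry.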

Prototype theory

Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch and George Lakoff in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members.
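
The contrast with necessary-and-sufficient conditions can be made concrete with a toy model (the features and weights below are invented for illustration): membership in a category is a graded score of similarity to a prototype rather than a boolean test.

  # Graded category membership as similarity to a prototype (toy model).
  BIRD_PROTOTYPE = {"flies": 1.0, "sings": 1.0, "small": 1.0, "lays_eggs": 1.0}

  def membership(instance, prototype=BIRD_PROTOTYPE):
      # Score in [0, 1]: weighted share of prototype features the instance has.
      shared = sum(instance.get(f, 0.0) * w for f, w in prototype.items())
      return shared / sum(prototype.values())

  robin = {"flies": 1.0, "sings": 1.0, "small": 1.0, "lays_eggs": 1.0}
  penguin = {"flies": 0.0, "sings": 0.0, "small": 0.0, "lays_eggs": 1.0}

  print(membership(robin))    # 1.0  -> a central member of BIRD
  print(membership(penguin))  # 0.25 -> a peripheral member: fuzzy boundary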

Systems of categories are not objectively "out there" in the world but are rooted in people's experience. These categories evolve as learned concepts of the world; meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the "grounding of our conceptual systems in shared embodiment and bodily experience".[6] A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Sapir–Whorf hypothesis or Eskimo words for snow).

Language analysis has found that English nouns can be characterized by 25 different semantic features, each associated with its own pattern of fMRI brain activity. The individual contribution of each feature predicts the fMRI pattern when nouns are considered, supporting the view that nouns derive their meaning from prior experience linked to a common symbol.[13]
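
The underlying model is essentially a weighted sum: a noun's predicted activation at each brain location is a linear combination of its semantic feature values. A schematic sketch (the array sizes and random values are placeholders, not the study's data):

  import numpy as np

  # Linear encoding model: predicted activity = features x learned weights.
  n_features, n_voxels = 25, 500
  rng = np.random.default_rng(0)
  weights = rng.normal(size=(n_features, n_voxels))  # one weight map per feature
  noun_features = rng.random(n_features)             # feature values for one noun

  predicted_activity = noun_features @ weights       # one prediction per voxel
  print(predicted_activity.shape)                    # (500,)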


Computer science

In computer science, where it is considered an application of mathematical logic, semantics reflects the meaning of programs or functions.

In this regard, semantics permits programs to be separated into their syntactical part (grammatical structure) and their semantic part (meaning). For instance, statements such as the following use different syntaxes (languages), but result in the same semantics:
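
  • x += y;          (C, C++, Java)
  • x := x + y;      (Pascal, Ada)
  • LET X = X + Y    (early BASIC)
  • ADD Y TO X.      (COBOL)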

Generally, these statements would all perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x'.

Semantics for computer applications falls into three categories:[14]

  • Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.
  • Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.
  • Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
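
For a single construct, the assignment x := a, the three styles can be sketched side by side (notation loosely after Nielson and Nielson,[14] where s is a state mapping variables to values and A⟦a⟧s is the value of expression a in state s):

  • Operational:   ⟨x := a, s⟩ ⇒ s[x ↦ A⟦a⟧s]   (one execution step)
  • Denotational:  S⟦x := a⟧s = s[x ↦ A⟦a⟧s]    (a function from states to states)
  • Axiomatic:     { P[a/x] } x := a { P }       (Hoare's assignment axiom)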

The Semantic Web refers to the extension of the World Wide Web through the embedding of additional semantic metadata; see, for example, the Web Ontology Language (OWL).

Psychology

In psychology, semantic memory is memory for meaning; in other words, it is the aspect of memory that preserves only the gist, the general significance, of remembered experience, while episodic memory is memory for the ephemeral details, the individual features, or the unique particulars of experience. Word meanings are measured by the company they keep, that is, by the relationships among words themselves in a semantic network. In a network created by people analyzing their understanding of words (such as WordNet), the links and decomposition structures of the network are few in number and kind, and include "part of", "kind of", and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines, as well as natural language processing, neural network, and predicate calculus techniques.
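
A minimal sketch of the "company they keep" idea (the toy corpus and function names are invented for illustration): each word is represented by its co-occurrence counts with other words, and similarity of meaning is approximated by the cosine of the angle between those count vectors.

  import math
  from collections import Counter

  # Toy corpus; real systems use millions of sentences.
  corpus = [
      "the doctor treated the patient",
      "the nurse treated the patient",
      "the doctor examined the patient",
      "the banker counted the money",
  ]

  def vector(word, sentences):
      # Co-occurrence counts of `word` with every other word in a sentence.
      counts = Counter()
      for sentence in sentences:
          tokens = sentence.split()
          if word in tokens:
              counts.update(t for t in tokens if t != word)
      return counts

  def cosine(u, v):
      dot = sum(u[k] * v[k] for k in u if k in v)
      norm = lambda w: math.sqrt(sum(c * c for c in w.values()))
      return dot / (norm(u) * norm(v)) if u and v else 0.0

  print(cosine(vector("doctor", corpus), vector("nurse", corpus)))   # ~0.96
  print(cosine(vector("doctor", corpus), vector("banker", corpus)))  # ~0.70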

References

  1. ^ "Semantikos, Henry George Liddell, Robert Scott, A Greek-English Lexicon, at Perseus". http://www.perseus.tufts.edu/cgi-bin/ptext?doc=Perseus%3Atext%3A1999.04.0057%3Aentry%3D%2393797. 
  2. ^ "Semaino, Henry George Liddell, Robert Scott, An Intermediate Greek-English Lexicon, at Perseus". http://www.perseus.tufts.edu/cgi-bin/ptext?doc=Perseus%3Atext%3A1999.04.0058%3Aentry%3D%2329446. 
  3. ^ Neurath, Carnap & Morris (1955). International Encyclopedia of Unified Science. 
  4. ^ Otto Neurath, Rudolf Carnap, and Charles F. W. Morris (eds.) (1955). International Encyclopedia of Unified Science. Chicago, IL: University of Chicago Press. 
  5. ^ Cruse, Alan. Meaning in Language: An Introduction to Semantics and Pragmatics, chapter 1. Oxford Textbooks in Linguistics, 2004; Kearns, Kate. Semantics. Palgrave Macmillan, 2000; Cruse, D. A. Lexical Semantics. Cambridge University Press, 1986.
  6. ^ a b George Lakoff and Mark Johnson (1999). Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought, chapter 1. New York: Basic Books. 
  7. ^ Barsalou, L. (1999). "Perceptual Symbol Systems". Behavioral and Brain Sciences 22(4). 
  8. ^ Ronald W. Langacker (1999). Grammar and Conceptualization. Berlin/New York: Mouton de Gruyter. ISBN 3110166038. 
  9. ^ a b Jaroslav Peregrin (2003). Meaning: The Dynamic Turn. Current Research in the Semantics/Pragmatics Interface. London: Elsevier. 
  10. ^ P. Gardenfors (2000). Conceptual Spaces. Cambridge, MA: MIT Press/Bradford Books. 
  11. ^ Ferdinand de Saussure (1916). Course in General Linguistics (Cours de linguistique générale). 
  12. ^ Bimal Krishna Matilal (1990). The word and the world: India's contribution to the study of language. Oxford.  The Nyaya and Mimamsa schools in Indian vyakarana tradition conducted a centuries-long debate on whether sentence meaning arises through composition on word meanings, which are primary; or whether word meanings are obtained through analysis of sentences where they appear. (Chapter 8).
  13. ^ Mitchell, T. M., Shinkareva, S., Carlson, A., Chang, K., Malave, V., Mason, R., and Just, M. (2008). "Predicting Human Brain Activity Associated with the Meanings of Nouns". Science 320: 1191–1195. doi:10.1126/science.1152876. PMID 18511683. 
  14. ^ Nielson, Hanne Riis; Nielson, Flemming (1995). Semantics with Applications: A Formal Introduction (1st ed.). Chichester, England: John Wiley & Sons. ISBN 0-471-92980-8.
