Semantics

Page history last edited by Cameron Smith 10 years, 2 months ago

"Whatever you say it is, it isn't."

- Alfred Korzybski

 

The study of semantics is not strictly a psychological field; semantics is studied across a variety of disciplines, including psychology, linguistics, philosophy, computer science, and more. The English word "semantics" comes from the Greek word "sēmantikós." Semantics is literally the study of meaning, explored primarily through the meaning of expressions and phrases in human languages. As a whole, the study of semantics can be understood as the study of signifiers (words, phrases, etc.) and what they stand for, their denotata. Words have both denotations and connotations. The denotation of a word is its essential, central meaning; a word's connotation is its set of implications and emotional associations. For example, the denotation of the word "cat" is the relationship between the word itself and the actual animal, a cat, while its connotations might include "nice," "friendly," or "aloof." In short, the connotation of a word is anything associated with the word beyond its direct meaning.

 

As a part of language, semantics is often considered inseparable from other aspects of language, such as syntax. However, semantics can be completely separated from those other aspects. Noam Chomsky, in his dissertation "The Logical Structure of Linguistic Theory," showed that an utterance can be syntactically correct while making no semantic sense whatsoever. His example was "Colorless green ideas sleep furiously." This sentence is syntactically sound (grammatically correct), yet it makes no semantic sense. Chomsky used it to illustrate the distinct difference between syntax and semantics, as well as the fact that semantic well-formedness is not a necessary property of a grammatical sentence.

 

 

Theories of Semantics:

 

Referential Theory of Meaning: The referential theory of meaning states that words mean what they refer to. While this is one of the oldest theories of meaning, it has two notable flaws. The first concerns how the theory handles abstract concepts: to give a word a meaning under this theory, you must be able to point to the word's referent. How can you show the referent of the word "love" or "success"? Neither word has a universal referent. The second problem is the dissociation between a word and the thing(s) to which it refers. A classic example (due to Gottlob Frege) is giving the planet Venus one name, "the Morning Star," when you see it in the morning, and another, "the Evening Star," when you see it at night. The two names refer to the same thing but have different "senses": they effectively mean different things, one pertaining solely to the morning and one solely to the evening, despite referring to the exact same planet.
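Both flaws above can be sketched in a few lines. Treating the referential theory as nothing more than a mapping from names to referents (the names and referents below are just the illustrative data from the example), distinct senses collapse and abstract terms get no meaning at all:

```python
# Referential theory as a bare name -> referent mapping.
# Illustrative data: "Morning Star" and "Evening Star" both name Venus.
reference = {
    "Morning Star": "Venus",
    "Evening Star": "Venus",
}

# Under a purely referential theory the two names are indistinguishable,
# even though they carry different senses:
print(reference["Morning Star"] == reference["Evening Star"])  # True

# Abstract terms expose the other flaw: there is no referent to point to,
# so the theory assigns them no meaning at all.
print(reference.get("love"))  # None
```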

 

Model-Theoretic Semantics (Truth-Theoretic Semantics): Also known as formal semantics, this approach seeks to understand the meaning of words by constructing mathematical models and algorithms that mimic the principles humans use to relate words to their meanings. While these theories help refine what a word might mean, they say little about how we, as humans, actually represent word meaning.
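A minimal sketch of the model-theoretic idea, with an invented domain and invented predicate extensions: the meaning of a simple sentence is identified with its truth conditions relative to a model.

```python
# A toy model: a domain of individuals plus extensions for each predicate.
# The individuals and predicates are made up for illustration.
model = {
    "domain": {"tweety", "rex", "emu1"},
    "bird":   {"tweety", "emu1"},
    "flies":  {"tweety"},
}

def true_in_model(predicate, individual, model):
    """An atomic sentence 'predicate(individual)' is true iff the
    individual belongs to the predicate's extension in the model."""
    return individual in model.get(predicate, set())

print(true_in_model("bird", "emu1", model))   # True:  "emu1 is a bird"
print(true_in_model("flies", "emu1", model))  # False: "emu1 flies"
```

The point of the sketch is the shape of the theory, not the particular facts: meaning is relativized to a model, so changing the extensions changes which sentences come out true.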

 

Semantic Networks: The semantic network approach is based on the idea that the meaning of a word comes from the way it is embedded in a network of other meanings. A purely associative network (in which a word's meaning is derived from whatever we associate with the word) is insufficient to capture the entire meaning of a word; semantic networks were proposed to fix this insufficiency. In a semantic network, the connections between individual word meanings have meaning themselves, and the networks organize semantic information hierarchically. For example, an emu is a type of flightless bird, which is a type of bird, which is a type of animal, and so on. The connections between these categories carry meanings of their own; one example is the "ISA link," which indicates that the lower node IS A member of the category above it. While seemingly complicated, semantic network models are an economical way of organizing word meaning: each level stores a set of information with it (e.g., the level "bird" stores the information "has wings").
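The hierarchy described above can be sketched directly: ISA links form a chain upward, properties are stored once at the level where they apply, and a lookup climbs the links until it finds the property (the economy the paragraph mentions). Node names and properties are illustrative.

```python
# ISA links: each concept points to its immediate superordinate category.
isa = {"emu": "flightless bird", "flightless bird": "bird", "bird": "animal"}

# Properties stored at the level where they hold, not repeated below it.
properties = {
    "animal": {"breathes"},
    "bird": {"has wings", "lays eggs"},
    "flightless bird": {"cannot fly"},
    "emu": {"lives in Australia"},
}

def has_property(concept, prop):
    """Climb ISA links upward until the property is found (or the top)."""
    while concept is not None:
        if prop in properties.get(concept, set()):
            return True
        concept = isa.get(concept)  # follow one ISA link up
    return False

print(has_property("emu", "has wings"))   # True, inherited from "bird"
print(has_property("emu", "cannot fly"))  # True, stored at "flightless bird"
print(has_property("bird", "cannot fly")) # False, not stored at or above "bird"
```

Storing "has wings" once at "bird" rather than at every bird species is exactly the economy such networks buy.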

 

 

Semantic Features: This theory does not base word meaning on the position of a word within a network; instead, word meaning comes from the decomposition of a word into smaller units of meaning, its semantic features. These features are represented as flat lists, so no information is stored hierarchically. For example, the word "boyfriend" may be composed of the semantic features "male" and "human." The biggest problem with this theory is that it is difficult to test and to determine what counts as a "semantic feature" rather than simply another word that itself needs defining.
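Since the theory treats meanings as flat feature lists, they are naturally modeled as sets, and similarity between words falls out as feature overlap. The feature inventories below are invented for illustration, not a claimed standard decomposition.

```python
# Semantic features as flat, non-hierarchical sets (illustrative inventories).
features = {
    "boyfriend":  {"human", "male", "romantic-partner"},
    "girlfriend": {"human", "female", "romantic-partner"},
    "stallion":   {"animal", "male", "horse"},
}

def shared_features(word_a, word_b):
    """Word similarity as the overlap between two feature sets."""
    return features[word_a] & features[word_b]

print(sorted(shared_features("boyfriend", "girlfriend")))  # ['human', 'romantic-partner']
print(sorted(shared_features("boyfriend", "stallion")))    # ['male']
```

The theory's weakness shows up immediately in code: "romantic-partner" is doing real work here, yet it is no more primitive than "boyfriend" itself.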

 

Semantic Microfeatures: The semantic microfeature approach is very similar to the semantic feature approach, with one main difference. Instead of a list of features accessed serially, this theory holds that word meaning is accessed through the simultaneous activation of many microfeatures. One of the most important aspects of this model is that while some microfeatures correspond to simple semantic features, others correspond to more abstract features that have no straightforward linguistic equivalent.
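The "simultaneous activation" idea can be sketched as a vector per word, one activation value per microfeature, with similarity computed over the whole pattern at once rather than feature by feature. The vectors and dimension labels below are invented for illustration; note that two of the dimensions are deliberately left abstract, mirroring the point that not every microfeature has a linguistic name.

```python
# Each word is a pattern of activation across the same microfeature dimensions.
#             [animate, human, male, <abstract-1>, <abstract-2>]
micro = {
    "boyfriend": [1.0, 1.0, 1.0, 0.3, 0.7],
    "man":       [1.0, 1.0, 1.0, 0.1, 0.2],
    "table":     [0.0, 0.0, 0.0, 0.8, 0.1],
}

def overlap(u, v):
    """Similarity as the dot product of two activation patterns:
    every dimension contributes simultaneously."""
    return sum(a * b for a, b in zip(u, v))

print(overlap(micro["boyfriend"], micro["man"]))   # large: similar meanings
print(overlap(micro["boyfriend"], micro["table"])) # small: dissimilar meanings
```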

 

 

External Links:

 

1: PowerPoint Overview of Semantics A relatively surface-level PowerPoint that serves as an introduction to semantics.

2: General Overview of Semantics An in-depth introduction to semantics.

3: Lecture Notes from University of Pennsylvania Lecture notes for a lecture on semantics from an introductory linguistics class at the University of Pennsylvania.

4: What is Semantics? An introductory article to semantics from a professor at the University of Michigan. 

5: TED Talk: Steven Pinker: What our language habits reveal A TED talk from Steven Pinker dealing with semantics and our language habits.

6: Semantic Wars: Computer Game With a Linguistic Twist An online game that gives semantic categories and instructs you to guess the word as your castle engages another castle in war.

7: WordNet An English lexical database from Princeton that functions similarly to a thesaurus; unlike a thesaurus, however, it also includes word senses, semantic relations, and word forms for words in the English language.

8: FrameNet A project by the University of California at Berkeley based around semantic frames (conceptual structures describing events, relations, or objects). 

9: A Downloadable Shallow Semantic Parser of Your Very Own! - The people over at Carnegie Mellon University have developed a program for your use that will identify the semantic roles of each of the words in any sentence you put into it (it uses the same technology as FrameNet!).

10: Brief History of Semantic Network Models This link contains a brief introduction to semantic network models.

 

This page was created by Cameron Smith. If there are any questions regarding the copyrighted material on this page, contact Cameron at Cameron.Smith@live.mercer.edu
