Semantics

Three disciplines are concerned with the systematic study of meaning in itself: psychology, philosophy and linguistics. Semantics is the study of meaning in language. It can be applied to entire texts or to single words. For example, “destination” and “last stop” technically mean the same thing, but students of semantics analyze their subtle shades of meaning.

The word semantic first appeared in English in 1894. It comes from the French sémantique, “the psychology of language,” which in turn derives from the Greek semantikos, “significant,” from semainein, “to show by sign, signify, point out, indicate by a sign.”

Syntax describes the rules by which words can be combined into sentences, while semantics describes what they mean.

In English, there are 44 phonemes, or distinct sounds, that make up the language. They’re divided into 19 consonants, 7 digraphs, 5 ‘r-controlled’ sounds, 5 long vowels, 5 short vowels, 2 ‘oo’ sounds, and 2 diphthongs.

Semantic Categories

Semantics can be broken down into two areas: lexical and logical. Logical semantics is the study of reference (the symbolic relationship between language and real-world objects) and implication (the relationship between two sentences). Semantics as a whole is often divided into the following three subcategories:

  • Formal semantics is the study of grammatical meaning in natural language. In other words, it aims to define the meaning of words and phrases based on their grammatical structure.
  • Conceptual semantics is the study of words at their core. It focuses on establishing universal definitions for words before they are taken into context.
  • Lexical semantics is the study of word meaning. It assigns meaning to words based on their relationships to other words in the sentence as well as their compositional structure. This can include the study of individual nouns, verbs, adjectives, prefixes, root words, suffixes, or longer phrases and idioms.

Conceptual semantics opens the door to a conversation on connotation and denotation. Denotation is the standard definition of a word, while connotation deals with the emotion a word evokes. Connotation comes from the manner in which you interpret a word’s or sentence’s meaning. As such, semantics and connotation are deeply entwined. For a deeper dive, look at examples and exercises on connotative words.

At its core, we think of semantics as the “magic” that happens when people communicate. And, most importantly, when they understand each other. This magic is actually a well-balanced combination of:

  • understanding words and phrases;
  • having general knowledge;
  • and using real-world experience.

For example, to understand a work of art, you combine its objective representation with your knowledge of the world. When you consider words in context, you can understand the meaning and the message.

Semantic Fields

A semantic field is a set of words (lexemes) that are related in meaning: they cover a certain conceptual domain and bear certain specifiable relations to one another. Take, for example, the conceptual domain of cooking, which in English includes the lexemes boil, bake, fry, roast, etc.

Within this framework, seven types of meaning are commonly distinguished: conceptual meaning, connotative meaning, collocative meaning, affective meaning, social meaning, reflected meaning, and thematic meaning.

Some examples of semantic fields include colors, emotions, weather, food, activities, and animals. Words or expressions within these fields share a common theme and are related in meaning. Writers often use this technique to keep a certain image persistent in their readers’ minds.

Relations within a semantic field are commonly classified into three major categories: hyponymy, antonymy, and synonymy.
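
These relations can be explored programmatically. Below is a minimal sketch using NLTK’s WordNet interface; it assumes NLTK is installed and the WordNet corpus has been downloaded, and the example words are arbitrary.

    # A minimal sketch of synonymy, antonymy, and hyponymy with NLTK's WordNet.
    # Assumes the corpus is available:  import nltk; nltk.download("wordnet")
    from nltk.corpus import wordnet as wn

    # Synonymy: lemmas that share a synset count as synonyms.
    synonyms = {lemma.name() for syn in wn.synsets("happy") for lemma in syn.lemmas()}
    print("synonyms of 'happy':", synonyms)

    # Antonymy: antonym links are attached to individual lemmas.
    antonyms = {ant.name()
                for syn in wn.synsets("happy")
                for lemma in syn.lemmas()
                for ant in lemma.antonyms()}
    print("antonyms of 'happy':", antonyms)

    # Hyponymy: more specific terms grouped under a broader concept.
    dog = wn.synset("dog.n.01")
    print("some hyponyms of 'dog':", [h.lemma_names()[0] for h in dog.hyponyms()][:5])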

The term semantics itself can refer to subfields of several distinct disciplines, including philosophy, linguistics, and computer science.

Rhetoric: the content of the text (e.g., instructions); the message directed to the audience.
Semantics: the way the content is visualized (e.g., font style). This also refers to context: where the text will be read, and how that context influences the rhetoric (the content).

Related: English Through The Ages

Semantics in Poetry

Linguistic theories should ideally be able to account for creative uses of language. At the semantic level, what distinguishes poetry from other uses of language may be its ability to trace conceptual patterns that do not belong to everyday discourse but are latent in our shared language structure.

Shakespeare shows that Romeo’s love is fickle and that when he first sees Juliet, he bases his feelings purely on her appearance. He conveys this through a semantic field of beauty and wealth, for example ‘a rich jewel’ and ‘too rich for use, for earth too dear’.

Semantic Therapy

Semantic Therapy is a form of psychotherapy in which clients are trained to examine undesired word habits and distorted ideas so that they can think more clearly and critically about their aims, values, and relationships.

Natural Languages and Artificial Languages

The study of semantics is the study of how language and its different facets create meaning. The languages analyzed in semantics can include natural languages (ones that occur and evolve naturally, such as English, Farsi, or French) and artificial languages, such as those used in computer programming (Java, Python, etc.).

Semantics in Technology

Semantic technology is a way of processing content that relies on a variety of linguistic techniques.

Compared to traditional technologies that process content as data, semantic technology focuses not only on the data itself but also on the relationships between pieces of data. When it comes to analyzing text, this network of relations enables both high precision and high recall in search, automatic categorization, and tagging.

Because it can understand the meaning of words in context, much as humans do, semantic technology can manage a huge knowledge base, integrate information and data, and allow organizations to find the information they need to make decisions.
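
As a loose illustration of what “relationships between pieces of data” can look like, here is a minimal sketch of a toy knowledge base stored as subject-predicate-object triples; the facts and the query helper are invented for the example and do not represent any particular product’s API.

    # A toy knowledge base of subject-predicate-object triples (invented data).
    triples = [
        ("aspirin", "treats", "headache"),
        ("aspirin", "is_a", "drug"),
        ("ibuprofen", "treats", "headache"),
        ("headache", "is_a", "symptom"),
    ]

    def query(subject=None, predicate=None, obj=None):
        """Return every triple matching the fields that were given."""
        return [t for t in triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

    # The relationships, not just the raw records, drive the answer.
    print(query(predicate="treats", obj="headache"))
    # [('aspirin', 'treats', 'headache'), ('ibuprofen', 'treats', 'headache')]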

Examples of Semantics in Technology

Text Mining

Today, text mining is making cybercrime prevention easier for enterprise organizations as well as law enforcement by establishing more context around the intelligence they are being fed.

As for knowledge management, consider the healthcare industry, where organizations have tremendous amounts of information (e.g., decades of research, clinical patient data) that could be of value to their largest profit center: new product development.

Entity Extraction

Entity extraction, also known as entity name extraction or named entity recognition (NER), is an information extraction technique that identifies key elements from text then classifies them into predefined categories. This makes unstructured data machine-readable (or structured) and available for standard natural language processing (NLP) actions such as retrieving information, extracting facts and answering questions. Entity extraction technologies must address a number of language issues to correctly identify and classify entities.

While it is easy for a human to distinguish between different types of names (e.g., person, place, organization, product), the ambiguities of language make this an especially complex task for machines. One of the primary challenges for machines is part-of-speech tagging: the process of breaking down sentences into their proper parts of speech (e.g., nouns, verbs, adjectives, adverbs) based on word definitions and context. With this information, machines can identify noun phrases, which in turn help to identify the primary entities. Key to success, though, is context. Related techniques include fact extraction and entity linking.
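
The pipeline just described (part-of-speech tags, then noun phrases, then named entities) can be sketched with an off-the-shelf NLP library. Here is a minimal example using spaCy; it assumes the en_core_web_sm model has been installed, and the sample sentence is invented.

    # Minimal entity-extraction sketch with spaCy.
    # Assumes: pip install spacy  and  python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Apple opened a new office in Berlin in March 2023.")

    # Step 1: part-of-speech tags, based on word definitions and context.
    print([(token.text, token.pos_) for token in doc])

    # Step 2: noun phrases, which help locate candidate entities.
    print([chunk.text for chunk in doc.noun_chunks])

    # Step 3: named entities, classified into predefined categories (ORG, GPE, DATE, ...).
    print([(ent.text, ent.label_) for ent in doc.ents])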

Concept Analysis

Formal concept analysis finds practical application in fields including data mining, text mining, machine learning, knowledge management, semantic web, software development, chemistry and biology.
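
In brief, formal concept analysis starts from a table of objects and the attributes they have, and extracts “concepts”: pairs of object sets and attribute sets that exactly determine each other. A brute-force sketch on a tiny, invented context:

    # Brute-force formal concept analysis on a tiny, invented context.
    from itertools import combinations

    # Objects and the attributes they possess.
    context = {
        "duck":  {"flies", "swims", "lays_eggs"},
        "eagle": {"flies", "lays_eggs"},
        "trout": {"swims", "lays_eggs"},
    }
    attributes = set().union(*context.values())

    def common_objects(attrs):
        """Objects that have every attribute in attrs."""
        return {o for o, a in context.items() if attrs <= a}

    def common_attributes(objs):
        """Attributes shared by every object in objs."""
        return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

    # A formal concept is a pair (objects, attributes) that determine each other.
    concepts = set()
    for r in range(len(attributes) + 1):
        for attrs in combinations(sorted(attributes), r):
            objs = common_objects(set(attrs))
            concepts.add((frozenset(objs), frozenset(common_attributes(objs))))

    for objs, attrs in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(objs), "<->", sorted(attrs))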

Categorization

The document classification (or categorization) problem consists of assigning one or more category “labels” to documents, depending on their content (or other information). These labels belong to a predefined set of categories, organized as a list or a tree-shaped structure (known as a taxonomy), or more generally a classification scheme.
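
As a concrete illustration, here is a minimal document categorization sketch using scikit-learn, with TF-IDF features feeding a naive Bayes classifier; the category labels and training sentences are invented placeholders.

    # Minimal document categorization sketch (scikit-learn).
    # Assumes: pip install scikit-learn
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Tiny invented training set: documents and their category labels.
    docs = [
        "patient showed improvement after the new treatment",
        "clinical trial results for the drug were promising",
        "quarterly revenue exceeded analyst expectations",
        "the company reported strong earnings growth",
    ]
    labels = ["healthcare", "healthcare", "finance", "finance"]

    # TF-IDF features + naive Bayes classifier, chained into one pipeline.
    model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    model.fit(docs, labels)

    print(model.predict(["the new drug reduced symptoms in most patients"]))
    # ['healthcare'] on this toy data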

Natural Language Processing

At the intersection of language and technology lies natural language processing (NLP): the process of breaking down language into a format that is understandable and useful for both computer systems and humans. With the increased use of AI, NLP has seen a similar surge in popularity. Applications include email filters, language translation, search results, smart assistants, customized customer experience, and text analytics.

Related: Semantic Analysis

Normalization

Normalization is a technique often applied as part of data preparation for machine learning. The goal of normalization is to change the values of numeric columns in the dataset to use a common scale, without distorting differences in the ranges of values or losing information.
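
A minimal sketch of one common form of normalization, min-max scaling, which rescales each numeric column to the [0, 1] range while preserving the relative spacing of values; the sample data is invented.

    # Min-max normalization: rescale each numeric column to a common [0, 1] range.
    # Assumes: pip install numpy
    import numpy as np

    # Invented dataset: rows are samples, columns are features on very different scales.
    data = np.array([
        [180.0, 30_000.0],
        [165.0, 55_000.0],
        [172.0, 42_000.0],
    ])

    col_min = data.min(axis=0)
    col_max = data.max(axis=0)

    # (x - min) / (max - min), applied per column; ranks and relative spacing are preserved.
    normalized = (data - col_min) / (col_max - col_min)
    print(normalized)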

Sentiment Analysis

Sentiment analysis is a way to better understand the feeling embedded in written communication. It builds on Natural Language Understanding (NLU), which comprehends language: to truly understand a message, we must know the definitions of words and the structure of sentences, along with syntax, sentiment, and intent.

For the enterprise, sentiment analysis is the Holy Grail. Sentiment analysis (SA) takes NLU one step further: it identifies whether a message is positive, negative, or neutral. Together, NLU and SA generate data that tell the story businesses and enterprises are dying to understand: what customers think and feel about their brand, product, or service. To put it another way, “Natural Language Understanding and Sentiment Analysis are brilliant ways to interpret what other people are feeling via their language.”
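
As a small illustration, here is a minimal sentiment analysis sketch using NLTK’s VADER analyzer; it assumes the vader_lexicon resource has been downloaded, and the sample messages are invented.

    # Minimal sentiment analysis sketch with NLTK's VADER.
    # Assumes the lexicon is available:  import nltk; nltk.download("vader_lexicon")
    from nltk.sentiment import SentimentIntensityAnalyzer

    analyzer = SentimentIntensityAnalyzer()

    messages = [
        "I absolutely love this product, it works perfectly!",
        "The delivery was late and the support was useless.",
        "The package arrived on Tuesday.",
    ]

    for msg in messages:
        scores = analyzer.polarity_scores(msg)
        # 'compound' ranges from -1 (most negative) to +1 (most positive).
        label = ("positive" if scores["compound"] > 0.05
                 else "negative" if scores["compound"] < -0.05
                 else "neutral")
        print(f"{label:8s} {scores['compound']:+.2f}  {msg}")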

Related: Web Design Best Practices

Semantic Satiation

In psychology, satiation occurs when a person has been exposed to a reinforcer continuously until the item or activity loses its motivating effect on their behavior. When the repeated stimulus is a word and what fades is its meaning, the phenomenon is called semantic satiation.

It is the linguistic equivalent of having a bunch of leftover birthday cake and eating a slice (or two) each day: when someone brings donuts to your office meeting, you easily turn them down because you’ve eaten cake every other day this week.

Satiation is the opposite of deprivation, which occurs when an individual has been without a desired item or activity for an extended period of time.

Semantic satiation happens when you repeat a word over and over again, saying it until it loses meaning. You start to question whether that word is even the right word. Are you saying it right? Is it spelled right? If you speak multiple languages or have been diagnosed with dyslexia, you may have experienced this.

This might happen with the simplest words: quiet, play, night. Once you get your mind off of the word for a few minutes, you automatically remember its meaning and spelling as if nothing happened. 


