Entropy is a notion that comes from a Greek word that can be translated as "return" or "transformation" (understood in a figurative sense).

In the nineteenth century, Clausius coined the concept in the field of physics to refer to a measure of the disorder observable in the molecules of a gas. From then on, the concept would be used with different meanings in multiple sciences, such as physics, chemistry, computing, mathematics and linguistics.

Some definitions are:

Entropy can be the thermodynamic physical quantity that makes it possible to measure the unusable part of the energy contained in a system. This means that this part of the energy cannot be used to produce work.
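
To make this concrete, the classical Clausius relation (a standard textbook formulation, added here as background) defines the change in entropy of a system in terms of the heat it exchanges reversibly at temperature T:

$$\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}$$

For a system in contact with surroundings at temperature $T_0$, the energy that cannot be converted into work is, at a minimum, $T_0 \, \Delta S$.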

Entropy is also understood as the measure of the disorder of a system. In this sense, it is associated with a degree of homogeneity.

The entropy of formation of a chemical compound is established by measuring the entropy that corresponds to each of its constituent elements. The greater the entropy of formation, the more favorable its formation will be.
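
The reason a larger entropy change favors a reaction can be stated through a standard thermodynamic relation (included here as background, not part of the original text), the Gibbs free energy:

$$\Delta G = \Delta H - T\,\Delta S$$

A process is thermodynamically favorable when $\Delta G < 0$, so at a given temperature a larger positive $\Delta S$ pushes $\Delta G$ downward.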

In information theory, entropy is the measure of the uncertainty that exists in the face of a set of messages (of which only one will be received). It is a measure of the information needed to reduce or eliminate that uncertainty.
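
Formally, this is Shannon's definition (a standard formula, included here for reference): for a source that emits messages $x_i$ with probabilities $p_i$, the entropy in bits is

$$H = -\sum_{i} p_i \log_2 p_i$$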

Another way to understand entropy is as the average amount of information contained in the transmitted symbols. Words like "the" or "that" are the most frequent symbols in a text but, nevertheless, they are the ones that contribute the least information. The message will carry the most relevant information and have maximum entropy when all symbols are equally likely.
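
A minimal sketch in Python (my own illustration of the formula above, not taken from the original text; the helper name shannon_entropy is arbitrary) makes the point: a text dominated by one frequent symbol has low entropy, while a text in which all symbols are equally likely reaches the maximum.

    from collections import Counter
    from math import log2

    def shannon_entropy(symbols):
        # Average information per symbol, in bits: H = -sum(p * log2(p)).
        counts = Counter(symbols)
        total = len(symbols)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    skewed = "aaaaaaab"    # one symbol dominates, as frequent words do
    uniform = "abcdefgh"   # all eight symbols equally likely

    print(shannon_entropy(skewed))   # ~0.54 bits per symbol
    print(shannon_entropy(uniform))  # 3.0 bits per symbol, the maximum for 8 symbols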

Entropy in the field of linguistics

The way in which information is organized and disseminated in a discourse is one of the most relevant topics for linguistic research, and one of the most amenable to it. Thanks to entropy, a deeper analysis of communication can be carried out.

In the case of written communication, the problem is simple to analyze (the basic units, the letters, are well defined): it is possible to decode a message accurately and to grasp both its literal and its figurative meaning. In oral language, however, things change a bit and some complications appear.

It is not easy to determine the fundamental elements of the code in oral discourse: words sound different depending on who pronounces them and, likewise, can have different meanings. It is therefore not enough to classify them into vowel and consonant phonemes, because this would not reveal how the information is organized; for example, if the vowel phonemes are suppressed, it is not possible to understand the message.

According to a study conducted at the University of Wisconsin-Madison, a good way to isolate and understand the oral code is through the spectral decomposition of the sound signals. This technique seeks to understand how the cochlea filters and analyzes the signal. The cochlea is the part of our ear whose function is to transform sounds into electrical signals and send them on to the brain.

To carry out this experiment, a unit of measurement known as "cochlea-scaled spectral entropy" (CSE) was used, which makes it possible to establish connections between a signal and the one that precedes it, that is, to determine how well a signal can be predicted from the previous one.
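
As a rough sketch of the idea (my own simplification in Python, not the study's actual metric; in particular, the cochlear-scale filtering of the signal is omitted), one can slice a signal into short frames, compute each frame's magnitude spectrum, and measure how much each frame differs from the one before it. Frames that differ more are harder to predict and therefore carry more information.

    import numpy as np

    def cse_like_profile(signal, frame_len=512, hop=256):
        # Distance between the magnitude spectra of consecutive frames.
        # Larger values mean the new frame is harder to predict from the
        # previous one, i.e. it carries more information. (The real CSE
        # measure filters the signal on a cochlear frequency scale first;
        # that step is omitted in this sketch.)
        frames = [signal[i:i + frame_len]
                  for i in range(0, len(signal) - frame_len + 1, hop)]
        spectra = [np.abs(np.fft.rfft(f * np.hanning(frame_len))) for f in frames]
        return [float(np.linalg.norm(b - a)) for a, b in zip(spectra, spectra[1:])]

    # A steady tone is highly predictable; an abrupt frequency jump is not.
    t = np.arange(48000) / 48000.0
    tone = np.sin(2 * np.pi * 440 * t)
    jump = np.concatenate([tone[:24000], np.sin(2 * np.pi * 880 * t[:24000])])

    print(max(cse_like_profile(tone)))  # small: consecutive spectra are alike
    print(max(cse_like_profile(jump)))  # spikes where the frequency changes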

The results showed that the more similar two signals are, the easier it is to predict the second; this means that the information we obtain from the second one is almost nil. Likewise, the more they differ from each other, the greater the information provided by the second signal, so eliminating it has considerable consequences for the understanding of the discourse.
