Entropy

Physics is a very broad field, a natural science responsible for studying energy, matter, and space by means of fundamental laws. Among its important concepts is entropy, a term used to refer to the degree of disorder that exists in a thermodynamic system. It is worth mentioning that it is a central concept in the second law of thermodynamics.

What is entropy?

Entropy is a measure of the loss of order in a given system. It is a property of the state of matter in which only the initial state and the final state matter, regardless of the path taken to go from one to the other.

  • Meaning
  • Entropy characteristics
  • History
  • Entropy in thermodynamics
  • Types
  • Formula
  • Units
  • Reversibility
  • Entropy in chemistry
  • Entropy in other areas
  • Applications
  • Importance
  • Examples in everyday life

Meaning

Entropy is a word that comes from the Greek and refers to a physical and thermodynamic magnitude. It arose from the study of the mechanical action that produces heat or other forms of energy, and it makes it possible to establish the degree of order found within a system in a given situation with respect to the state or states that preceded it.

Properly speaking, it is a state function: an extensive physical magnitude that characterizes the equilibrium state of a system and describes how reversible the physical processes occurring in it are.

Entropy characteristics

Among the main characteristics of entropy, the following can be mentioned:

  • It increases when heat is supplied to the system, regardless of whether the temperature changes or not.
  • It decreases when heat is rejected by the system.
  • In reversible adiabatic processes, entropy remains constant.
  • In its statistical reading, it counts the number of configurations a system can present for a given property (see the sketch after this list).
  • It takes into account only the characteristics that are of interest.
  • It depends on how the elements of the system are distributed.
  • It can take different values as the description of the object changes.
  • It is represented by the letter S.
  • In classical thermodynamics it cannot be calculated in absolute terms; only changes in entropy are measured.
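
The counting idea in the list above can be made concrete with Boltzmann's relation S = kB · ln W, where W is the number of microscopic configurations compatible with the macroscopic state. The article only hints at this statistical view, so the following Python sketch is an illustration under that assumption, not a formula taken from the text:

    import math

    # Boltzmann's relation (statistical interpretation, assumed here):
    # S = k_B * ln(W), where W counts the microscopic configurations
    # compatible with the observed macroscopic state.
    K_B = 1.380649e-23  # Boltzmann constant, in J/K

    def boltzmann_entropy(n_configurations):
        return K_B * math.log(n_configurations)

    # More accessible configurations means higher entropy:
    print(boltzmann_entropy(10))     # ≈ 3.18e-23 J/K
    print(boltzmann_entropy(10**6))  # ≈ 1.91e-22 J/K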

History

The concept of entropy was born as a response to the observation that, in combustion reactions, a certain amount of energy was lost through dissipation or friction without producing useful work. In 1850, Rudolf Clausius gave the concept its thermodynamic formulation, postulating that in any irreversible process a small amount of thermal energy is gradually dissipated.

It was in this way that this scientist developed further ideas about the lost energy, naming the phenomenon entropy. Between 1890 and 1900, several physicists created statistical mechanics, a theory strongly influenced by the concept of entropy. As the years went by, the term was refined and developed, and today it is applied in many fields.

Entropy in thermodynamics

It is important to remember that thermodynamics is the branch of physics responsible for describing and relating the physical properties of systems through the study and analysis of the heat exchanges that occur in bodies exposed to heat. In thermodynamics, entropy is tied to the second law of thermodynamics, which establishes that heat cannot be transmitted spontaneously from a body at a certain temperature to another body at a higher temperature.

In thermodynamics, the entropy of a system is associated with its degree of disorder; for this reason it is established that the more disorder there is, the greater the entropy of the system. Based on the second law, entropy in thermodynamics is the measure of the restrictions on whether a given process can take place, and it also indicates the direction of the process.
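
This direction can be checked with a small calculation. The following Python sketch is an illustration (an assumed textbook case, not an example from the article): when heat Q flows from a hot reservoir to a cold one, the total entropy change Q/Tcold − Q/Thot is positive, which is why the reverse flow never happens spontaneously:

    # Entropy change when heat Q flows from a hot reservoir to a cold one.
    # Assumes ideal reservoirs whose temperatures stay fixed during the transfer.
    def transfer_entropy(q_joules, t_hot_kelvin, t_cold_kelvin):
        ds_hot = -q_joules / t_hot_kelvin   # the hot body loses heat
        ds_cold = q_joules / t_cold_kelvin  # the cold body gains heat
        return ds_hot + ds_cold             # total change, in J/K

    # 1000 J flowing from a 500 K body to a 300 K body:
    print(transfer_entropy(1000, 500, 300))  # ≈ +1.33 J/K, always > 0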

Types

There are two types of entropy, which are:

  • Negative: also known as syntropy, it is the entropy that a system exports or releases in order to keep its own entropy levels low. The term was developed by Erwin Schrödinger in 1943.
  • Positive: the entropy associated with greater molecular disorder; the greater the molecular disorder, the greater the entropy.

Formula

The entropy formula is as follows:

ΔS = Sf − Si, where ΔS is the increase in entropy, Sf the final entropy, and Si the initial entropy.
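
Because only the end states matter, computing ΔS for a real process means evaluating the state function at both ends. As an illustration (an assumed standard case, not one given in the article), for a substance heated at constant pressure with constant specific heat c, integrating dQ/T gives ΔS = m · c · ln(Tf / Ti):

    import math

    # Entropy change for heating at constant pressure with constant specific
    # heat c: delta_S = integral of dQ/T = m * c * ln(T_final / T_initial).
    def entropy_change(mass_kg, c_j_per_kg_k, t_initial_k, t_final_k):
        return mass_kg * c_j_per_kg_k * math.log(t_final_k / t_initial_k)

    # 1 kg of water (c ≈ 4186 J/(kg·K)) heated from 293 K to 373 K:
    print(entropy_change(1.0, 4186, 293, 373))  # ≈ +1010 J/K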

Units

The unit of entropy in the International System of Units is the joule per kelvin (J/K).

Reversibility

In order to bring a system back to its original state, work greater than the work the system produced must be applied. This causes a transfer of heat to the surroundings, producing an increase in global entropy.

Entropy in chemistry

In the field of chemistry, entropy refers to the disorder observed in the formation of a chemical compound. In this case, the minimum entropy corresponds to atoms arranged in a totally ordered substance forming a perfect crystalline structure.

Entropy in other areas

Entropy can be applied to other areas, for example:

  • Psychology: here it refers to social systems in which human beings show differences and similarities at the same time, differences that must be preserved in order to increase the system's possibilities.
  • Computer science: in computer science, entropy is the measure of the uncertainty present before a set of possible messages, of which only one will finally be received. It is a measure of information used to eliminate or reduce that uncertainty (see the sketch after this list).
  • Linguistics: in this field, the word refers to the way in which information is organized and distributed in a discourse, in order to build a deeper analysis of the communication process.
  • Information theory: here it is the degree of uncertainty that exists with respect to a given set of data.
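
The information-theoretic measure mentioned above is usually computed with Shannon's formula H = −Σ p · log2(p), summed over the probabilities of the possible messages. The article does not give the formula, so the following Python sketch is an assumption based on that standard definition:

    import math

    # Shannon entropy in bits: H = -sum(p * log2(p)) over the probabilities
    # of the possible messages. Uniform probabilities mean maximum uncertainty.
    def shannon_entropy(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: four equally likely messages
    print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ≈ 0.24 bits: outcome nearly certain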

Applications

Some applications of entropy are as follows:

  • The entropy formula can be used in all types of energy models, for example, in the combustion chambers of cars.
  • For studying the process of formation of substances, taking as a starting point the elements that constitute them.
  • In the study of black holes.
  • In the field of linguistics, mainly applied in semiotics.
  • It is also used in information theory to determine the degree of uncertainty that exists in a data set.
  • In topology, where topological entropy is related to the complexity of a dynamical system.

Importance

It is important because it is an ideal means of quantitatively measuring the disorder of a system. It is a key term mainly for the fields of physics and chemistry, and also for other sciences such as linguistics, mathematics, and computer science.

Examples in everyday life

Some examples of entropy that we can observe in everyday life are the following:

  • When a plate breaks: the intact plate can be seen as a completely ordered and balanced system. When it shatters into pieces, the event is a natural one that does not occur spontaneously in reverse.
  • Radioactive decay, because it is a process that cannot be reversed and causes the release of large amounts of energy.
  • A room that was originally tidy but later ends up in complete disarray.
