An Infinite Disorder: The Physics of Entropy
Entropy is an intrinsic property of matter and its constituents (atoms and molecules), and it has several definitions and interpretations. The concept was introduced by the German physicist Rudolf Clausius around 1850 (he coined the word "entropy" in 1865), and it has since played an important role in thermodynamics and in formulas for determining the stability of atomic systems and chemical reactions.
This peculiar phenomenon was elucidated in the 19th century from in-depth scientific research work on heat and energy. It serves as an integral principle of thermodynamics and it further led to the emergence of various mathematical principles and formulas related to probability.
Moreover, entropy also tells us about the possibility of various chemical processes and the reason behind their reversible and irreversible nature.
Different Definitions of Entropy
Entropy is often referred to as the loss of energy available to do work. It is also associated with the tendency toward disorder in a closed system.
Entropy is used for the quantitative analysis of the second law of thermodynamics.
However, a popular definition of entropy is that it is the measure of disorder, uncertainty, and randomness in a closed atomic or molecular system. For example, if a system has a high value of entropy then it is difficult to predict the state of its atoms.
The more energy a system loses to its surroundings, the less ordered and more random the system becomes. High entropy means high disorder and low available energy. We can use the analogy of a teenager's bedroom. If no energy or work is put in, the room quickly becomes messy and disordered and has a high level of entropy. If energy is put into the system, in the form of cleaning up and putting everything away, the room is returned to a state of order, or low entropy.
Entropy Formulas and Equations
There are numerous equations that relate entropy to different physical parameters. However, the entropy equation formulated for a heat engine is used most often, because it efficiently describes both the second law of thermodynamics and the state of equilibrium for a system.
According to physicist Rudolf Clausius (1822-1888), if the temperature (T) for a heat reservoir is above zero and heat (Q) is flowing into the same reservoir then the entropy change or increase for such a system is given as:
ΔS = Q/T
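As a quick numeric illustration of ΔS = Q/T (the heat and temperature figures below are assumed for illustration, not taken from the text):

```python
def entropy_change(q_joules, t_kelvin):
    """Entropy increase dS = Q/T when heat Q flows into a reservoir at temperature T."""
    return q_joules / t_kelvin

# Example: 600 J of heat flowing into a reservoir held at 300 K.
print(entropy_change(600.0, 300.0))  # 2.0 J/K
```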
Now consider two reservoirs at temperatures T1 and T2, with T1 higher than T2, and an amount of heat Q flowing from the hotter reservoir to the colder one. The hot reservoir loses entropy Q/T1 while the cold one gains Q/T2, so the total entropy change is:

ΔS = Q (1/T2 - 1/T1)

Since T1 is higher than T2, this change is positive.

If T1 = T2, then there is no flow of heat and the system is said to be in equilibrium.
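The sign of the two-reservoir result can be checked with a short sketch; the hot reservoir loses Q at T1 while the cold one gains Q at T2 (the numeric values are illustrative assumptions):

```python
def two_reservoir_entropy_change(q, t_hot, t_cold):
    """Net entropy change when heat q flows from the hot to the cold reservoir:
    the hot side contributes -q/t_hot, the cold side +q/t_cold."""
    return q / t_cold - q / t_hot

# 1000 J flowing from a 500 K reservoir to a 250 K reservoir:
print(two_reservoir_entropy_change(1000.0, 500.0, 250.0))  # 2.0 J/K
# Equal temperatures: no heat flow, no net entropy change (equilibrium).
print(two_reservoir_entropy_change(1000.0, 400.0, 400.0))  # 0.0 J/K
```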
Now, if there is a heat engine taking Q1 amount of heat from one reservoir and Q2 amount of heat from the other, then according to the law of conservation of energy, the work done (W) by the engine in one cycle would be:
W = Q1 - Q2
For such a system, the entropy change per cycle is calculated as:
ΔS = (Q2/T2- Q1/T1)
Maximum work is done when Q2 is very small compared to Q1, but Q2 cannot be zero, because the entropy change would then be negative. According to the second law of thermodynamics, the smallest possible value of the entropy change (ΔS) is zero:
ΔS = 0
Then, (Q2/Q1)min = T2/T1
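Putting the heat-engine relations together, the smallest allowed waste heat is Q2 = Q1 × T2/T1, and the maximum work is W = Q1 - Q2. A minimal sketch with assumed reservoir temperatures:

```python
def min_waste_heat(q1, t_hot, t_cold):
    """Smallest Q2 allowed by dS = Q2/T2 - Q1/T1 >= 0, i.e. Q2 = Q1 * T2/T1."""
    return q1 * t_cold / t_hot

def max_work(q1, t_hot, t_cold):
    """Maximum work W = Q1 - Q2_min; the ratio W/Q1 is the Carnot efficiency 1 - T2/T1."""
    return q1 - min_waste_heat(q1, t_hot, t_cold)

# An engine drawing 1000 J from a 600 K reservoir and dumping heat at 300 K:
print(min_waste_heat(1000.0, 600.0, 300.0))  # 500.0 J
print(max_work(1000.0, 600.0, 300.0))        # 500.0 J, i.e. efficiency 0.5
```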
When entropy change is zero, a reversible process takes place, because in such a case even the smallest change is sufficient to run the heat engine backward.
To evaluate the entropy change for a system such as a gas cylinder equipped with a movable piston, suppose the gas takes in an amount of heat dQ at a given temperature (T). Pushing against an opposing restraining pressure (P), the gas in the cylinder expands by a volume dV in a reversible manner. Here, the maximum work is:
dW = PdV
If the change in internal energy (U) of the system during this process is dU, then:
dQ = dU + PdV
And entropy change for the reservoir would be:
dS(res) = −dQ/T
Now, the entropy change of the system exactly balances that of the reservoir:
dS(sys) = (dU + PdV)/T
dS(sys) = dQ/T
dS(sys) + dS(res) = 0.
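For the special case of an ideal gas expanding isothermally (dU = 0, so dQ = PdV), integrating dS = dQ/T with the ideal-gas law P = nRT/V gives ΔS = nR ln(V2/V1). A sketch under those assumptions:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def isothermal_entropy_change(n_moles, v_initial, v_final):
    """dS = n R ln(V2/V1) for a reversible isothermal ideal-gas expansion."""
    return n_moles * R * math.log(v_final / v_initial)

# One mole of gas doubling its volume:
print(isothermal_entropy_change(1.0, 1.0, 2.0))  # about 5.76 J/K
```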
The value of work done is always less than the maximum possible work in any practical process because there are always some unavoidable energy losses due to friction and other reasons.
In the case of a gas expanding freely into a vacuum, the entropy change obeys the inequality

dS(sys) = (dU + PdV)/T ≥ dQ'/T

Here, dQ' = amount of heat actually absorbed
dQ = maximum possible amount of heat
When dQ' = dQ, the process is reversible; when dQ' < dQ, as in free expansion, the process is irreversible.
These entropy equations show that the entropy of a system depends on both the amount of material it contains and its current state.
The connection between thermodynamics and statistical mechanics is enshrined in the formula given by physicist Ludwig Boltzmann (1844-1906). If a very large system is in thermodynamic equilibrium and Ω denotes the total number of microstates accessible to it, then:
S = k ln Ω
Here, k = Boltzmann constant
k = 1.38 × 10⁻²³ J K⁻¹ (that is, kg m² s⁻² K⁻¹)
Entropy is denoted by 'S' and its SI unit is joules per kelvin (J K⁻¹), i.e. kg m² s⁻² K⁻¹.
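Boltzmann's formula can be evaluated directly; the two-state "coin" system below is a standard toy example, assumed here for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """S = k ln(omega), where omega is the number of accessible microstates."""
    return K_B * math.log(num_microstates)

# A toy system of 100 two-state particles has 2**100 microstates:
print(boltzmann_entropy(2**100))  # roughly 9.6e-22 J/K
# A single microstate means perfect order: S = k ln(1) = 0.
print(boltzmann_entropy(1))  # 0.0 J/K
```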
Enthalpy vs Entropy
Students often confuse enthalpy and entropy, which are two distinct thermodynamic quantities. The basic difference is that entropy (S) measures the degree of randomness or disorder within a system, whereas enthalpy (H) is the total heat content of a system at a given pressure. At constant pressure, the enthalpy change is:
ΔH = ΔE + PΔV
Here, E is the internal energy.
For a reversible process at constant temperature, such as a phase change (melting or boiling), the enthalpy change is related to the entropy change by the absolute temperature (T):

ΔH = T ΔS
The SI unit of enthalpy is the joule (J); molar enthalpy is expressed in J mol⁻¹.
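The relation ΔH = TΔS at a phase transition can be checked with the melting of ice; the latent-heat figure below (about 6010 J/mol at 273.15 K) is a commonly quoted value, assumed here for illustration:

```python
def entropy_of_transition(delta_h, t_kelvin):
    """dS = dH / T for a reversible phase change at constant temperature."""
    return delta_h / t_kelvin

# Melting ice: molar enthalpy of fusion ~6010 J/mol at 273.15 K.
print(entropy_of_transition(6010.0, 273.15))  # about 22 J/(mol K)
```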
Some key differences between enthalpy and entropy: enthalpy is a measure of a system's heat content, a form of energy, while entropy is a property describing the system's disorder. Any system tends to favor disorder in its state or molecular structure, so the tendency of entropy to increase is not limited by special conditions; enthalpy values, by contrast, are usually compared under standard conditions.
In the context of macroscopic systems, both entropy and enthalpy are treated as fundamental quantities, but in statistical mechanics they can be derived from the behavior of a system's microscopic constituents.
Interesting Facts About Entropy
- Entropy is not limited to physics and chemistry; the concept has applications in sociology, psychology, business, philosophy, biology, climate science, information systems, and many other disciplines. This is mainly because randomness or chaos is an unavoidable part of almost every process.
- A cosy campfire, boiling water, melting ice, cooking popcorn, and brewing tea — all these activities are a result of entropy, and even the universe is said to be tending towards higher entropy.
- Human-made things such as machines, furniture, and devices are less entropic than natural objects, because they are built to stay or function in an optimum, concentrated, and organized manner.
- Recent research suggests that a clock's accuracy is tied to its entropy production: the more entropy a clock generates, the more accurately it can keep time.
- The concept of entropy also appears in discussions of human physiology and behavior; depending on how you respond to its different levels, disorder can affect your health, relationships, finances, and many other aspects of your life, positively or negatively.