Entropy: The Invisible Force Governing the Death of the Universe
Everything changes, and some of us don’t always like that. But according to one view, the entropy of the universe, and of nature in general (that is, the degree of disorder or randomness in a system), may be what enabled the creation of life in the first place.
According to this view, when a group of atoms is driven by an external source of energy, like the Sun, and surrounded by a source of heat, like the atmosphere, it will gradually restructure itself in a way that increasingly dissipates more energy. From then on, under certain conditions, the matter will inexorably acquire the attributes associated with life.
However, entropy has also been associated with the heat death of the universe. Here’s everything you need to know about entropy in thermodynamics and how it affects the universe, and ultimately, us.
What is the entropy of the universe?
While they’re not the same thing in physics, it’s somewhat useful to think about chaos theory and how it relates to entropy, and eventually what effect entropy may have on the universe.
According to chaos theory, within the apparent randomness of chaotic, complex systems there are underlying patterns and interconnections. If you know the initial conditions precisely and uncover these underlying patterns, you can, in principle, predict the irregularities that will follow. In other words, chaos is not as chaotic and random as it may seem.
In its most basic form, entropy is defined as the measure of the thermal energy in a system, per unit temperature, that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, entropy is also a measure of the molecular disorder, or randomness, of a system.
Not only physics, but many disciplines have found this concept to be useful, including chemistry, biology, climate change, sociology, economics, information theory, and even business.
But let’s stick to physics, and specifically, to the fundamental laws of thermodynamics.
- The zeroth law of thermodynamics is the law of thermal equilibrium. It expresses that if two independent systems are in thermal equilibrium with a third system, then they’re in thermal equilibrium with each other as well. Meaning that if A = B and B = C, then A = C.
This is easily observable in real life. When you bring a cold glass of water close to a hot glass of water, they will exchange heat until they both attain thermal equilibrium with the temperature of the room.
- The first law of thermodynamics is the application of the law of conservation of energy to thermodynamic processes.
The law of conservation of energy postulates that energy can’t be created or destroyed, but only transformed or transferred. This is done via work and heat in the case of an isolated thermodynamic system. This is why the formula of the first law of thermodynamics is ΔU = Q − W, where ΔU is the change in the internal energy of the system, Q is the heat applied to it, and W is the work that the system performs on the environment.
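The bookkeeping in the first law is simple enough to sketch in a few lines of Python. The function name and the numbers below are hypothetical, chosen only for illustration:

```python
# A minimal sketch of the first law of thermodynamics, ΔU = Q − W.
# Sign convention as in the article: Q is heat added to the system,
# W is work the system performs on its environment.

def internal_energy_change(heat_in_j: float, work_done_by_system_j: float) -> float:
    """Return ΔU in joules: heat added minus work done by the system."""
    return heat_in_j - work_done_by_system_j

# Illustrative example: a gas absorbs 500 J of heat and does 200 J of work
# on its surroundings, so its internal energy rises by the difference.
delta_u = internal_energy_change(heat_in_j=500.0, work_done_by_system_j=200.0)
print(delta_u)  # 300.0
```

Note that the signs matter: if the system does more work than the heat it receives, ΔU is negative and the internal energy falls.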
- The second law of thermodynamics is also known as the law of entropy because it introduces that concept as the level of disorder of the system. It’s represented with the letter S.
There is a certain amount of energy in every process that can’t be converted into work. Instead, it becomes heat. The heat increases the disorder, or entropy, of an isolated system. And because there is always some degree of unusable energy that will turn into heat, the second law of thermodynamics establishes that there will always be an increase in entropy in isolated systems.
The change in entropy ΔS is equal to the heat transferred ΔQ divided by the absolute temperature T: ΔS = ΔQ / T.
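A short sketch, with illustrative numbers, shows why moving heat from hot to cold always raises total entropy under the formula ΔS = ΔQ / T: the cold body gains more entropy than the hot body loses, because the same ΔQ is divided by a smaller T.

```python
# A minimal sketch of ΔS = ΔQ / T for heat transferred at an
# (approximately) constant absolute temperature. Values are illustrative.

def entropy_change(heat_j: float, temperature_k: float) -> float:
    """Return ΔS in J/K for heat ΔQ transferred at temperature T (kelvin)."""
    if temperature_k <= 0:
        raise ValueError("Temperature must be positive (kelvin).")
    return heat_j / temperature_k

# Move 1000 J of heat from a hot body (400 K) to a cold body (300 K):
ds_hot = entropy_change(-1000.0, 400.0)   # hot body loses entropy: -2.5 J/K
ds_cold = entropy_change(1000.0, 300.0)   # cold body gains ~3.33 J/K
print(ds_hot + ds_cold > 0)  # True: total entropy increases
```

This is the second law in miniature: each spontaneous heat flow from hot to cold leaves the combined system with more entropy than before.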
- The third law of thermodynamics states that the entropy of a system approaches a constant value as the temperature approaches absolute zero. If a perfect crystal could reach absolute zero (the lowest limit on the thermodynamic temperature scale), its entropy would also be zero.
Who introduced the concept of entropy?
In spite of its applications across several disciplines, the concept of entropy has its origins in physics. While studying the conservation of mechanical energy in his work Fundamental Principles of Equilibrium and Movement (1803), French mathematician Lazare Carnot proposed that the accelerations and shocks of the moving parts in a machine represent “losses of the moment of activity”. Carnot’s “moment of activity” is comparable to the current concept of work in thermodynamics. By extension, in any natural process, there exists an inherent tendency towards the dissipation of useful energy.
Other scientists researched this “loss”, and during the second half of the 19th century, they pointed out that it wasn’t a real loss but a transformation. This is the concept of the conservation of energy that paved the way for the first law of thermodynamics. Scientists like James Joule, Julius Mayer, Hermann Helmholtz, and William Thomson (also known as Lord Kelvin) all produced work exploring this concept.
But the term entropy came out of the work of German physicist Rudolf Clausius, who is now considered one of the founders of thermodynamics.
In the 1850s, he presented a statement of the Second Law of Thermodynamics in reference to a heat pump. Clausius' statement underlined the fact that it is impossible to construct a device that operates on a cycle and produces no other effect than the transfer of heat from a cooler body to a hotter body.
In the 1860s, he coined the word entropy after the Greek word for transformation, or turning point, to refer to the irreversible loss of heat. He described it as a function of state in a thermodynamic cycle, specifically Carnot’s cycle, a theoretical cycle conceived by Lazare Carnot’s son, Sadi Carnot.
In the 1870s, Austrian physicist and philosopher Ludwig Boltzmann reimagined and adapted the definition of entropy for statistical mechanics. Closer to what the term entails now, this describes entropy as a measure of all the possible microstates of a system whose macroscopic state has been observed. In how many ways can the system’s microscopic components be rearranged while leaving its observable properties unchanged? That question encompasses the concept of disorder, which is the basis of one concept of entropy.
This is written with the formula S = k ln Ω, where S is entropy, k is the Boltzmann constant (about 1.380649 × 10⁻²³ J/K, or m² kg s⁻² K⁻¹), and Ω is the number of possible microstates.
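Boltzmann's formula is easy to evaluate directly. The sketch below uses the exact SI value of the Boltzmann constant; the Ω values are illustrative, not measurements of any real system:

```python
import math

# A minimal sketch of Boltzmann's formula S = k ln Ω.
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def boltzmann_entropy(omega: float) -> float:
    """Return S = k ln Ω for a system with Ω possible microstates."""
    return K_B * math.log(omega)

# A system with only one microstate has zero entropy (ln 1 = 0),
# and doubling the number of microstates adds exactly k·ln 2:
print(boltzmann_entropy(1.0))                              # 0.0
print(boltzmann_entropy(2e20) - boltzmann_entropy(1e20))   # k·ln 2 ≈ 9.57e-24 J/K
```

Because the entropy depends on the logarithm of Ω, even astronomically large microstate counts yield modest entropy values in everyday units, which is why k is so tiny.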
Is the universe in a state of entropy?
Back in the 19th century, Rudolf Clausius also deduced that the energy of the universe is constant and that its entropy tends to increase over time.
According to the most widely accepted model for the beginning of the universe, all of space and time were created by the Big Bang, an event that took place an estimated 13.8 billion years ago. The theory postulates that before this, the universe was a very tiny, very hot, dense point, similar to a singularity, from which the entirety of everything we see around us was created.
Cosmologists believe this point then 'exploded' outward, with space itself expanding faster than the speed of light, spawning all the particles, antiparticles, and radiation in the universe.
That process generated a great deal of entropy. And given the continuous increase in entropy that has occurred ever since, we can infer that the entropy of the universe must be far greater now. In fact, the entropy of the universe today has been calculated to be about a quadrillion times as large as it was at the Big Bang.
To some cosmologists, this can be explained with the idea of entropic time. Because the second law of thermodynamics states the entropy of an isolated system can increase, but not decrease, entropy requires a particular direction for time, sometimes called an arrow of time. Thus, the measurement of entropy is a way of distinguishing the past from the future.
Why is the entropy of the universe increasing?
The entropy of the universe will continue increasing, but what exactly is driving this increase? Leftover radiation from the Big Bang, nuclear fusion in stars… Many processes keep energy flowing, but black holes are thought to be the main contributors, because of the tremendous number of particles they contain.
Black holes have an immense concentration of mass that gives them an exceptionally strong gravitational field, and their contents can be arranged in an enormous number of microstates, which makes their entropy huge. In this context, Stephen Hawking theorized that black holes emit thermal radiation near their event horizons. This Hawking radiation may lead to the loss of mass and eventual evaporation of black holes.
But remember that black holes still follow the second law of thermodynamics, which says that entropy will always tend to increase. So they will gather more mass and merge with other black holes, turning into supermassive black holes. And when they eventually decay, the Hawking radiation they produce will have at least as many possible state arrangements as the black holes themselves did. According to this view, the early universe was low in entropy because it had fewer, or much smaller, black holes.
Is there a limit to entropy in the universe?
As much as we talk about the entropy’s tendency to increase, the laws of thermodynamics also imply a state of maximum entropy.
In everyday life, we can observe this when our coffee goes cold in its cup. When the coffee reaches room temperature, it has found thermal equilibrium with the environment. The boiling water used to make the coffee contained fast-moving, energetic molecules, but these slowed down, eventually reaching the maximum entropy for that system.
Thermodynamic equilibrium is a stationary state that is not reversible without “help” - an input of energy. The coffee would have to be re-heated by adding energy, such as placing it on a stove or in a microwave. However, we don’t have any way to input energy to the universe once it has reached thermal equilibrium. Eventually, the same values will be adopted everywhere.
With a constant, stable temperature all around the cosmos, there would be no more energy left to perform work, as the entropy would have reached its maximum level. All of these postulates constitute the theory of the heat death of the universe. This theory is also known as the Big Freeze because, in this scenario, the entropy of the Universe will continuously increase until it reaches a maximum value. On this fateful day, all the heat in our universe will be completely evenly distributed, allowing no room for usable energy.
However, this is just one theory on the ultimate fate of the universe. Other theories suggest that gravity, aided by the mass of dark matter, could eventually halt the expansion and cause the universe to contract and heat up again, leading to something similar to a new Big Bang.
Can the entropy of the universe decrease?
It's safe to say that entropy has decreased locally in the universe at some points, because there is some order in it. Gravitational interactions can turn nebulae into stars, for example. That’s some sort of order.
Entropy can decrease without violating the second law of thermodynamics as long as it increases elsewhere in the system. After all, the second law of thermodynamics doesn’t say that entropy can’t decrease in certain parts of the system, but only that the total entropy of the system has a natural tendency to increase.
This being said, the overall entropy of the universe does not decrease. As stated above, entropy will tend to increase until it reaches its maximum levels and leads to heat death. This is a stationary state of thermodynamic equilibrium whose entropy is not only at its maximum but is also constant, and it will remain like this unless there is an energy input that reactivates the system.
The cycle could then repeat itself. With that new, additional energy doing work, there would be a portion of energy unable to do work that will turn into heat. This would increase the entropy of the system again. But where would that energy come from? What would lead the remaining leptons and photons, if any, to interact?
We won’t be there to see any part of this process, though. While it can be scary to think that all activity in the universe will cease someday, meaning the end of the world and everything else, scientists believe that the heat death will occur in about 10¹⁰⁰ years. So we can be calm for a while longer.