Monday, December 16, 2024

What Is Entropy? A Measure of Just How Little We Really Know.

https://www.quantamagazine.org/what-is-entropy-a-measure-of-just-how-little-we-really-know-20241213/


[[This is the introduction to a fascinating article, which reveals that even a concept as fundamental and well studied as entropy is mired in doubts and disagreements, showing the state of theoretical science to be very immature indeed. I strongly recommend reading the whole article.]]


Life is an anthology of destruction. Everything you build eventually breaks. Everyone you love will die. Any sense of order or stability inevitably crumbles. The entire universe follows a dismal trek toward a dull state of ultimate turmoil.

To keep track of this cosmic decay, physicists employ a concept called entropy. Entropy is a measure of disorderliness, and the declaration that entropy is always on the rise — known as the second law of thermodynamics — is among nature’s most inescapable commandments.
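
[[For readers who want the equation behind this claim, here is Boltzmann's statistical formulation, which the article does not quote: the entropy $S$ of a macroscopic state counts the number $W$ of microscopic arrangements compatible with it,

$S = k_B \ln W,$

and the second law states that for an isolated system $\Delta S \ge 0$, where $k_B$ is Boltzmann's constant. More compatible arrangements means more "disorder" and higher entropy.]]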

I have long felt haunted by the universal tendency toward messiness. Order is fragile. It takes months of careful planning and artistry to craft a vase but an instant to demolish it with a soccer ball. We spend our lives struggling to make sense of a chaotic and unpredictable world, where any attempt to establish control seems only to backfire. The second law demands that machines can never be perfectly efficient, which implies that whenever structure arises in the universe, it ultimately serves only to dissipate energy further — be it a star that eventually explodes or a living organism converting food into heat. We are, despite our best intentions, agents of entropy.
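
[[The efficiency claim above is the Carnot bound, standard thermodynamics rather than a quotation from the article: a heat engine running between a hot reservoir at temperature $T_h$ and a cold one at $T_c$ can convert at most a fraction

$\eta \le 1 - T_c / T_h$

of the absorbed heat into work, which is strictly less than 1 whenever $T_c > 0$.]]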

“Nothing in life is certain except death, taxes and the second law of thermodynamics,” wrote Seth Lloyd, a physicist at the Massachusetts Institute of Technology. There’s no sidestepping this directive. The growth of entropy is deeply entwined with our most basic experiences, accounting for why time runs forward and why the world appears deterministic rather than quantum mechanically uncertain.

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.

As physicists have worked to unite seemingly disparate fields over the past century, they have cast entropy in a new light — turning the microscope back on the seer and shifting the notion of disorder to one of ignorance. Entropy is seen not as a property intrinsic to a system but as one that’s relative to an observer who interacts with that system. This modern view illuminates the deep link between information and energy, which is now helping to usher in a mini-industrial revolution on the smallest of scales.
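
[[To make the "entropy as observer-relative ignorance" view concrete, here is a minimal sketch of my own, not taken from the article. Shannon entropy measures an observer's missing information about a system, so two observers with different knowledge of the same coin assign it different entropies; Landauer's principle then gives the information-energy link the article alludes to. The coin probabilities below are purely illustrative.]]

import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    It quantifies how much the observer does NOT know about the outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An observer who only knows the coin is "fair" assigns 50/50:
print(shannon_entropy_bits([0.5, 0.5]))    # 1.0 bit of ignorance

# A better-informed observer who knows the coin is biased 90/10
# describes the very same coin with lower entropy:
print(shannon_entropy_bits([0.9, 0.1]))    # about 0.469 bits

# Landauer's principle connects information to energy: erasing one bit
# dissipates at least k_B * T * ln(2) joules of heat.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
print(k_B * T * math.log(2))   # about 2.87e-21 J per erased bit

[[The same system thus carries less entropy for the better-informed observer, which is exactly the shift from "disorder" to "ignorance" the article describes.]]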

Two hundred years after the seeds of entropy were first sown, what’s emerging is a conception of this quantity that’s more opportunistic than nihilistic. The conceptual evolution is upending the old way of thinking, not just about entropy, but about the purpose of science and our role in the universe.