The article traces the evolution of the concept of entropy over 200 years. It originated in thermodynamics with Sadi Carnot and Rudolf Clausius as a measure of unusable energy and inevitable decay (the second law); Ludwig Boltzmann then recast it in terms of probability as a measure of disorder. Claude Shannon linked it to information theory, where entropy quantifies the uncertainty in a message. Modern physics, influenced by E.T. Jaynes, increasingly views entropy as observer-dependent: a measure of ignorance about a system's microstate rather than an intrinsic property of the system. This perspective connects information, energy, and the limits of knowledge, fueling research into information engines, quantum thermodynamics, and the fundamental nature of time.
This article from Quanta Magazine explores the concept of entropy, its historical evolution, and its implications beyond physics. Here are the key points:
- Entropy as Disorder: The concept traces back 200 years to Sadi Carnot's analysis of steam-engine efficiency; Rudolf Clausius later named it entropy, and it came to describe the universe's tendency toward disorder, as formulated in the second law of thermodynamics.
- Statistical Mechanics Perspective: Ludwig Boltzmann recast entropy in terms of probability, showing that systems drift from ordered to disordered states simply because there are vastly more disordered arrangements than ordered ones.
- Link to Information Theory: Claude Shannon connected entropy with uncertainty in communication, demonstrating that entropy measures the unpredictability of a message. This aligns with physics, where entropy represents ignorance of microscopic details.
- Subjectivity in Entropy: E.T. Jaynes later argued that entropy is observer-dependent—its value changes based on what an observer knows about a system. This idea led physicists to rethink entropy in terms of information and perspective.
- Implications for Science and Technology: Scientists are now exploring entropy in quantum systems, decision-making processes, and energy-harvesting devices. Modern experiments, such as quantum information engines, aim to harness entropy to improve computation and efficiency.
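Boltzmann's counting argument from the statistical-mechanics bullet above can be made concrete with a toy model (an illustration, not an example from the article): flip N coins and count how many microstates (exact head/tail arrangements) realize each macrostate (total number of heads). Mixed macrostates vastly outnumber ordered ones, and the Boltzmann entropy S = ln W (setting Boltzmann's constant k = 1) grows accordingly.

```python
from math import comb, log

# Toy model: N coins, each heads or tails. A "macrostate" is the number
# of heads; each distinct arrangement of coins is a "microstate".
N = 100
for heads in (0, 10, 50):
    W = comb(N, heads)  # number of microstates realizing this macrostate
    S = log(W)          # Boltzmann entropy S = k ln W, with k = 1
    print(f"{heads:3d} heads: W = {W:.3e}, S = {S:.2f}")
```

The all-heads macrostate has a single microstate (S = 0), while the 50/50 macrostate has on the order of 10^29, which is why a shuffled system is overwhelmingly likely to look disordered.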
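Shannon's measure from the information-theory bullet above can likewise be sketched in a few lines (again an illustration, not code from the article): the entropy of a message, in bits per symbol, is H = -Σ p·log2(p) over the observed symbol frequencies, and it is highest when the message is least predictable.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A repetitive message is fully predictable; a varied one is not.
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits per symbol: no uncertainty
print(shannon_entropy("abababab"))  # 1.0 bit per symbol: two equally likely symbols
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol: eight equally likely symbols
```

The parallel to physics is direct: just as Boltzmann's entropy counts the microstates consistent with a macrostate, Shannon's entropy counts (logarithmically) the messages consistent with the observed symbol statistics.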
The article ultimately reframes entropy as a measure of an observer's ignorance rather than an intrinsic property of the world, with implications for theories of time, information processing, and the future of technology.