Curt Jaimungal
10:24 · 10/14/25

Entropy Isn't One Thing!

TLDR

Entropy is not a single concept but rather a term with multiple distinct definitions used by physicists, leading to confusion and conflicting answers about a system's entropy.

Takeaways

Entropy is not a single concept, leading to multiple definitions and conflicting interpretations.

Clausius coined 'entropy' in 1865 as a precise name for a specific thermodynamic quantity.

The core debate on entropy reflects differing views on whether thermodynamics studies objective system properties or includes observer-dependent information.

Entropy is often misunderstood because the term is used in several different senses across physics, much as a common word carries multiple dictionary definitions. This ambiguity leads competent physicists to give conflicting answers when asked whether a system's entropy has changed. Clausius's original coining of the term for thermodynamics in 1865 differs significantly from later interpretations: the various notions of entropy are defined differently and stem from distinct conceptions of what thermodynamics itself is about.

The Problem of Definition

00:00:05 Conflicting answers arise when one asks whether a system's entropy decreases upon gaining more information about its physical state. Some argue yes, linking entropy to information; others contend no, holding that entropy is an objective property of a system independent of observer knowledge. The disagreement stems from the fact that 'entropy' is not a single, precisely defined term in physics, contrary to the intent behind most scientific coinages: there are in effect several distinct entropies (entropy sub 1, entropy sub 2, and so on) that are rarely distinguished clearly.

Clausius' Original Concept

00:01:49 Rudolf Clausius coined the term 'entropy' in 1865 from a Greek word meaning 'transformation,' deliberately making it sound similar to 'energy' due to their relatedness in thermodynamics. He intended it to be a precise, dignified name for a crucial quantity in thermodynamics, which was then a nascent field. If his original definition had been consistently applied, much of the present confusion about entropy might have been avoided.
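The quantity Clausius named is not written out in the episode summary, but it is standard: the entropy change of a system is the heat exchanged along a reversible process divided by the temperature at which the exchange occurs.

```latex
% Clausius's 1865 definition of entropy change:
% heat delta Q exchanged reversibly at absolute temperature T.
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta S = \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T}
\]
```

On this definition, entropy is a state function of the system, fixed by its macroscopic state, with no reference to any observer's knowledge.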

Thermodynamics as Resource Theory

00:06:28 The word 'thermodynamics,' coined by Kelvin, is often misunderstood; it derives from Greek words for 'heat' and 'power,' and its roots lie in the study of extracting useful mechanical work from heat, as pioneered by Carnot's work on heat engine efficiency. This perspective aligns with a 'resource theory,' a concept from quantum information theory, where agents utilize physical resources to achieve specific goals. This contrasts with 'physics proper,' which traditionally focuses on intrinsic properties of physical systems, independent of agents' knowledge or goals.

Entropy as Physical Property vs. Information

00:03:40 The debate over whether entropy is an intrinsic property or observer-dependent information stems from fundamental differences in how thermodynamics is conceived. If thermodynamics solely studies objective physical properties, then entropy should be independent of knowledge, much like a cup's rest mass. However, if it incorporates concepts related to information or an observer's understanding, then entropy could be influenced by gained information, revealing a deeper division in the understanding of the science itself.
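The two positions map onto two standard formulas (an illustration of the contrast, not drawn from the talk itself): the Clausius entropy is fixed by the macrostate, while the Gibbs entropy depends on the probability distribution an agent assigns to microstates.

```latex
% Gibbs entropy: a function of the probabilities p_i assigned
% to the system's microstates, so it depends on what is known.
\[
  S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i
\]
% Learning the microstate sharpens the distribution and lowers
% S_Gibbs, while the Clausius entropy of the macrostate is unchanged.
```

This is why gaining information can change one entropy and not the other, and why the answer to "did the entropy decrease?" depends on which conception of thermodynamics one adopts.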