By Marcel van der Veer
November 2017

Published in Science


Recently, at an event at my alma mater, the University of Nijmegen, I met my high school physics teacher. We discussed the chemistry curriculum and remarked that quite a few chemistry students struggle to grasp the fundamental principles of thermodynamics. Needless to say, thermodynamics is one of those subjects that serve a chemist for a lifetime. In my humble opinion, one reason for this lack of retention may be the level of abstraction in freshman courses, and this might be remedied by spending one or two introductory hours on the basic ideas of thermodynamics in their historical context.

Would such an approach work? A first attempt while tutoring thermodynamics at the undergraduate level proved encouraging. I present that first attempt here.


Thermodynamics is the science of the conversion of energy. It is a discipline where fundamental science and applied engineering go hand in hand. The field made important progress during the industrial revolution and the Napoleonic wars, the era in which Mozart and Beethoven composed their beautiful works. The steam engine had arrived, and constructing optimally efficient machinery posed a powerful economic and military incentive - two common drivers of innovation and scientific progress.

Considering its roots in engineering practice, it is understandable that classical thermodynamics applies to large-scale devices - to bulk matter at large time scales - and is not concerned with the microscopic structure of matter. Instead, this classical theory is expressed in abstract notions such as "free energy" or "entropy". The turn of the 18th to the 19th century was an era when ideas about the microscopic structure of matter were only just emerging. Ancient philosophers had proposed "atoms" as building blocks of matter, but the idea only gained currency when Dalton found that chemical elements react in fixed multiples of mass. Do not expect classical thermodynamics to explain, for instance, where entropy resides in a polymeric solution - in this theory entropy has no physical meaning, and molecules are simply not part of its concepts.

Today, however, students are taught in high school that matter consists of molecules built from atoms resembling miniature solar systems (elementary quantum mechanics comes only later in the high school curriculum). Our chemistry students have been accustomed from a young age to explaining the natural world in terms of atoms and molecules, and are then confronted with a theory that gives no mechanistic explanations based on small-scale structure. That requires a mental paradigm shift - and herein may lie the difficulty.

You might wonder why Newton's classical mechanics and his theory of gravity seem to be more easily "captured" by students. This may be psychological - people take for granted what is part of their day-to-day experience. Gravity explains the common experience that things fall to the ground, and the position of the Moon, thereby enabling the calculation of tide tables, et cetera. The difference with classical thermodynamics is that entropy, for instance, has no physical meaning, while gravity and the motion of celestial bodies are "tangible".

Interestingly, a "tangible" concept such as gravity is taken for granted, while few ask about its nature - what causes gravity? Gravity was not understood until Einstein explained it as mass making a dent in the universe, or more precisely, a "pit" in space-time by which one mass traps another.

Modern physics still struggles with the link between macroscopic gravity and the quantum world. It is the missing link on the way to the holy grail of physics - a single theory that would explain the relation between the four fundamental forces: gravity, electromagnetism, and the weak and strong nuclear forces. This would be a "theory of everything".

Nowadays we can link thermodynamic quantities to the behaviour of atoms and molecules. That beautiful field is called statistical mechanics or statistical thermodynamics, and it is built on classical mechanics and atomic theory. It incorporates the work of brilliant minds such as Liouville, Boltzmann and Gibbs, and it also makes abstract terms such as entropy more insightful. Statistical mechanics is the theory behind techniques such as molecular dynamics simulations run on modern supercomputers. Here, however, we will limit ourselves to classical thermodynamics.

Energy - work and heat

Energy is an abstract quantity that comes in two forms: work, which is useful "directed" energy that can exert a force and thereby displace a weight, thus performing net work; and heat, which is "undirected" energy that does no net work. Before the industrial revolution there was already consensus on the conservation of energy in an isolated system, from the empirical fact that no one could demonstrate perpetual motion. The insight that conservation of energy is an essential consequence of the time-invariance of physical laws came only much later, at the beginning of the 20th century, with Noether's theorem.

The sum of work and heat needed to create a system is called that system's internal energy. Its symbol is U and we do not know why - maybe because U is a linguistic variant of V, while in earlier days potential energy was called voltage, but really, we do not know. As you progress in the study of thermodynamics you will learn that under some circumstances you will want to add to U the work needed to make the system attain its volume and pressure. You can appreciate this when you consider that pressure is force per unit surface, or equivalently, energy per unit volume: pressure is an energy density. With pressure-volume work added, internal energy becomes enthalpy, with symbol H, most likely for Heat.
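In symbols, enthalpy is simply H = U + pV. A minimal sketch, with purely illustrative numbers (none of the values below are measured data):

```python
def enthalpy(internal_energy_j, pressure_pa, volume_m3):
    """H = U + p*V: internal energy plus the pressure-volume work
    needed to make room for the system at the given pressure."""
    return internal_energy_j + pressure_pa * volume_m3

# Illustrative values only: a system with some internal energy
# held at atmospheric pressure in a modest volume.
U = 3400.0      # internal energy in joules (illustrative)
p = 101325.0    # atmospheric pressure in pascal
V = 0.0227      # volume in cubic metres
H = enthalpy(U, p, V)
```

Note that p times V indeed carries units of energy (Pa x m^3 = J), which is the "pressure is an energy density" remark made concrete.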

Pushing the steam engine to its limit

The idea of a heat engine predates the industrial revolution. Huygens and Papin already discussed heat engines in the 17th century. Papin, a Frenchman, realised that steam could raise a piston and that gravity could reverse the motion once the steam condensed, allowing the piston to make strokes. Note that work was done by the vacuum introduced upon condensation. Papin's engine was not yet a practical steam engine, since water was boiled and condensed in the cylinder itself, making it very slow. Papin proposed improvements, but metalworking had not yet advanced enough to build an industrial engine. It was not yet feasible to machine tightly fitting cylinders and pistons, for instance.

As always, necessity is the mother of invention. Before the industrial revolution, England suffered from an energy crisis. Building and maintaining the navy on the one hand and burning wood as fuel on the other resulted in deforestation, and thus in wood becoming scarce. Coal was an alternative fuel, but production could not meet demand since many mines were flooded. Early in the 18th century, Newcomen managed to build a coal-fuelled steam pump, very much like Papin's set-up. These pumps drained coal mines, relieving the energy crisis. Since these pumps operated at a coal mine, fuel efficiency was not a priority.

Many believe that James Watt invented the steam engine. In fact, he improved steam pumps like the Newcomen engine. Watt found that, instead of cooling large parts of a Newcomen engine, a separate condenser for the steam should be installed to increase efficiency, while keeping the design versatile and easy to maintain. Better efficiency meant that steam engines could be operated away from a coal mine. Watt also invented an accompanying rotary engine that mechanised weaving, spinning and transport. So it is fair to say that Watt made the industrial revolution possible.

An important issue was the economics of mechanisation - for instance, what is the best efficiency a steam engine can achieve? Watt coined the term "horsepower" and knew there was a limit to the efficiency of the steam engine, meaning that not all heat could be converted into work, but he did not identify what determines the maximum efficiency. For that, we need to turn to England's opponent at the time, France.

In France, the young army officer and engineer Sadi Carnot worked to give his country a military advantage by designing optimally efficient steam engines. He varied all parts of a steam engine and determined the effect on efficiency. Eventually, he found that efficiency increases only with the temperature difference between boiler and condenser, relative to the temperature of the boiler. This temperature differential determines the maximum efficiency.
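In its modern form, stated with absolute temperatures in kelvin (a scale that postdates Carnot), his result can be sketched as follows:

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible into work by any engine
    operating between a hot and a cold body: 1 - T_cold / T_hot."""
    if not 0.0 < t_cold_k < t_hot_k:
        raise ValueError("need 0 < T_cold < T_hot, both in kelvin")
    return 1.0 - t_cold_k / t_hot_k

# Illustrative numbers: a boiler at 450 K and a condenser at 300 K
# give a maximum efficiency of one third.
eta = carnot_efficiency(450.0, 300.0)
```

The formula makes Carnot's finding explicit: the larger the temperature difference relative to the boiler temperature, the closer the efficiency gets to one, and no real engine can exceed this limit.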

Carnot generalised a steam engine to an abstract heat engine, now called a "Carnot engine", operating between a hot body (a boiler) and cold body (a condenser), and cleverly reasoned that if two optimally performing though different heat engines work between the same hot and cold body, their efficiencies must be the same.

Strangely enough, Carnot was initially ignored, even by fellow engineers. It took the better part of a century before his work was recognised, for instance by Clapeyron, who worked on the design of steam locomotives and who described the operation of the Carnot engine with what is now called the "Carnot cycle". This Carnot cycle is a central theme in modern thermodynamics courses.

Nowadays we have an ingenious reductio ad absurdum showing that if a superior heat engine more efficient than Carnot's limit existed, and a tandem were constructed by letting the superior engine drive an ideal Carnot engine as a heat pump, then that tandem could do net work while only taking heat from a single reservoir, in this case the hot body. Such a miracle, a perpetuum mobile of the "second kind", has never been observed and is therefore thought impossible - you cannot propel a ship merely by extracting heat from the water it floats on.

Carnot's work made clear an important asymmetry in nature: work can be completely degraded to heat, but heat cannot be completely converted into work. It takes work to convert "undirected" energy into "directed" energy. This is an example of engineering work leading to a profound scientific insight - a remarkable achievement for so young an engineer.

Efficiency of processes in general

If not all energy in a system is available to do useful work to keep industry humming, then an engineer will of course want to know what part of the energy can be converted into work. That amount is the "free energy": the maximum work you can get out of a process when you avoid leaks, friction and what have you. Note that in the Anglo-Saxon world the symbol for free energy is simply F, though in German-speaking countries the symbol is A, for Arbeit. As with internal energy, under some circumstances you must also include the work needed to give the system its volume and pressure. Free energy then becomes free enthalpy, with symbol G, for Gibbs.

The part of the energy not available to do useful work is related to a quantity called "entropy", from the Greek term for "in transition" or, if you want, "in degradation". Entropy is an abstract notion; it is not possible to assign a physical meaning to it as we can with internal energy. An amount of energy equal to temperature times entropy cannot be converted into useful work - it is hiding somewhere in your system in a form that cannot exert a force resulting in work. The symbol for entropy is S, and this is thought to be a tribute to Sadi Carnot.
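The bookkeeping above can be written down directly. A minimal sketch (the function names are mine; the formulas are the standard definitions F = U - TS and G = H - TS):

```python
def helmholtz_free_energy(u, t, s):
    """F = U - T*S: the maximum work extractable from a system
    at constant temperature and volume."""
    return u - t * s

def gibbs_free_energy(h, t, s):
    """G = H - T*S: the free enthalpy, the maximum non-pV work
    at constant temperature and pressure."""
    return h - t * s

# Illustrative numbers: of 100 J of internal energy at 300 K with
# entropy 0.1 J/K, only 70 J is "free" - T*S = 30 J is unavailable.
F = helmholtz_free_energy(100.0, 300.0, 0.1)
```

In both expressions the term T times S is exactly the energy that, as stated above, cannot be converted into useful work.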

In the mid-19th century this work was consolidated into two basic governing rules. Carnot's work was generalised to other processes. Scientists such as Clausius and Kelvin (William Thomson) declared conservation of energy to be the First Law, and variants of Carnot's and later insights to be the Second Law. The First Law says that energy can be neither created nor destroyed; the Second Law says that entropy can be created but not destroyed. Stated alternatively (and informally): you cannot extract more energy from a system than was put into it, and heat cannot be completely converted into work.

Spontaneous change - the fate of the universe

Entropy can decrease locally only when at least an equal amount is generated elsewhere in the system. Think for example of your refrigerator, which is a heat pump. The compressor does work and generates entropy, compensating for the entropy decrease caused by making heat flow from a cool place to a warmer one - a forced process, since heat naturally flows from warm to cool.
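The entropy bookkeeping for an idealised, reversible refrigerator can be sketched as follows; the temperatures and heat amounts are illustrative:

```python
def ideal_fridge(q_cold_j, t_cold_k, t_hot_k):
    """Reversible heat pump: the entropy extracted from the cold side,
    q_cold / t_cold, equals the entropy dumped into the warm side,
    q_hot / t_hot. Returns (heat dumped, compressor work needed)."""
    q_hot = q_cold_j * t_hot_k / t_cold_k
    work = q_hot - q_cold_j  # First Law: work makes up the difference
    return q_hot, work

# Extract 1000 J from a 275 K interior into a 295 K kitchen.
q_hot, w = ideal_fridge(1000.0, 275.0, 295.0)
```

Even in this best case the compressor must supply work: the heat dumped into the kitchen exceeds the heat taken from the interior, and the entropy generated by that work pays for the entropy removed from the cold side.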

Free energy is important in designing efficient engines, but it is also important in chemistry and physics. Through free energy, thermodynamics explains in which direction spontaneous processes will move. In a closed system at constant entropy and volume - connected to other systems but unable to exchange matter - the internal energy will decrease and be minimal at equilibrium. The system, so to speak, performs work on its surroundings until it is exhausted. An isolated system - not connected to other systems, with fixed internal energy and mass - will attain maximum entropy at equilibrium. In both cases the free energy decreases, going downhill until it reaches the lowest point of a valley in a mountainous landscape.

The universe is an isolated system with ever-increasing entropy, moving towards thermal equilibrium at uniform temperature, a so-called "heat death". The Second Law is the only principle in physics that gives a direction to time, which is why you remember the past but not the future. Contemplate this. Since Bekenstein and Hawking we know that even the darkest place in the universe, a black hole, has entropy and a corresponding temperature, meaning that it emits thermal radiation and is not completely black. A consequence is that a black hole slowly evaporates and eventually disappears, but do not wait around for this, since the process may take a googol years.

Concluding remarks

Having read this far, you may want to impress your friends with your insights. You should now be able to understand the following. On a microscopic scale, looking at a gas or fluid and seeing all the individual particles, you can appreciate the work involved in introducing or extracting just one atom or molecule. Now you know that this work equals a free energy, in this case the Gibbs free enthalpy per particle, a quantity known among chemists as the "chemical potential". In a metal, electrons fill energy bands, and the energy of an electron at the top of the filled bands is called the Fermi level of that metal. Explain that the Fermi level is the chemical potential of an electron, and your peers are likely to slip from their chairs.
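As a small sketch of the "per particle" bookkeeping: for a pure substance the chemical potential is simply the molar Gibbs free enthalpy divided by Avogadro's number. The formation energy quoted below is a standard tabulated value used only as an illustration:

```python
AVOGADRO = 6.02214076e23  # particles per mole

def chemical_potential(g_per_mole_j, n_moles=1.0):
    """Gibbs free enthalpy per particle, in joules:
    mu = G / N for a pure substance of N particles."""
    return g_per_mole_j / (n_moles * AVOGADRO)

# Liquid water's standard Gibbs energy of formation is about -237 kJ/mol,
# so per molecule the chemical potential is a tiny negative number.
mu = chemical_potential(-237.1e3)
```

The same division by particle count, applied to the electrons filling the bands of a metal, is what identifies the Fermi level with the chemical potential of an electron.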
