Temperature is a derived, not a fundamental, quantity depending on the atomic nature of matter

When school physics turns to Heat, the first item of business is the definition of temperature as a parameter that determines whether a body will warm or cool when placed in contact with another. The superfluous 'Zeroth' Law of Thermodynamics is asserted, the first step in an axiomatic thermodynamics that ignores the structure of matter. The inadequacy of this description soon becomes evident: it gives no hint of how to measure temperature, of the meaning of a temperature interval, or of how to proceed in certain difficult cases that come up later, for example with interacting systems or negative temperatures. This approach is actually quite difficult and ambiguous, leading to much inconclusive argument and mental confusion.

A much more profitable procedure begins with the recognition of bodies, or systems, that can be isolated, and that interact only weakly with other systems. This is true of electrically neutral samples of matter where gravity can be neglected, and which interact with a neighbouring system only through their interfaces or surfaces. Since the volume scales as L^{3}, and the surface as L^{2}, for a sufficiently large, compact system the volume energy dominates the surface energy. For such a body, the internal energy E is well-defined. We postulate that the body consists of many parts (atoms), in interaction with one another. All degrees of freedom are included, not just translation, and energy is conserved in all interactions. Many possible distributions of energy among the atoms give the same total energy, and the system is constantly undergoing redistribution of this energy among its parts. The logarithm of the number of possible states W for a given energy E and volume V of the system is a quantity called the entropy S. The fact that the logarithm is used, and the factor of proportionality is taken as unity, are only for later convenience, and have no fundamental significance.
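The counting behind S = ln W can be made concrete with a toy model. The short Python sketch below assumes an "Einstein solid" (not part of the argument above, purely illustrative): q identical energy quanta shared among N oscillators, for which W is a binomial coefficient. Many microscopic arrangements share one and the same total energy.

```python
from math import comb, log

def multiplicity(N, q):
    """Number of ways W to distribute q identical energy quanta
    among N oscillators (an Einstein-solid toy model)."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """S = ln W, with the factor of proportionality taken as unity."""
    return log(multiplicity(N, q))

# Many distributions of energy give the same total energy:
print(multiplicity(3, 4))   # 15 ways to share 4 quanta among 3 oscillators
print(entropy(100, 200))    # S grows rapidly with the size of the system
```

The numbers N and q here are arbitrary; the point is only that W is enormous for macroscopic N, which is why its logarithm is the convenient quantity.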

We can, therefore, conceive of a function S = S(E,V) that is essentially a consequence of the atomic nature of matter. Without the atoms, there is no S. This is not a unique function, but depends on the history of the system. We can make it definite by saying that we take it to be the S that results when the system is allowed to come into macroscopic equilibrium. For example, if we consider a gas of energy E and volume V suddenly allowed to expand into a larger volume V', S' = S(E,V') will be different from S = S(E,V). We now assert that in any such process in an isolated system S' > S. This means only that the system has taken advantage of access to a greater number of states. Of course, this is the Second Law of Thermodynamics, but here it follows from the statistics of many atoms rather than standing as a bare postulate. The entropy of an isolated system can only increase. Entropy, not energy, has the deciding role in determining the direction of a spontaneous process, such as a chemical reaction.
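The gas-expansion example can be put in numbers. Assuming the simplest counting, in which each atom's accessible positions scale with the volume so that W is proportional to V^{N} at fixed E (an illustrative assumption, ignoring the momentum part of the count, which does not change at fixed E), the entropy gain of a free expansion is a one-line calculation:

```python
from math import log

def delta_S_free_expansion(N, V, V_prime):
    """Entropy change (in units where the constant of proportionality
    is unity) when N ideal-gas atoms expand freely from volume V to V'
    at fixed energy E: each atom's accessible positions scale with the
    volume, so W'/W = (V'/V)**N and S' - S = N ln(V'/V)."""
    return N * log(V_prime / V)

dS = delta_S_free_expansion(1000, 1.0, 2.0)
print(dS)   # 1000 ln 2, about 693: positive, as the Second Law requires
```

Doubling the volume gives every atom twice the room, so the number of states grows by 2^{N} and S' > S for any V' > V.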

One must be careful to regard the energy E and the entropy S as functions of the whole system, not as some fluid quantities distributed around the system. One commentator has likened it to the beauty B of a painting, which cannot be considered as the summation of infinitesimal bits of beauty distributed over the canvas. The fact that most macroscopic bodies can be subdivided, and the whole considered as the sum of its parts, merely obscures the essence of E and S. At some point this procedure fails, and the unwary are trapped. Nevertheless, it is useful to distinguish E, S and V as *extensive* quantities that are, roughly speaking, proportional to the amount of matter. The continual increase in entropy does not mean that it is piling up somewhere as a giant grey mountain.

Let us now solve S = S(E,V) for E: E = E(S,V). Incidentally, for many systems this function has the useful property that E(λS,λV) = λE, that is, E is a homogeneous function of first order. However, we are chiefly concerned here with how E varies with S at constant V. This corresponds to an exchange of heat, as contrasted with an exchange of work. For small changes, dE = (dE/dS)dS, where for normal systems dE/dS > 0. Now suppose 1 and 2 are isolated bodies that are brought into contact, so they can exchange energy, perhaps through their surfaces. The interaction is assumed sufficiently weak that the separate identities of the bodies are not compromised. The total change in energy is dE = dE_{1} + dE_{2} = (dE/dS)_{1}dS_{1} + (dE/dS)_{2}dS_{2} = 0, so that (dE/dS)_{1} = (dE/dS)_{2}(-dS_{2}/dS_{1}). Let us suppose that dS_{1} > 0, which in normal cases means that system 1 receives energy, and system 2 loses an equal amount. Then dS_{2} must be negative, but may not be less than -dS_{1}, by the Second Law, so -dS_{2}/dS_{1} is at most 1 and (dE/dS)_{1} < (dE/dS)_{2}. In equilibrium, the two quantities will be equal.
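This argument can be checked numerically. The sketch below assumes an Einstein-solid toy model (q quanta among N oscillators, W a binomial coefficient; an illustration, not part of the argument above). It splits a fixed total energy between two bodies in every possible way and finds the split that maximizes the total entropy: the maximum lands where the energy per oscillator, and hence dE/dS, is the same in both bodies.

```python
from math import comb, log

def S(N, q):
    """Entropy S = ln W for q energy quanta among N oscillators."""
    return log(comb(q + N - 1, q))

N1, N2, q_total = 300, 200, 100

# Total entropy as a function of how the energy is divided:
S_tot = [S(N1, q1) + S(N2, q_total - q1) for q1 in range(q_total + 1)]
q1_star = max(range(q_total + 1), key=lambda q1: S_tot[q1])

# The most probable division gives each body an equal share per
# oscillator, i.e. equal dE/dS -- equal temperature:
print(q1_star, q1_star / N1, (q_total - q1_star) / N2)   # 60 0.2 0.2
```

Starting from any other division, every step toward q1_star raises the total entropy, which is the direction-of-heat-flow argument of the text in discrete form.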

The quantity kT = dE/dS can be called the Temperature, since it has all the necessary properties of this magnitude as defined in school physics. However, we now have a definition that is independent of the properties of any substance, that gives meaning to temperature intervals and even to the absolute value of the temperature, and which can be extended to unusual cases (such as when dE/dS < 0). The constant k has the dimensions of energy per temperature unit. If the temperature unit is the kelvin K, and the energy unit the joule J, then k = 1.381 x 10^{-23} J/K, called *Boltzmann's constant*. It is usual to include k in the definition of the entropy, S = k ln W, where W is the *thermodynamic probability*. The kelvin, incidentally, is defined so that the triple point of water lies at exactly 273.16 K; its size is, for practical purposes, 1/100 of the temperature interval between the ice point and the steam point.
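The defining relation kT = dE/dS can be exercised directly. Assuming an Einstein-solid toy model (q quanta among N oscillators), add one quantum of energy and divide by the resulting entropy change. The quantum size eps below is an arbitrary illustrative value, not a physical constant, and the model itself is an assumption made for the sake of the example.

```python
from math import comb, log

k = 1.381e-23    # Boltzmann's constant, J/K
eps = 1.0e-21    # assumed size of one energy quantum, J (illustrative)

def S(N, q):
    """Entropy S = k ln W for q quanta among N oscillators."""
    return k * log(comb(q + N - 1, q))

# Temperature from the defining relation kT = dE/dS (here written with
# k absorbed into S, so T = dE/dS directly), estimated by the finite
# difference of adding a single quantum:
N, q = 100, 50
T = eps / (S(N, q + 1) - S(N, q))
print(T)   # an absolute temperature in kelvin, no thermometer required
```

No property of any particular substance enters: the temperature comes out of the energy-entropy bookkeeping alone.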

The derivative of E with respect to V at constant S is -p, where p is the pressure exerted by the system on its boundary. There are other ways in which work can be done by or on the system, but this is a good example. We now have dE = TdS - pdV, which is the differential form of the First Law of Thermodynamics. By considering the atomic structure of matter, we can bypass completely the sterile postulatory basis of axiomatic thermodynamics, and gain a deeper insight into what is going on, as well as an appreciation of the limitations of our analysis.
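The two partial derivatives can be verified numerically for a monatomic ideal gas. Inverting the Sackur-Tetrode entropy gives E(S,V) proportional to (N/V)^{2/3} e^{2S/3N}; the sketch below works in units with k = 1 and sets the constant prefactor to one (assumptions that affect only the scale, not the structure of the derivatives). Finite differences then reproduce T = dE/dS and p = -dE/dV, consistent with E = (3/2)NT and pV = NT.

```python
from math import exp

N = 1000.0   # number of atoms (illustrative)

def E(S, V):
    """E(S,V) for a monatomic ideal gas, from the inverted
    Sackur-Tetrode entropy; prefactor and k set to 1."""
    return (N / V) ** (2.0 / 3.0) * exp(2.0 * S / (3.0 * N))

S0, V0, h = 2000.0, 5.0, 1e-5
T = (E(S0 + h, V0) - E(S0 - h, V0)) / (2 * h)     # T = dE/dS at constant V
p = -(E(S0, V0 + h) - E(S0, V0 - h)) / (2 * h)    # p = -dE/dV at constant S

# Consistency with the familiar ideal-gas relations:
E0 = E(S0, V0)
print(abs(T - 2 * E0 / (3 * N)))    # E = (3/2) N T, so T = 2E/3N
print(abs(p - 2 * E0 / (3 * V0)))   # p V = N T,     so p = 2E/3V
```

Both printed residuals are at the level of finite-difference error, confirming that dE = TdS - pdV holds term by term for this E(S,V).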

Judging from a good school physics reference (Whelan and Hodgson), all material on entropy is beyond "A" level. Even that material does not contain the argument given here, and does not show that a statistical definition of temperature is related to the direction of heat flow. Only the concept of entropy is needed above as one of the variables determining the internal energy, not its explicit calculation by statistical mechanics. The Second Law of Thermodynamics is actually much more easily understood than the First Law. It is a shame that thermodynamics is made so difficult and inaccessible to the student.


Composed by J. B. Calvert

Created 23 June 2000

Last revised