Entropy as a Measure of Order & Disorder
Revised 24 May 2019
There are many different forms and interpretations of the law of entropy that incorporate the statistical mechanics concept of entropy as a measure of order and disorder. In developing the kinetic theory of gases, Maxwell (1871) set up a model depicting the molecules of a gas as rebounding in all directions and at all velocities off each other and off the walls of a perfectly elastic container. Along with Boltzmann (1877), he developed an equation showing that, in a gas at a given temperature, only a few molecules move very slowly or very quickly, with the larger proportion moving at intermediate velocities. A rise in temperature causes the average velocity of the molecules, and hence their kinetic energy, to rise, while a drop in temperature causes the average velocity to fall. Mechanical work, as opposed to the haphazard motion of individual molecules against intermolecular forces, involves the orderly motion of molecules. Whenever mechanical work is dissipated into heat, the temperature of the system rises and the disorderly motion of its molecules increases.
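The distribution Maxwell and Boltzmann arrived at can be written in its standard modern form (supplied here for illustration; the notation is not drawn from the sources cited above). The fraction of molecules of mass m found at speed v at absolute temperature T is

\[
f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2 \exp\!\left(-\frac{m v^2}{2 k_B T}\right),
\]

where k_B is Boltzmann's constant. The factor v^2 suppresses very low speeds and the exponential suppresses very high ones, which is why most molecules cluster at intermediate speeds; raising T shifts the peak of the distribution toward higher speeds.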
The kinetic theory of gases thus pictures temperature and heat in terms of molecular movement, and it supports an interpretation of the Second Law of Thermodynamics as a tendency for nature to proceed from a state of order to disorder, providing a qualitative, supplementary understanding of the laws of classical thermodynamics.
The Law of Entropy can be stated as:
In spontaneous processes, concentrations tend to disperse, structure tends to disappear, and order becomes disorder.
There is a degree of similarity between statistical mechanics and quantum theory. Both theories have explanatory and predictive value for microscopic phenomena, and both make use of the theory of probability. However, by resting on probability theory, statistical mechanics admits the possibility of a body of water spontaneously separating into a hotter and a colder region. Human experience denies this possibility. According to statistical mechanics, even though the probability is vanishingly small and the event may never occur within the time frame of any human experience, the possibility nonetheless exists.
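The scale of this improbability can be made concrete with Boltzmann's relation between entropy and the number of microstates W (a standard textbook estimate, supplied here rather than drawn from the sources cited):

\[
S = k_B \ln W, \qquad \frac{W_{\text{separated}}}{W_{\text{mixed}}} = e^{(S_{\text{separated}} - S_{\text{mixed}})/k_B}.
\]

Because a macroscopic separation into hot and cold regions lowers entropy by an enormous multiple of k_B, even a decrease as small as 10^{-10} J/K makes the relative probability of order e^{-10^{13}}: not zero, but inconceivably small on any human timescale.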
Shannon and Weaver (1949) also used the theory of probability in developing information theory, in response to the problem of how to transmit a signal most efficiently through telephone lines subject to noise interference. Shannon used the term 'entropy' to describe the measure of the 'amount' of information carried by a signal, perhaps on the basis of its similarity in mathematical form to Boltzmann's equation for thermodynamic entropy. The information-theoretic use of the term 'entropy' would seem to be metaphorical rather than a relationship of process: in statistical mechanics entropy is a measure of order and disorder in a physical system, whereas in information theory the meaning of a message is irrelevant and only the statistics of the signal matter. The concept of entropy has also been used to define the arrow of time as pointing in the direction followed by all natural processes, as dictated by the Second Law of Thermodynamics (Coveney & Highfield, 1990).
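The formal resemblance Shannon may have had in mind can be seen by setting the two expressions side by side (standard forms, supplied here for comparison):

\[
H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon entropy, bits per symbol)},
\]
\[
S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs entropy, J/K)}.
\]

The two differ only by the constant k_B and the base of the logarithm, but in the first the p_i are probabilities of symbols in a message and in the second they are probabilities of microstates of a physical system, which is why the shared name reads as a formal analogy rather than an identity of process.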
Georgescu-Roegen (1971) argues that Boltzmann's statistical approach to entropy opens the door to vacuous interpretations of what entropy means, and that statistical mechanics is logically flawed by being underpinned by classical mechanics, which denies qualitative change in the universe. The position adopted by Georgescu-Roegen is that entropy is reducible neither to locomotion, nor to probability, nor to any subjective element. According to Georgescu-Roegen (1971, p. 9):
“The entropic phenomenon of a piece of coal burning irrevocably into ashes is neither a flow of probability from a lower to a higher value, nor an increase in the onlooker's ignorance, nor man's illusion of temporal succession.”
The law of entropy was derived from the physics of the economic use of heat in an engine, not from the principles of classical mechanics, which reduce all phenomena to reversible locomotion. As Georgescu-Roegen (1971, p. xiii) emphasises:
“...the discovery of the Entropy Law brought the downfall of the mechanistic dogma of classical physics which held that everything which happens in any phenomenal domain whatsoever consists of locomotion alone and, hence, there is no irrevocable change in nature.”
The First Law of Thermodynamics does not contradict the laws of mechanics, but the Second Law of Thermodynamics, the law of entropy, stands in direct contradiction with the laws of classical mechanics: the law of entropy introduces an irrevocable qualitative change whenever a system undergoes any process.
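The contradiction can be stated formally (a standard observation, added here for clarity rather than taken from the sources above). Newton's equation of motion is unchanged by the substitution t → −t, so every mechanical trajectory run backwards is an equally valid solution, whereas the entropy law for an isolated system admits no such reversal:

\[
m\,\frac{d^2 x}{dt^2} = F(x) \quad \text{(invariant under } t \to -t\text{)}, \qquad \frac{dS}{dt} \ge 0 \quad \text{(singles out a direction of time)}.
\]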