Revised 24 May 2019

There are many different forms and interpretations of the law of entropy that incorporate the statistical mechanics concept of entropy as a measure of order and disorder. In developing the kinetic theory of gases, Maxwell (1871) set up a model depicting the molecules of a gas as rebounding in all directions, and at all velocities, off each other and the walls of a perfectly elastic container. Along with Boltzmann (1877), he developed an equation showing that, at a particular temperature, the distribution of molecular velocities of a gas comprises a few molecules moving very slowly or very quickly, with the larger percentage moving at intermediate velocities. A rise in temperature causes the average velocity of the molecules to rise, an increase in their kinetic energy, while a drop in temperature causes the average velocity to fall. Mechanical work, as opposed to the haphazard motion of individual molecules against intermolecular forces, involves the orderly motion of molecules. Whenever mechanical work is dissipated into heat, the temperature of the system rises and the disorderly motion of molecules increases.
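The velocity distribution described above can be written out explicitly. A standard form is the Maxwell–Boltzmann speed distribution for a gas of molecules of mass m at absolute temperature T (k being Boltzmann's constant):

```latex
% Maxwell-Boltzmann speed distribution: f(v) dv is the fraction of
% molecules with speed between v and v + dv
f(v)\,dv \;=\; 4\pi \left(\frac{m}{2\pi k T}\right)^{3/2} v^{2}\, e^{-\,m v^{2}/2kT}\, dv
```

The factor v² suppresses very low speeds and the exponential suppresses very high ones, so the distribution peaks at intermediate speeds, and the peak shifts to higher speeds as T rises, exactly the qualitative picture sketched above.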

The kinetic theory of gases pictures temperature and heat as involving molecular movement, and it supports an interpretation of the Second Law of Thermodynamics as a tendency for nature to proceed from a state of order to one of disorder, thus providing a qualitative and supplementary understanding of the laws of classical thermodynamics.

The Law of Entropy can be stated as:

In spontaneous processes, concentrations tend to disperse, structure tends to disappear, and order becomes disorder.

There is a degree of similarity between statistical mechanics and quantum theory. Both theories have explanatory and predictive value relating to microscopic phenomena, and both utilise the theory of probability. However, by making use of probability theory, statistical mechanics allows the possibility of a body of water spontaneously separating into a hotter and a colder region. Human experience denies this possibility. According to statistical mechanics, even though this possibility is vanishingly small and may not occur within the time frame of any human experience, it nonetheless exists.
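Just how small this possibility is can be indicated in Boltzmann's own terms: a spontaneous fluctuation that lowers the entropy of an isolated system by an amount ΔS has a relative probability of roughly

```latex
% Relative probability of a fluctuation reducing entropy by Delta S,
% k being Boltzmann's constant (about 1.38 x 10^-23 J/K)
P \;\propto\; e^{-\,\Delta S / k}
```

Because the entropy change involved in a macroscopic separation of hot and cold water is enormous compared with k, the exponent is astronomically large and negative: the probability is non-zero, as statistical mechanics insists, but far too small ever to be witnessed.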

Shannon and Weaver (1949) also used the theory of probability in the development of information theory, in response to a study of how most efficiently to transmit a signal of information through telephone lines subject to noise interference. Shannon used the term 'entropy' to describe the measure of the 'amount' of information in the transmission of a signal, perhaps on the basis of its similarity in mathematical form to Boltzmann's equation for thermodynamic entropy. The information-theory use of the term 'entropy' would seem to be metaphorical rather than a relationship of process, because entropy in statistical mechanics is a measure of order and disorder, whereas in information theory the meaning of a message is irrelevant. The concept of entropy has also been utilised to define the arrow of time as being in the same direction as that followed by all natural processes, due to the dictates of the Second Law of Thermodynamics (Coveney & Highfield, 1990).
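The similarity in mathematical form referred to above can be seen by setting the two expressions side by side:

```latex
% Boltzmann's thermodynamic entropy: W is the number of microstates
% compatible with the macrostate, k is Boltzmann's constant
S = k \ln W

% Shannon's information entropy: p_i is the probability of the
% i-th symbol produced by the message source
H = -\sum_{i} p_i \log_{2} p_i
```

Both are logarithmic measures defined over probabilities: when all W microstates are equally probable, so that each p_i = 1/W, Shannon's expression reduces to log₂ W, which matches Boltzmann's formula apart from the constant and the base of the logarithm. The formal resemblance is thus exact, even though, as argued above, the two quantities measure quite different things.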

Georgescu-Roegen (1971) regards Boltzmann's statistical approach to entropy as opening the door to vacuous interpretations of what entropy means, and argues that statistical mechanics is logically flawed by being underpinned by classical mechanics, which denies qualitative change in the universe. The position adopted by Georgescu-Roegen is that entropy is reducible neither to locomotion, nor to probability, nor to any subjective element. According to Georgescu-Roegen (1971, p. 9):

“The entropic phenomenon of a piece of coal burning irrevocably into ashes is neither a flow of probability from a lower to a higher value, nor an increase in the onlooker's ignorance, nor man's illusion of temporal succession.”

The law of entropy was derived as a physics of the economic use of heat in an engine, and not from the principles of classical mechanics, which reduce all phenomena to reversible locomotion. As Georgescu-Roegen (1971, p. xiii) emphasises:

“...the discovery of the Entropy Law brought the downfall of the mechanistic dogma of classical physics which held that everything which happens in any phenomenal domain whatsoever consists of locomotion alone and, hence, there is no irrevocable change in nature.”

The First Law of Thermodynamics does not contradict the laws of mechanics, but the Second Law of Thermodynamics, the law of entropy, is in direct contradiction with the laws of classical mechanics, in that the law of entropy introduces the element of an irrevocable qualitative change whenever a system undergoes any process.