Primer on Entropy - Part III C LO20049

AM de Lange (amdelange@gold.up.ac.za)
Tue, 24 Nov 1998 13:21:33 +0200

Replying to LO19979 --

PART III -- THE GENESIS OF ENTROPY PRODUCTION (Continued)

Entropy of molecules
--------------------------
Through the equation F = E - TS, Gibbs made it possible for chemists to
pursue Chemical Thermodynamics. They already knew many so-called
"equilibrium constants" (solubility, acid strength, redox potential,
etc). But it was Gibbs' Systems Thinking which gave them the common web
to see how all these constants are related. Gibbs' work also helped
them to calculate something very important, namely the thermodynamical
properties of elements and compounds, including their standard
entropy. This was a slow and mind-boggling process.

Here is a very interesting result -- a pattern which occurs with regularity
among chemical compounds. The standard entropy (in units J/K/mol) of the
compounds (gases) methane (CH4), ethane (C2H6), propane (C3H8) and butane
(C4H10) are as follows:

  CH4     C2H6    C3H8    C4H10
  186.2   229.5   269.9   310.0

A CH4 (methane) molecule has only one C (carbon) atom in its centre to
which 4 H (hydrogen) atoms are bonded. But a C3H8 (propane) molecule has
three carbon atoms C as its backbone. At each of the end C atoms three H
atoms are bonded while at the C atom in the centre two H atoms are bonded.
Thus C3H8 has a much richer internal structure than CH4. This increase
in internal organisation or order comes in -CH2- units. It is also
reflected by an increase of approximately 41J/K/mol in entropy for each
-CH2- unit.
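The pattern above can be checked with a few lines of Python -- a small sketch using just the four standard entropies quoted in the text:

```python
# Standard entropies (J/K/mol) of the gaseous alkanes quoted above.
entropies = [186.2, 229.5, 269.9, 310.0]   # CH4, C2H6, C3H8, C4H10

# Difference between each compound and the previous one,
# i.e. the entropy gained per additional -CH2- unit.
increments = [round(b - a, 1) for a, b in zip(entropies, entropies[1:])]
print(increments)  # -> [43.3, 40.4, 40.1], each close to 41 J/K/mol
```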

But what about the step from the gas C4H10 (butane) to the liquid
C5H12? (We will consider only the normal chains.) The values of the
standard entropy (J/K/mol) calculated from empirical data are:

  C4H10(gas)   C5H12(liq)   C6H14(liq)
  310.0        262.8        303.5

Students are taught in terms of the fixed chaos interpretation that the
entropy of C5H12 is LOWER since it is a liquid and not a gas (less
disorder is involved). On the face of it, there is nothing wrong with
this "explanation". But when we compare C5H12 and C6H14, which are
BOTH liquids, the increase is again 40.7J/K/mol, something which
cannot be explained by the chaos interpretation.

Let us see if we can find a "chaos+order" explanation for the lowering
between C4H10 and C5H12. The unit J/K/mol refers to the entropy of one
"mole" of molecules. (One mole of entities is roughly a million times a
billion times a billion individual entities, i.e.
1[mol]=6x10^23[units]. The unit [mol] is a counting unit like
1[dozen]=12[units] or 1[gross]=144[units].) What exactly are the "units"
involved? In the gaseous state one C4H10 molecule is the "unit of whole".
Except at the moment of collision, the molecules are independent of each
other. But in the liquid state at least two molecules are closely
attracted to each other to form a more complex, yet looser, unit. Let us
consider the "liquid molecule" to consist of at least two "gaseous
molecules". Thus the standard entropy of C5H12 per mol of "bimolecular"
molecules is 525.6J/K/mol (by doubling the amount). Clearly, the jump of
the entropy of mono-molecular molecules (C4H10) in gases to (at least)
bimolecular molecules (C5H12), is 215.6J/K/mol. This is an increase of
entropy! It consists of two increases. The one increase is 82J/K/mol for
the addition of another -CH2- unit (2x41J/K/mol). The other increase is
133.6J/K/mol (215.6-82.0) for the change of the simple gaseous
organisation to the more complex liquid organisation.
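The arithmetic of this "chaos+order" explanation can be laid out step by step. Here is a small Python sketch of it, using only the numbers from the text:

```python
# Standard entropies from the table above (J/K/mol).
S_C4H10_gas = 310.0   # butane, gaseous, per mole of single molecules
S_C5H12_liq = 262.8   # pentane, liquid, per mole of single molecules

# Treat each liquid "unit" as (at least) two gaseous molecules,
# so double the amount to get entropy per mole of bimolecular units.
S_bimolecular = 2 * S_C5H12_liq              # 525.6 J/K/mol

# The jump from monomolecular gas units to bimolecular liquid units:
jump = S_bimolecular - S_C4H10_gas           # an INCREASE of 215.6 J/K/mol

# Split the jump into its two parts:
extra_CH2 = 2 * 41.0                  # two more -CH2- units: 82.0 J/K/mol
organisation = jump - extra_CH2       # gas -> liquid organisation: 133.6
print(round(jump, 1), round(organisation, 1))  # -> 215.6 133.6
```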

Some people still try to argue that the increase in entropy has nothing
directly to do with the increase in organisation of the molecules. They
say that the increase is merely due to the increase in molar mass (mass
per mole) of the compounds. But what is mass itself? Is the mass m not
according to E=mc^2 the "freezing" of electromagnetic energy E into a
space-time point? Is this not an increase in organisation?

Entropy production in general for physical world
-----------------------------------------------------------
This is our penultimate section. Thus the end is in sight.

I will now have to introduce one more symbol to make things easier for all
of us. When I think of "change in entropy S", I will write "/\S". When
you see "/\S", you then think of "change in entropy S". It is as easy as
that.

Let us consider our experimental bar once again.
(a)=<=========<==========<=(b)
300K                      400K
+1200J                  -1200J
+4J/K                    -3J/K

Please note that the shorthand for "change in entropy at 'a'" is
"/\Sa". Likewise let Q be the symbol for "heat flow" and Ta be the
symbol for the absolute temperature T at "a".

We have seen that:
change in entropy at "a" is heat flow divided by temperature, or
/\Sa = Q/Ta
= +1200J/300K
= +4J/K
Similarly
/\Sb = Q/Tb
= -1200J/400K
= -3J/K
Thus, if /\S symbolises the total entropy production, then
/\S = /\Sa + /\Sb
= +Q/Ta - Q/Tb
= +1200J/300K -1200J/400K
= +1J/K
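The bookkeeping for the bar can be written out as a short Python sketch, with the same numbers as above:

```python
# Heat flow and end temperatures of the experimental bar.
Q  = 1200.0   # heat flowing from "b" to "a", in J
Ta = 300.0    # absolute temperature at "a", in K
Tb = 400.0    # absolute temperature at "b", in K

dS_a = +Q / Ta          # entropy gained at "a": +4 J/K
dS_b = -Q / Tb          # entropy lost at "b":  -3 J/K
dS_total = dS_a + dS_b  # total entropy production: +1 J/K
print(dS_a, dS_b, dS_total)  # -> 4.0 -3.0 1.0
```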

Let us go over it again. The equation for calculating the total
entropy production when heat Q flows from a higher temperature Tb to a
lower temperature Ta is given by

/\S = Q/Ta - Q/Tb

The two parts Q/Ta and Q/Tb of the equation are called "terms" because
they are separated by the signs +/- for "addition/subtraction". If we
have two parts separated by the sign x for "multiplication", we call
them factors. Can we arrange the equation above into factors? Yes, we
make use of the distributive law of algebra which gives:

/\S = [1/Ta - 1/Tb]xQ

The one factor is [1/Ta - 1/Tb] and the other factor is Q. They are
"complementary duals". (The factor [1/Ta - 1/Tb] itself now consists
of two terms!)

Have we gained anything? Yes, very much. The one factor Q is the "heat
flow" or a kind of "becoming". We may easily think of the other factor
[1/Ta - 1/Tb] as a kind of "being". We will now call the "becoming"
(heat flow Q) an "entropic flux" (or a thermodynamic flux). We will
furthermore call the "being" (the factor [1/Ta - 1/Tb]) an "entropic
force" (or a thermodynamic force). Thus, when we look at the equation

/\S = [1/Ta - 1/Tb]xQ

we see that it is made up of an "entropic force-flux pair" (or a
thermodynamic force-flux pair).
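It is easy to check numerically that the "terms" arrangement and the "factors" (force-flux) arrangement give the same entropy production. A quick Python sketch:

```python
Q, Ta, Tb = 1200.0, 300.0, 400.0   # values from the experimental bar

terms   = Q/Ta - Q/Tb              # arranged as two terms
factors = (1/Ta - 1/Tb) * Q        # entropic force x entropic flux

# Both arrangements give +1 J/K (up to floating-point rounding).
print(round(terms, 6), round(factors, 6))  # -> 1.0 1.0
```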

Prigogine uses the word "thermodynamic", but I prefer the word "entropic"
because it is a closer description of what we are trying to comprehend.

The words "thermodynamic force" and "thermodynamic flux" are post-WWII
terms, known to relatively few physicists and chemists, namely those
who have studied irreversible thermodynamics. They were not known to
any physicist or chemist before WWII. Why not?

Before WWII nobody realised that the entropy production could be
calculated by

/\S = [1/Ta - 1/Tb]xQ

when the flow of heat is the ONLY irreversible process taking place,
as in our experimental bar. Thermodynamicists had to make calculations
based on many formulas for a complex path of reversible processes to
determine entropy changes for any process in which more than merely
heat flows. In chemistry these complex paths of reversible processes
become so hideous that most students in physical chemistry develop an
aversion to chemical thermodynamics. Please have compassion for any
lecturer in physical chemistry -- they live in a nightmare.

But what happens when more forms of energy than merely heat begin to
flow? What happens when these forms of energy are converted from one
to another? Can we still use the equation

/\S = [1/Ta - 1/Tb]xQ

for the total entropy production? No. This equation can be used only for
the contribution of the heat flow to the total entropy production. How can
we then calculate each of the other contributions to the total entropy
production?

For eighty years since the days of Clausius (1865) up to 1945, nobody
asked this question. Why not? Because almost all physicists, chemists and
engineers considered the real physical world to be made up of
reversible, conservative systems. When they allowed for irreversibility,
it was as a virtual complication upon this viewpoint. But in 1945 a young
man, Ilya Prigogine, began to look at the whole physical universe from a
different viewpoint. Irreversibility was the reality and reversibility
was a virtuality imposed on it. So he began to try and discover how
entropy is produced. Eventually, by making use of Gibbs' equation and LEC,
he was able to derive the general equation for entropy production. The
equation is hideous. Thus it is much better to try to understand how the
general equation works, as follows.

Already in the previous century, scientists noticed that when a system is
divided in half, some of its properties (measurable quantities) also
divide in half while others stay the same. For example, when we divide an
electrical dry cell (torch battery) in half, some quantities like the
mass, volume and charge for the two halves will be half of that for the
whole, but the density, temperature and voltage will be the same for the
two halves as for the whole. Those quantities which divide (becomings) are
called extensive quantities while those which stay the same (beings) are
called intensive quantities. For almost a century this classification
between extensive and intensive quantities remained a curiosity. Nobody
even thought that this classification is vitally important to the
functioning and organisation of the physical universe. (Mathematically,
such a relationship is called an Euler function. Its physical basis is the
very quantum effect itself.)

But because of Prigogine's ground-breaking work, it is now clear why it
is so important.
Every form of energy is made up by two factors, the one intensive (being)
and the other one extensive (becoming). These two factors can be
measured and their product calculated to give the amount of that
form of energy. For example, electrical energy is made up by the voltage
factor (intensive) and the charge factor (extensive). Expansion energy is
made up by the pressure P factor (intensive) and the volume V factor
(extensive).

Here then is the key to some great insights: A difference in the
intensive factor of a form of energy gives rise to an entropic force
while the change in the corresponding extensive factor gives rise to
the complementary entropic flux. Thus we have a series of entropy
production equations, one for each form of energy. Here are three such
equations, beginning with the faithful horse of Clausius:

/\Sheat = [1/Ta - 1/Tb]x/\thermal_energy
/\Sexpand = [(press/T)a - (press/T)b]x/\volume
/\Selectric = [(volt/T)a - (volt/T)b]x/\charge

Remember that the /\ means "change". The second equation says that the
change in entropy for expansion energy, namely "/\Sexpand", is equal
to the entropic force "[(press/T)a - (press/T)b]" multiplied by the
entropic flux "/\volume". The entropic flux "/\volume" is nothing else
than a "change in volume". The entropic force "[(press/T)a -
(press/T)b]" is made up by the difference in two terms, namely the
term "(press/T)a" or "pressure divided by absolute temperature for
region a" and the term "(press/T)b" or "pressure divided by absolute
temperature for region b". The third equation does the same thing for
electrical energy.
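As an illustration only, here is the second equation worked through in Python. The pressures, temperatures and volume change below are invented numbers, chosen merely to exercise the force-flux form; they come from no experiment:

```python
# Hypothetical values, for illustration only.
Pa, Ta = 120000.0, 300.0   # pressure (Pa) and temperature (K) in region a
Pb, Tb = 100000.0, 300.0   # pressure (Pa) and temperature (K) in region b
dV = 0.001                 # change in volume (m^3), the entropic flux

force = Pa/Ta - Pb/Tb      # entropic force: [(press/T)a - (press/T)b]
dS_expand = force * dV     # contribution to entropy production, J/K
print(round(dS_expand, 4))  # -> 0.0667
```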

In other words, each of these equations for a form of energy "s" has
the following form

/\Ss = [(ints)a - (ints)b]x/\exts

where "(ints)a" symbolises "intensive quantity of energy form 's'
divided by temperature for region a" and "/\exts" symbolises "change
in extensive quantity of energy form 's'". Prigogine shortens this
symbolism even further for an energy form "s" by writing

/\Ss = XsJs

where Xs is the thermodynamical (entropic) force and Js is the
thermodynamical (entropic) flux. Thus the total entropy production is
given by

/\S = Sum[/\Ss] = Sum[XsJs]

Remember that for energy form "s" the entropic force Xs is given by
the difference in being "[(ints)a - (ints)b]" while the entropic
flux Js is given by the becoming flow "/\exts". You will probably
remember the form of the equations best by the following expression:

/\Ss = (entropic force s)x(entropic flux s)

so that

/\S = Sum[/\Ss]

and hence NB NB

/\S = Sum[(entropic force s)x(entropic flux s)]
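A minimal sketch of this summation in Python. The heat pair uses the bar's numbers, while the expansion and electrical (force, flux) pairs are invented values, labelled as such, just to show the form:

```python
# (entropic force, entropic flux) pairs, one per form of energy.
pairs = [
    (1/300 - 1/400, 1200.0),  # heat: [1/Ta - 1/Tb] x Q (the bar's numbers)
    (66.67, 0.001),           # expansion: hypothetical force and flux
    (0.02, 3.0),              # electrical: hypothetical force and flux
]

# Total entropy production: Sum[(entropic force s)x(entropic flux s)]
dS_total = sum(force * flux for force, flux in pairs)
print(round(dS_total, 3))  # -> 1.127
```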

I have given only three such partial entropy production equations. Tens
of others could be described. In ALL of them the absolute temperature T
occurs in the "denominators" of the differences making up the entropic
forces. In only ONE of them does the pure number "1" rather than an
intensive quantity like pressure or voltage occur in the "numerator",
namely the faithful old horse of Clausius. What does this "1" mean? Below
the level of molecules, i.e. at the level of "subatomic particles" (nuclei
and electrons), or at an even lower level of "fundamental particles",
there are
no definite stable units. It is only at the level of basic chemistry
(atoms, molecules and ions) that some units are stable, particularly in
gases. Thermal energy becomes measurable at this level through
temperature. At a higher level of complexity we measure quantities like
pressure and volume, voltage and current, chemical potential and mole
amounts. This means that in the entropy production equation the one "1"
(for thermal energy) gets generalised into a hierarchy of intensive
quantities (like pressure or voltage) for all the other forms of energy
which have emerged.

Jan Smuts, the father of holism who wrote the book "Holism and Evolution"
already in 1926, was deeply impressed by the role of "evolution
fields". What are these "evolution fields" in terms of what we know now?
They are nothing else than the entropic forces given by the being
difference "[(ints)a - (ints)b]". Can you visualise in your mind how, for
two regions "a" and "b", their difference in "intensive quantity of an
energy form divided by temperature" sets up what he calls an "evolution
field"? Furthermore, can you visualise in your mind what the ancient Greek
Heraclitus meant 2500 years ago by "panta rhei" -- everything flows?
Whatever you visualise, never forget that both the "entropic force" and the
"entropic flux" of an energy form have to exist before entropy is produced.
Only one (either force or flux) is not enough.

Exactly how does this entropy production (or irreversibility) given by
/\S = Sum[(entropic force s)x(entropic flux s)]
cause the change in organisation (chaos and order) of the physical
world? This is no longer a subject of this "Primer on Entropy". For
that subject you will have to consult books like "Order out of Chaos"
by Ilya Prigogine and Isabelle Stengers or "The Self-Organising
Universe" by Erich Jantsch. These books concern self-organisation in
the physical universe. I do hope that this primer will give you enough
background to study such books with more confidence. I also do hope
that you will experience this last section "Entropy production in
general for physical world" as part of the "Primer on Entropy" and not
as something beyond the comprehension of ordinary people.

Cross inductions or Onsager complexity
------------------------------------------------
An "entropic force" like a pressure difference will normally give
rise to its own "entropic flux", namely a change in volume. The
pressure (intensive) and the volume (extensive) form the two factors
of the "expansion form" of energy. But the pressure difference may also
lead to a "chemical form" of energy as in chromatography or an
"electrical form" as in piezo crystals. In other words, an entropic
force may not only give rise to its corresponding flux, but to many
other fluxes in a dazzling variety. Such a proliferation of fluxes in
a complex system is described by the so-called Onsager reciprocal
relationships. It simply means that one cause can result in many
effects.

The Onsager cross inductions are the main reason why it is so difficult to
perceive causality in complex systems. One cause, many effects -- which
one will it be? The answer is important. Not merely one of them is the
effect, but all of them, the whole spectrum! Take extreme caution not to
set up a
"dialectical dual" between the normal flux and any one of the cross
induced fluxes.

The more complex a system becomes, the greater the likelihood of Onsager
cross inductions. Plan an antecedent A (or cause) with an effect B and
suddenly a non-linear Pandora's box of effects B1, B2, B3, .... opens up.
It is these
Onsager cross inductions which make life so seemingly unpredictable and
give managers ulcers. But it is also these very Onsager cross inductions
which are at the heart of the majority of modern day technologies.

Entropy production in general for spiritual world
-----------------------------------------------------------
This is where the author of this primer gets into the picture.

During 1982-1983 I discovered empirically that entropy production
also happens in the abstract world of the mind. It helped me to see many
things in a new perspective. But it also made me realise how important it
is for you to have a "Primer on Entropy", even though the primer is merely
concerned with the physical world. To bring in the spiritual world is far
beyond the scope of this primer. But if you look in the physical world
with all your powers of perception, you will begin to visualise by
emergences how it mirrors the spiritual world. May neither materialism nor
spiritualism be operating, but a harmony between these two worlds.

Best wishes

-- 

At de Lange <amdelange@gold.up.ac.za>
Snailmail: A M de Lange
Gold Fields Computer Centre
Faculty of Science - University of Pretoria
Pretoria 0001 - Rep of South Africa

Learning-org -- Hosted by Rick Karash <rkarash@karash.com> Public Dialog on Learning Organizations -- <http://www.learning-org.com>