Skip this long contribution if you are not interested in understanding the
concept entropy and how it is related to organisations.
You will increasingly experience chaos in the coming years. You will also
hear more and more about chaos. You will want to know more about chaos.
You will eventually encounter statements such as "entropy is chaos" and
"The Second Law entails that the world will end in chaos". If you accept
propositions such as these as representing the truth, then there is little
hope for you, if any.
Some of you, despite such little hope, will still desperately try to
understand more about chaos. You will read that Prigogine discovered that
all material self-organising systems are dissipative systems - a discovery
which earned him the Nobel prize. This is true. But you will also read
that Prigogine discovered how dissipative systems obviate the Second Law by
opening themselves up, since the Second Law only applies to closed systems.
This is a farce - Prigogine would never make such an error as restricting
the Second Law to closed systems. On the contrary, his main drive up to
the end of the sixties was to understand how the Second Law extends to
open systems.
Now what makes a system dissipative? A system is dissipative when it
creates entropy. It does not matter whether the system is closed or open.
A system is open when it can exchange energy and matter with its
surroundings. But if it can exchange only energy, then it is closed. Only
if it cannot exchange anything, is it isolated. The advantage of an open
system is that it can create entropy faster and also that it can
import/export entropy faster than a closed system. The reason is that
matter is a far more effective carrier of entropy than pure energy.
What then is entropy? Entropy is a physico-chemical quantity like mass,
length, volume, pressure or energy. Each physico-chemical quantity has its
own defining unit (or interrelated units) by which it can be measured and
calculated. A physico-chemical quantity cannot exist without a unit, just
as a mammal cannot exist without a heart. The quantity entropy does have
its own unit (the joule per kelvin, J/K). But you might hear about
negentropy, which supposedly obviates chaos, and thus might believe that
it is also a physico-chemical quantity. It is not, because it does not
have a unit.
Entropy (symbolised by S) is closely related to energy (symbolised by E).
Yet S and E are different quantities because they have basically different
units. However, they are closely related because both comply with universal
laws which have an extraordinary "correspondence" between them.
The universal law for energy E may be formulated as follows:
When anything changes in any material system, the energy of
the universe (sum of energies of the system and its
surroundings) never changes.
Thus this law is also known as the Law of Conservation of Energy
(LCE). It is also called the First Law of Thermodynamics because in
trying to establish or refute this law, the Second Law of
Thermodynamics was discovered.
The universal law for entropy S may be formulated as follows:
When anything changes in a system, the entropy of the universe
(sum of entropies of the system and its surroundings) always
increases.
Thus the law can also be called the Law of Production of Entropy
(LPE). However, few have done so because few realised how important
the productive feature of the law is to life. The law is usually
known as the Second Law of Thermodynamics.
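To place the two laws side by side in symbols (a compact restatement of
the two formulations above, nothing more): for any change in any material
system,

    [E(system) + E(surroundings)] stays constant ....... LCE (First Law)
    [S(system) + S(surroundings)] always increases ..... LPE (Second Law)

The first sum is conserved; the second sum can only grow. This is the
extraordinary "correspondence" between the two laws.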
Both the energy E and the entropy S have their own operational
definitions. In order to measure or calculate any physico-chemical
quantity, we need the so-called operational definition of that quantity.
The effect of the operational definition is to establish a unit for the
quantity. By now measuring the various quantities of any phenomenon as it
develops and then comparing these measurements, scientists discover
patterns between the quantities. Thus our knowledge of each of these
quantities complexifies, with the operational definition of each as the
starting point. Consequently the operational definitions of
physics and chemistry are the minimum, innate organisation that we have to
accept before we can complexify our physical and chemical knowledge.
I am not going to help you to complexify your knowledge of entropy by
beginning with its operational definition. For our purpose it would mean
wading through too much physics and chemistry while drawing too little on
your expertise and experience of human organisations. (If you want to
complexify your knowledge of entropy by beginning with its operational
definition, you are welcome to do so. Just consult a standard textbook on
physics or chemistry in which the topic of thermodynamics is treated.) I
will rather begin with a most important discovery of the previous century
by the eminent American scientist JW Gibbs.
Gibbs discovered a way to combine the entropy S and the energy E of a
system into one single quantity, called the free-energy F of the
system. The following became very clear through many studies in
especially chemistry:
The free-energy F of any system affects (has a decisive say
in) the FUTURE organisation (development) of that system.
Unfortunately, it is usually said in much enshrined terminology. For
example, the free-energy is a "state function of the system such that
the change between two states is independent of the path followed".
Let us now assume the indented sentence to be the minimum, innate
organisation on which you will complexify your knowledge of entropy. Study
that sentence closely. I am almost sure that the only part of it unknown
to you is "free-energy F". The other words like system, decision, future
and organisation will be well known to you. Thus I will assume that the
rest should be very clear to you.
I know that many of you will now object to the implication of that
sentence. Does it not amount to hubris (ignorant arrogance) to say that
there is indeed a quantity (called the free-energy F) which determines
the future organisation of the system? Is it not so that most commentators
(speculators?) on complexity say that the determinism of Newtonian mechanics,
and hence its influence on philosophy as positivism, is preventing us from
understanding complexity? Yes. But when we have to get off our nests to make
our maiden flight, we can think of even worse things to say to
express our dark fears. However, fear, unlike flying, will bring us
nowhere. Even eagles know it.
Let us consider any material system, alive or inanimate. Think of the
total energy E of the system. The "total" means that no form of energy is
left out - all forms are taken into account. Think also of the free-energy
F of the system, the energy needed to affect the future organisation of
the system. Now think of the difference (E - F) between the total energy
E and the free-energy F. What does this difference represent? If we
carefully think of what F represents, then (E - F) represents the remainder
of the total energy which CANNOT affect the future organisation of the
system. In other words, it is that part of the total energy E which is not
available for developmental purposes.
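A purely illustrative example (the figures are invented): let the total
energy of a system be E = 1000 kJ and its free-energy be F = 700 kJ. Then
700 kJ is available to affect the future organisation of the system,
while the difference (E - F) = 300 kJ is not available for such
developmental purposes.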
In the relationship which Gibbs discovered between energy E and entropy S,
the difference (E - F) plays a central role. However, at the end of the
previous century and the beginning of this century, when scientists and
engineers had to interpret this difference (E - F), the concept
organisation played a minor role in their lives. Furthermore, they were
mainly interested in how much they could get out of a system's future
changes. Thus they simply said that (E - F) represents the unavailable
energy. Unavailable for what? They did not complexify themselves so far
that they could say that it was unavailable for future organisational
purposes.
Furthermore, they did not even try to reformulate this negative statement
into a positive one. In other words, if F is necessary for future
organisation, for what is (E - F) necessary? The answer, although trivial,
is most instructive. (E - F) is necessary to maintain the present
organisation of the system. The development into the future organisation
must have as starting point the present organisation (structures and
processes) of the system. The future cannot develop from a present state
devoid of any organisation.
The fact that (E - F) is necessary to maintain the present organisation of
the system has profound ramifications. Furthermore, the more these
ramifications become evident, the more their complexity makes us fear the
other profound ramifications of (E - F). For if it is true, then why do we
know so little about (E - F)? What will become of us if we keep on
transforming present organisations without taking notice of (E - F), the
energy needed to maintain the present organisation of the system? Do we
not know from experience in business that if we keep our eyes closed to
the money (called cash flow) needed to maintain the present organisation,
we will soon be out of business? What then is this (E - F)?
Here then is the relationship between the energy E and the entropy S
discovered by Gibbs:
S = (E - F)/T
The entropy S of the system is equal to the difference (E - F)
divided by the absolute temperature T. This means that our
interpretation of entropy S will be related to our interpretations of
(E - F) as well as of T. Let us leave T aside for a moment and consider
(E - F). The term (E - F) means that the entropy S is somehow
related to maintaining the present organisation (structure and
process) of the system.
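Using the same invented figures as before: with E = 1000 kJ, F = 700 kJ
and an absolute temperature of T = 300 K, the Gibbs relationship gives

    S = (1000 kJ - 700 kJ)/300 K = 1 kJ/K

Note the unit which falls out of the division: energy divided by absolute
temperature (the joule per kelvin), which is exactly the unit belonging
to entropy mentioned earlier.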
But almost everybody is chanting that entropy is chaos - the first ever
interpretation given to entropy almost 150 years ago. Which interpretation
should we follow: "entropy is chaos" or "entropy maintains present
organisation"? The choice is yours. But if you want to understand
Prigogine's discovery that self-organising systems are dissipative, then
you will seriously have to consider the second interpretation. Why?
Well, if the present entropy S of a system is needed to maintain the
PRESENT organisation (structure and process) of a system, then that same
entropy cannot also affect the FUTURE organisation of the system. We know
that the free-energy F affects the future organisation. But is it possible
to reformulate this last sentence by using entropy rather than
free-energy? Yes. Since the present entropy S has to maintain the present
organisation of the system, some other entropy has to be created to affect
the future organisation of the system! This creation of additional entropy
has become known as "dissipation" through unfortunate events. We will use
the phrase "entropy production" rather than "entropy dissipation".
I always find it most exciting that the quantity entropy and its law
behave in this manner. Say, for argument's sake, that it is not possible
to create (produce) entropy - that the Second Law does not exist or that
it can be sidestepped - that entropy has to be conserved just like energy.
Then it would mean that we would be stuck with the present. We would also
remain victims of the past because we would have to conserve entropy
just as energy is conserved. Furthermore, the future could not be different
from the present since its entropy could not be different. What a bleak
outlook!
Before we go deeper into the production of entropy and how it affects
self-organisation, let us question the quantity T called the
thermodynamical or absolute temperature. It is not a temperature based on
measurements made by an arbitrarily designed thermometer. It is a
temperature whose zero point is suggested by all thermodynamical
systems themselves. (It is thus a phenomenological quantity.) The most
important feature of this absolute temperature is its zero point: it is
the lowest temperature possible (0 K, approximately -273 degrees Celsius).
Again, almost a century ago, long before they could even get close to it,
scientists and engineers suggested that at absolute zero all systems are
in perfect order. It is an acceptable suggestion if entropy is chaos. But
if entropy rather has to do with the present organisation (structure and
process) of the system, this suggestion becomes questionable. It is far
better to suggest that at absolute zero the organisation of the system
consists merely of structures - all processes will by then have ceased. Thus,
as the temperature increases, the processes become more important.
Let us consider the relationship between entropy production and
self-organisation. Again it was Prigogine, almost fifty years ago, who
discovered how entropy is produced. Entropy is produced by so-called
force-flux pairs. Each form of energy which changes in the system has its
own force-flux pair. The force is generated by the intensive (qualitative)
factor of that form of energy, while the flux is generated by its extensive
(scalable) factor. Hence the more complex a system, the more numerous and
varied its possible force-flux pairs.
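For readers who prefer to see the bookkeeping spelled out, here is a
minimal sketch in Python (my own illustrative addition, not Prigogine's
notation) of the standard bilinear form for the rate of entropy
production: the sum, over all active force-flux pairs, of flux times
force. The function name and the numbers are invented for illustration
only.

  # Rate of entropy production as the sum over active force-flux pairs:
  #   sigma = J1*X1 + J2*X2 + ...   (never negative, by the Second Law)
  def entropy_production_rate(pairs):
      """pairs: a list of (flux, force) tuples, one per active pair."""
      return sum(flux * force for flux, force in pairs)

  # Two active pairs with invented magnitudes:
  active_pairs = [(0.8, 0.05), (1.2, 0.02)]
  print(entropy_production_rate(active_pairs))   # about 0.064

The more pairs are active, and the larger each flux and force, the faster
entropy is produced.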
As the entropy in the system gets produced by its force-flux pairs, it has
to be transported (commuted) to the rest of the universe. Each transport
avenue has an innate resistance (transmissibility). Thus, if the production
of entropy is ACCELERATED, not all of it can be transported away fast
enough. Thus there is a build-up of produced entropy. The point of
saturation is better known as the bifurcation point. After that point,
the additional entropy gets locked up inwards by the emergence of a
new order with its own structures and processes. The organisation of the
system changes significantly. The net result is very well described by
the term "self-organisation".
Only some of all the possible force-flux pairs in a system will be
actively contributing to the entropy production. Thus each of them will have
to be very large to drive the system to its bifurcation point. Usually,
before this can happen, the system's organisation breaks up (immerges).
Thus, for the system to reach its bifurcation point more easily and
safely, as many force-flux pairs as possible have to be activated. To do
this, seven patterns in the system have to be intact. (I have named them
the seven essentialities of creativity.) If one or more of these patterns
are impaired, the emergence will not happen. The system will not
self-organise.
Up to just before the previous paragraph, we learnt that all material
self-organising systems have to be dissipative. But in the last paragraph
we learnt that this does not mean that all dissipative systems are
self-organising - indeed they are not. Those dissipative systems which are
not self-organising are destructive systems. In other words, the very
conditions which favour the self-organisation of a system also favour
the unorganisation of the system.
Oh, how confusing! It almost appears as if we are now subscribing to
indeterminacy - as if we will never know what will become of the system's
organisation - as if we will have to stop predicting the future of
organisations. The reason why it all appears so confusing is our own lack
of self-organisation and our myth that the understanding of complex things
is itself simple. What we have to try and understand is that the
conditions which favour both self-organisation and destruction merely
concern the dynamics of organisations. The sufficiencies which allow us
to distinguish between self-organisation and the destruction of
organisation, to manage self-organisation and avoid possible destructions,
are of an emergent nature and thus of a higher order. To understand these
seven sufficiencies, we will have to step beyond the edge of mental chaos.
Best wishes
--
At de Lange
Gold Fields Computer Centre for Education
University of Pretoria
Pretoria, South Africa
email: amdelange@gold.up.ac.za

Learning-org -- An Internet Dialog on Learning Organizations
For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>