Dear Organlearners,
Ray Harrell <mcore@idt.net> writes:
>>Therefore two house or cheese are halfed to one hou or chee.
>Unless of course half is not less but more (as in "less is more")
>with a change of vowel from (u) to (i) in which case it would be
>a change like heese (as in quantum geese) or an addition like
>one cheese halfed would become two chiafim. That is if the
>cheese is truly angelic.
Greetings Ray,
Your contributions have been scarce of late. Shame on you. I have enjoyed them because they are rich in "otherness", for one thing.
Anyway, this "halving" has a long track record which you might not have
followed. I will try to give a short summary. Then I will ask you to do
something which is very important to me.
Early this century, after enough physical quantities had been discovered to paint a complex picture, scientists began to notice the following peculiarity. When a system is scaled (increased or decreased in size), some quantities scale while others remain the same. The scalable quantities are called extensive quantities and the invariant quantities are called intensive quantities. Electrical potential (measured in volt) is intensive while electrical charge (measured in ampere-second, i.e. coulomb) is extensive.
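To make the distinction concrete, here is a minimal sketch in Python (my own illustration, with made-up cell values): think of identical battery cells connected in parallel. Doubling the number of cells doubles the stored charge, but the voltage stays put.

  # A minimal sketch (illustrative values): n identical cells in parallel.
  # Voltage is intensive (invariant under scaling); charge is extensive
  # (it scales with the system).
  def scaled_system(n_cells, cell_voltage=1.5, cell_charge=5000.0):
      """Return (voltage in volt, total charge in coulomb)."""
      return cell_voltage, n_cells * cell_charge

  v1, q1 = scaled_system(1)
  v2, q2 = scaled_system(2)  # scale the system up by a factor of two
  assert v2 == v1            # intensive: unchanged
  assert q2 == 2 * q1        # extensive: doubled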
Quantum mechanics is nothing other than a study of differential equations. Many years before the birth of quantum mechanics or the discovery of the extensive/intensive categorisation of physical quantities, the mathematician Euler noted that some scalable functions in differential equations have the property that their first order differentials are invariant to the scaling. They became known as Euler functions.
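In modern notation (a small aside of my own), an Euler function of degree one satisfies

  f(\lambda x_1, \ldots, \lambda x_n) = \lambda \, f(x_1, \ldots, x_n)

Differentiating with respect to \lambda and setting \lambda = 1 gives Euler's theorem,

  f = \sum_i x_i \, \frac{\partial f}{\partial x_i}

while each first order derivative \partial f / \partial x_i is itself of degree zero, that is, invariant under the scaling.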
However, except for a possible connection with Euler functions, physicists
did not know what to make of this extensive/intensive categorisation of
physical quantities. It appeared to be an enigma because it did not seem
to relate to any of the known laws of physics.
Then engineers began to run into serious design problems. The known laws of physics were either too complicated or too restricted to help them when designing for conditions far from equilibrium (at the edge of chaos). A typical example is the design of a jet plane which has to fly faster and faster until it even crosses the sound barrier. They began to build small models to test and improve on their designs. But as soon as they scaled those models up to the actual requirements, the full-size models failed. Bridgman began to pave the way to a solution in terms of what he called dimensional analysis. The idea was to combine related physical quantities in such a manner that they produced unitless numbers. (The Reynolds number, which Leo Minning wrote about some time ago, is one such number.) The scaling had to take these dimensionless numbers into account. It was noticed that the dimensionless numbers were intensive.
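As an example of how such a number works, the Reynolds number combines the fluid density \rho, the flow speed v, a characteristic length L and the viscosity \mu so that every unit cancels:

  Re = \frac{\rho v L}{\mu}

Halve the model (L \to L/2) and double the flow speed (v \to 2v) in the same fluid, and Re is unchanged, so the scaled-down test can still speak for the full-size design.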
Meanwhile, thermodynamicists began to notice that it was possible to formulate thermodynamics in terms of Euler functions. Yet the deep nature of the extensive/intensive categorisation of physical quantities remained a mystery.
A few years after WWII Prigogine managed to derive the equation for entropy production, beginning with the order relation of Gibbs. Suddenly the picture became much clearer. Extensive and intensive quantities appeared in pairs. Each pair pointed to a particular form of energy. The difference in the intensive quantity of a pair gives rise to an entropic force. The flowing change in the extensive quantity of a pair gives rise to an entropic flux. Entropy is produced by means of force-flux pairs. In other words, the extensive/intensive categorisation of physical quantities is directly related to one of the two great laws of physics, namely the Law of Entropy Production. The other one is the Law of Energy Conservation.
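In symbols, Prigogine's result can be written as

  \frac{d_i S}{dt} = \sum_\alpha X_\alpha \, J_\alpha \ge 0

where each entropic force X_\alpha is a difference in an intensive quantity and each entropic flux J_\alpha is the flow of the paired extensive quantity. For heat conduction, for instance, the force is the difference 1/T_2 - 1/T_1 between the inverse temperatures of two regions, and the flux is the rate at which heat flows between them.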
Entropy production is not unbounded; it operates between two limits. At the lower end of the scale the limit is zero. When no entropy is produced, a system is in dynamical equilibrium. No irreversible changes occur. The "dynamical" means that inside the system minute changes may occur, but they are immediately rectified by a reversed change. Millions of physicists, chemists, geologists and biologists work every day with such dynamical equilibria, and did so even long before WWII.
But one of the remarkable insights of Prigogine was to realise many years
later that there is also a limit at the upper end of the scale. Entropy
production cannot be increased indefinitely. Every system has the property
that when the entropy production is increased fast and high enough, it
reaches the edge of chaos where the system becomes unstable in its present
structures. Push the entropy production a little higher and some of the
present structures of the system, if not all of them, give way. New
structures begin to appear. Some may be more complex and higher ordered
while others may be less complex and lower ordered. Thus the term
bifurcation was used to denote this dual forking of the outcome.
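A toy illustration in Python (my own, and not Prigogine's thermodynamics -- the logistic map is simply the most familiar system that forks in this way): as the driving parameter is pushed higher, the single stable outcome gives way and the possibilities fork into two, then four, and so on.

  # Toy illustration: the logistic map x -> r*x*(1-x).
  # Pushing the driving parameter r higher makes the old structure
  # give way, forking the long-run outcome -- a bifurcation.
  def attractor(r, x=0.5, warmup=1000, keep=8):
      """Iterate past the transient, then collect the values the orbit visits."""
      for _ in range(warmup):
          x = r * x * (1 - x)
      visited = set()
      for _ in range(keep):
          x = r * x * (1 - x)
          visited.add(round(x, 6))
      return sorted(visited)

  print(attractor(2.8))  # one value: a single stable structure
  print(attractor(3.2))  # two values: the first fork (bifurcation)
  print(attractor(3.5))  # four values: forking again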
It is now becoming clear that every living organism follows a meandering
course between these two limits or asymptotes of entropy production. The
course is not the simple harmonic oscillation of a pendulum. It is rather
like the meandering of a river, or the fractal edge of a coastline or a snow crystal. Apart from moving between these two limits, these courses are telling us that there are many things which are still not clear to us. The more we study them, the more our intellects are struck dumb by the complexity unfolding itself before us. But also the more our hearts are
telling us -- this is the music of life.
OK, so what has all this to do with Ray Harrell?
Ray, think of that simple harmonic oscillation of a pendulum as the metronome of the musician. Actually, they are almost one and the same thing. The difference is that the metronome is an advanced pendulum whose period of oscillation can be adjusted -- largo, allegretto, etc.
The meandering course of living organisms is like the course of music
which the musician has to steer. For example, think of rhythm, one of the qualities of music. The lower limit (equilibrium) is regular beats, one at each bar with nothing in between. A dreadful order. The upper limit is beats with no pattern between them. A dreadful chaos. (To experience what I mean by complete chaos, listen to the noise made by a Geiger counter which measures radioactivity.) The musicians (composer and performer) have to steer a course between these two extremes of rhythm. The same for
every other quality of music.
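To hear the two limits, here is a small Python sketch (my own illustration): the lower limit is a metronomically regular pulse; the upper limit is the patternless clicking of a Geiger counter, whose intervals arrive at random.

  # The two rhythmic limits, as beat times in seconds.
  import random

  def regular_beats(n, interval=1.0):
      """Dreadful order: every beat exactly one interval apart."""
      return [i * interval for i in range(n)]

  def geiger_beats(n, mean_interval=1.0):
      """Dreadful chaos: intervals drawn at random, with no pattern."""
      t, beats = 0.0, []
      for _ in range(n):
          t += random.expovariate(1.0 / mean_interval)
          beats.append(t)
      return beats

  print(regular_beats(5))  # [0.0, 1.0, 2.0, 3.0, 4.0]
  print(geiger_beats(5))   # e.g. [0.31, 0.42, 2.78, 2.95, 4.60]

Music lives in the meandering territory between these two lists.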
Ray, now think of some composers who actually tried to use music, rather
than a natural language (as I did above) or the symbolic language of a theoretical scientist (which we cannot represent on screen unless we have
an advanced word processor), to describe some of the things which I wrote
about above. Can you think of any person and offer examples? Let me help
you a bit.
Think about the meandering course between the two limits of entropy
production. Think about Beethoven and his nine symphonies. The even-numbered ones approach the lower limit; the odd-numbered ones approach the upper limit. Think of the fabulous bifurcation in the ninth leading to
an extraordinary constructive emergence -- the human voice becomes part of
the symphonic organisation. Do you see any parallels?
Think about the entropy producing force-flux pairs. The entropic forces depend on intensive (non-scalable) patterns. Think how carefully Beethoven worked them into many of his compositions (like the Hammerklavier sonata), how he showed that they are invariant (recurring), how he uses differences between them to create immense tensions, etc. Think of how he arranges the flow of extensive (scalable) patterns between them, how he scales them until the heart wants to burst with symbiosis.
What I am going to ask you, is not to comment on the parallels which I
have drawn in the last two paragraphs. I suspect that you will need a lot
of thinking time -- "dwell time" as Terry Preibe likes to call it. You
will have to think of other composers also, for example Rachmaninoff with
his musical comments on emergences and immergences. And obviously, you
will have to make sure which composers had these "fundamental patterns of
organisation" in mind. Fortunately, humankind does not consist of a bunch
of "fundamentalists" alone. They are but few in number. And above all, do
not make Beethoven or Rachmaninoff out as "fundamentalists" -- their music
is far more complex than a mere excercise in "fundamentalism".
You can even respond immediately that all of the above is nonsense, that there are no correspondences which you are aware of, and that you consider the ensuing obfuscation too dangerous a risk to take. You are really free to
do as you wish.
But there is one thing which I want to ask you. I think it is much easier to do because you already have much experience of it. It will also help me much in my own thinking. This request has to do with the topic name of this contribution, "Scaling - Extensive and Intensive". You have much experience in the scaling of orchestras, making them bigger and smaller. There are many reasons for this, some of which may have driven you up the wall.
The scaling of an orchestra is not merely a case of multiplying or dividing every kind of player by the same factor. For example, consider
the conductor. When the orchestra has to be downsized (say halved), you
cannot downsize the conductor by cutting him in half or finding one with
half the capabilities. Likewise, when the orchestra has to be enlarged
(say doubled), you cannot appoint two conductors or one with twice the
capabilities.
What I want to know, is what "things" have to be left unchanged when
scaling an orchestra and what things have to be changed (sometimes with
more and sometimes with less than the scaling factor). I would also
appreciate it very much if you could pair up some of these "unchanged" and
"changed" things where you see such a possibility, especially trying to
motivate why you perceive such a pairing. That motivation is not to fend off
criticism, but to help me to see what "form is behind" such a pairing.
Rick, since he is our host and has to make sure that contributions related to LOs get distributed, will want to know why I ask this question on the scaling of the orchestra; otherwise he will not distribute my contribution or your reply. I can think of two reasons.
Firstly, you have opened my eyes to the possibility that the incidence of LOs among organisations is highest in the kind of organisations known as orchestras. When we study LOs, we have to study actual LOs and not merely organisations assumed to be LOs or which have not yet emerged into LOs. Orchestras can provide exquisite case studies. Another example: somebody wrote to me in private, asking how to teach the card game called Bridge. Four Bridge players can also emerge into an exciting LO. And while we are on games, has any one of you tried to predict the outcome of the recent World Cup soccer series by using LO knowledge? Try it in the
next series! You might be able to make a bunch of money by gambling on
your LO knowledge.
Secondly, and that is why I used the conductor as an example, all
organisations require a leader in order not to disintegrate. The leader
can be a single person, or a group of people with a leader internally
selected. The leader can be fixed to one person, or rotate among a number
of suitable candidates. However, one thing is becoming clear to me. The
leader of an organisation has to do with an intensive property of the
system. As such the leader cannot give rise to an entropic force because a
difference between two values of an intensive property is required. The
leader represents the one value of that property. What is the other value
of that property and who is/are related to it? What is that property
itself? What is the associated flux of the entropic force?
Best wishes
--
At de Lange <amdelange@gold.up.ac.za>
Snailmail: A M de Lange
Gold Fields Computer Centre
Faculty of Science - University of Pretoria
Pretoria 0001 - Rep of South Africa
Learning-org -- Hosted by Rick Karash <rkarash@karash.com> Public Dialog on Learning Organizations -- <http://www.learning-org.com>