Our Economic System: Badly Designed? LO19494

Tom Christoffel (tjcdsgns@shentel.net)
Wed, 7 Oct 1998 07:46:08 -0400 (EDT)

Dear Facilitators of Learning and Learning Organizations:

I found the following pertinent to the issues of system design and
learning. It gives me perspective on my own regional approach to
organization. I contacted the author and he gave permission to post it to
the LO list.

"May you live in interesting times," is reputed to be an old Chinese
curse. When we are young in years it doesn't make much sense. After 20 to
30 years of interesting times, it starts to get old and one dreams of
managing change, even as the number of actors and volume of transactions
spiral upward. Dividing to conquer and to manage - to modularize - is a
systems approach to complexity.

Tom Christoffel <tjcdsgns@shentel.net>

---------- Forwarded message ----------
Our Economic System: Badly Designed?
by Roberto Verzola*

The international financial crisis which struck Asian countries in 1997
and continues to cause widespread damage this year is a perfect example of
what systems analysts call "the side effects of global variables."

Take the most complex systems ever designed by people -- like the Apollo
spacecraft system which took men to the moon and brought them back, or
computer chips that are made of tens of millions of components, or a
complex operating system with millions of lines of code. They work
as designed because the system designers followed certain rules of design
which time and again have been proven correct.

Follow the design rules, and you get a system that is robust and reliable.
Violate the design rules, and you get a system that is unreliable and
crash-prone.

One of the most important rules that good designers will never violate is
modularization: breaking up a complex system into relatively independent
modules, which are isolated from each other except for a few well-defined
interfaces. This design rule can be found in all engineering and computer
science texts. It is true for hardware and software designs. Most complex
systems that violated this rule ended as miserable failures, while those
which tried to implement it showed much better rates of success.
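The rule can be sketched in a few lines of Python. This is a hypothetical illustration only (the module names and methods are invented): each module hides its internal state and lets other modules reach it solely through a narrow, well-defined interface.

```python
# A minimal sketch of modularization: internal state is hidden, and
# other modules interact only through a small public interface.

class PaymentModule:
    """Keeps its balance private; no other code can touch it directly."""
    def __init__(self):
        self._balance = 0  # hidden state -- not a global variable

    def deposit(self, amount):  # one of the few well-defined ways in
        if amount <= 0:
            raise ValueError("amount must be positive")
        self._balance += amount

    def balance(self):          # the only way out
        return self._balance

class ReportModule:
    """Depends on PaymentModule only through its public interface."""
    def summarize(self, payments):
        return f"current balance: {payments.balance()}"

payments = PaymentModule()
payments.deposit(100)
print(ReportModule().summarize(payments))  # current balance: 100
```

Because ReportModule never reaches into PaymentModule's internals, either module can be rewritten, tested, or replaced without tracing its effects through the rest of the system.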

The reason for the rule is simple: as the number of components in a system
increases, the number of possible pairwise interactions between components
grows quadratically, and the number of possible combined states grows
exponentially. Normally, all possible interactions must be checked for the
possibility of unintended and undesirable results, called "side effects."
But beyond a certain number of components, it becomes impossible to
double-check or even to trace the results of every possible interaction.
Because these potentially undesirable side effects increase at a faster
rate than the number of components, they eventually bring the whole system
crashing down.
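The arithmetic behind this is easy to check. For n components there are n(n-1)/2 possible pairwise interactions, so the count grows far faster than the component count itself:

```python
# Pairwise interactions among n components: n * (n - 1) / 2.
# Every one of these is a potential channel for side effects.

def pairwise_interactions(n):
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n} components: {pairwise_interactions(n)} pairwise interactions")
# 10 components: 45 pairwise interactions
# 100 components: 4950 pairwise interactions
# 1000 components: 499500 pairwise interactions
```

A hundredfold increase in components yields roughly a ten-thousandfold increase in pairwise interactions, which is why exhaustive checking becomes impossible past a certain size.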

Designers had earlier argued against modularization because it was
"inefficient." Modular designs tended to use more components; a lot of
thought and effort had to go into the interfaces between modules; some
level of redundancy was required among the modules. But what was lost in
efficiency was gained in reliability. Modular designs failed less often
(the average time between failures is a standard measure of system
reliability); and when they failed, errors were corrected faster.

The history of systems design is replete with crashed spacecraft and
crashed computer operating systems that drove home the point: complex
systems must be broken up into smaller, more manageable, independent
modules; otherwise, you get an unreliable, failure-prone, or unworkable
design.

The opposite of modularization is globalization. It is true: that favorite
word of World Bank and IMF economists is an absolute no-no among systems
designers. Open any respectable textbook on computer science or system
design, and one of the first design rules you are going to meet is: avoid
anything that affects the entire system globally. Break up large systems
into smaller modules. Protect your modules from interference by other
modules. Isolate your modules from each other. Hide information. Build
firewalls.

Most of all, avoid global variables.
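Why are global variables singled out? A toy Python example (the function and variable names are invented) makes the hazard concrete: once state is global, any function can mutate it, so distant parts of the program interact invisibly.

```python
# The hazard of a global variable: distant code can change it, and every
# function that reads it silently depends on who ran last.

rate = 0.05  # global variable, visible to everything below

def risky_update():
    global rate
    rate = 0.50              # a "side effect" felt everywhere

def interest_global(principal):
    return principal * rate  # result depends on hidden global state

# The modular alternative: pass the value in explicitly.
def interest_local(principal, rate):
    return principal * rate  # no hidden interaction with other code

risky_update()
print(interest_global(1000))       # 500.0 -- changed by distant code
print(interest_local(1000, 0.05))  # 50.0  -- self-contained behaviour
```

The second function can be understood, tested, and reused in isolation; the first cannot, because its behaviour hinges on every other piece of code that touches the global.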

In an economic system, a global variable would be anything that can affect
many portions of a large system. Global corporations, because they operate
worldwide, are a good example. The IMF, the World Bank, and the World
Trade Organization (WTO), because they intrude into almost every economy
in the world, are also good examples. Their moves and decisions affect
many other economies in the world, producing consequences and interactions
so numerous that it becomes impossible to anticipate and correct for
undesirable side effects. These side effects
then proliferate; eventually, they can bring the whole system down.

Unfortunately, most economists appear to have little understanding of
system design. (When I was in college, many of those who failed our
engineering subjects shifted to economics.) Instead of following good
principles of design, our economists repeat the most common mistake of
amateur programmers: they rely on global variables.

Instead of building protective firewalls around our economy, they tear
down existing walls of protection. Instead of strictly regulating those
global variables that breach the walls that remain, they launch a perverse
program of "deregulation," enlarging instead of restricting the impact of
global variables. All those legal infrastructures which in the past
protected us from the side effects of global variables -- such as
protective tariffs, foreign exchange controls, regulatory mechanisms and
others which would have dampened the impact of global side-effects on our
economy -- are being torn down.

Instead of blocking IMF, WTO and World Bank interference, they kneel and
bow before them. Instead of relying on local variables and local
interactions, which are manageable locally, they put greater reliance on
global markets and global players, touting this reliance as "sound
economic fundamentals."

It is interesting that neo-liberal economic theory conflicts with systems
theory, though both of them claim to be a science. Real science, however,
anticipates reality better than pseudo-science. Looking at the current
global financial crisis, it should be obvious which is which.

Until we learn the basic lessons of systems design, and apply these to our
own economy, we will be saddled with an unreliable, crash-prone economic
system, one which will cause us endless suffering.

There is another lesson we can learn from successful designs of the past.
If a system is badly designed and suffers from too many global variables,
any attempt at modification will likely produce even more unintended side
effects. Often, it is better to junk the misdesigned system altogether and
to start again from scratch.

Saddled as we are with a system that embraces globalization and leaves us
at the mercy of its side effects, this is perhaps what we should do as
well.

* Roberto Verzola <rverzola@phil.gn.apc.org> is an engineer who
specializes in computers. Funded by the Philippine government, he designed
a computer system in 1981, the first Filipino to do so. He also designed
the software for the first online systems used at the Philippine Senate and
House of Representatives in 1991. He is also an activist, and is the
coordinator of Interdoc, a loose international network of NGOs tracking
the social impacts of new information technologies. 02 Oct 98 18:14:58

-- 
Thomas J. (Tom) Christoffel, AICP * e-mail: tjcdsgns@shentel.net
Planner & Futurist - My mission: "Regions_Work_by_networking!" 
Why?  "Production is local; markets are regional; the economy is global. Two
or more crossing boundaries to produce a solution is the basis of regional
community and regional cooperation."
*TJCdesigns * Box 1444 * Front Royal, Virginia (VA) 22630-1444 *
 "True peace is dynamic. For sustainability, design with re-use in mind." 

Learning-org -- Hosted by Rick Karash <rkarash@karash.com> Public Dialog on Learning Organizations -- <http://www.learning-org.com>