Summary
Charlie Munger has the classic line “All I want to know is where I’m going to die so I’ll never go there”.
Systemantics is in that vein, with the equally important, if less catchy, angle of “All I want to know is how complex systems fail so I can avoid them.”
Part of the reason this is less catchy is that the term “systems” just doesn’t feel very personal or applicable. No one wakes up in the morning and thinks “wow, I’m really worried about systems failure today.”
It’s important to recognize, then, that “systems” is used here in a very broad way. Gall points out:
- Everything is a system.
- Everything is part of a larger system.
- The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).
- All systems are infinitely complex.
You are part of a system that is part of a business, country, planet, solar system, galaxy and universe.
A general understanding of systems is really important because pretty much everything we care about as humans can be viewed as a system: our health, our families, our careers, and our countries. These are the things we wake up in the morning worrying about, and studying how complex systems operate is a good framework for thinking about all of them.
There are a few key points that Gall makes in his book.
One is that not only do systems expand well beyond their original goals, but, as they evolve, they tend to oppose even their own original goals. Indeed, “Systems tend to oppose their own proper function”.
The end result is that systems rarely do what they say they do and people in systems rarely do what their title says they do.
In George Orwell’s dystopian novel 1984, the Ministry of Peace is about promoting war and the Ministry of Love is about controlling the population. The Ministry of Love building has no windows and is surrounded by barbed wire entanglements, steel doors, and hidden machine gun nests. It’s funny because we recognize it’s true.
It’s not entirely clear what the system known as the Centers for Disease Control actually does, but one can hardly argue, in the wake of COVID-19, that it actually controls diseases.
This is not me ragging on the people that work for the CDC. Indeed, it’s quite the opposite. Complex systems can produce an outcome very different from what anyone inside them intends or wants.
The Congressional disapproval rating from Americans has been hovering around 70-80% for years. Basically, no one likes Congress, no matter which party is in control. How can it be that the body elected by Americans is disapproved of by Americans? Why wouldn’t they just elect people they like better?
Well, in fact, they do. People generally approve of their congressperson. It’s the system of elected officials known as Congress of which people disapprove.
Systems are other than the sum of their parts. Just as a collection of Congresspeople that are supported by their constituents can produce a Congress that no one likes, many systems in general “work poorly or not at all” for reasons that are neither obvious nor intended.
As the more popular Murphy’s Law goes, “if anything can go wrong, it will.” This is not just an offhand, pessimistic remark, but a point that as systems become more complex, there are more ways for them to fail and the failures tend to be more unexpected.
A simple system, like say a can opener, tends to fail in rather predictable ways: the blade becomes too dull or the gears don’t turn properly. A complex system, like the global economy, tends to fail in new and unexpected ways, say a supply chain breakdown following a global pandemic or a financial crisis following a mortgage bond structuring problem.
The general reaction to these problems is usually some version of “well, let’s just re-organize the complex system to work better.”
This is a version of The Maginot Line Problem. The French redesigned their defensive system after the First World War to prevent another German invasion, and yet the German invasion that launched World War II was far more effective than the prior one.
This approach always seems like a reasonable response. Every economic crash results in reforming the system and, yet, they keep on coming.
Humans tend to vastly overestimate our ability to design effective complex systems and it is this hubris that tends to get us in trouble, creating an illegible margin.
It is extremely difficult (read: effectively impossible) to design a complex system from scratch and have it function well. The best complex systems start as very simple systems and evolve from there.
Gall shows this in his list of elementary systems principles:
- A complex system cannot be “made” to work. It either works or it doesn’t.
- A simple system, designed from scratch, sometimes works.
- Some complex systems actually work.
- A complex system that works is invariably found to have evolved from a simple system that works.
- A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
The lesson from Gall, then, is that we should not focus on top-down reform of complex systems, but take a more bottom-up approach. We should start with a working simple system and let it build up over time.
Perhaps the best example of this is how seemingly small and inconsequential startups routinely overcome large incumbents.
Though it seems like they could shift, incumbents with the baggage of a built-up complex system cannot reform effectively. As Clayton Christensen showed in his book The Innovator’s Dilemma, the popular explanation for this, incompetent management, is completely wrong.
In fact, the problem is that managers are extremely competent! They are just operating in an established and complex system with many competing parts that make it very hard to adapt.
What seems like the harder and longer path, starting with a new, simple system and evolving it over time, is in fact, the easier and shorter path.
The key lesson across all domains is to not focus on top-down solutions, but keep things simple and robust, giving up short-term efficiency for long-term resilience, ultimately creating a more ergodic outcome.
My Notes and Highlights
- Charles Darwin made it a rule to write down immediately any observation or argument that seemed to run counter to his theories. He had noticed that we humans tend to forget inconvenient facts, and if special notice is not taken of them, they simply fade out of awareness. Therefore, urged Darwin: CHERISH YOUR EXCEPTIONS Along similar lines, we propose: CHERISH YOUR SYSTEM-FAILURES.
- THE MOST EFFECTIVE APPROACH TO COPING IS TO LEARN THE BASIC LAWS OF SYSTEMS-BEHAVIOR As Virginia Satir has aptly reminded us: PROBLEMS ARE NOT THE PROBLEM; COPING IS THE PROBLEM.
- Stated as succinctly as possible: the fundamental problem does not lie in any particular System but rather in Systems As Such (Das System an und fuer sich). Salvation, if it is attainable at all, even partially, is to be sought in a deeper understanding of the ways of all Systems, not simply in a criticism of the errors of a particular System.
- THINGS (THINGS GENERALLY/ALL THINGS/THE WHOLE WORKS) ARE INDEED NOT WORKING VERY WELL. IN FACT, THEY NEVER DID In more formal terminology: SYSTEMS IN GENERAL WORK POORLY OR NOT AT ALL More technically stated: COMPLICATED SYSTEMS SELDOM EXCEED FIVE PERCENT EFFICIENCY.
- ANERGY. ANERGY-STATE. Any state or condition of the Universe, or of any portion of it, that requires the expenditure of human effort or ingenuity to bring it into line with human desires, needs, or pleasures is defined as an ANERGY-STATE. Anergy is measured in units of effort required to bring about the desired change. We are now in a position to state a Theorem of sweeping generality: THE TOTAL AMOUNT OF ANERGY IN THE UNIVERSE IS CONSTANT.
This Theorem is known, naturally, as the Law of Conservation of Anergy. We offer without proof the following Corollary: SYSTEMS OPERATE BY REDISTRIBUTING ANERGY INTO DIFFERENT FORMS AND INTO ACCUMULATIONS OF DIFFERENT SIZES.
- THE SYSTEM ITSELF (DAS SYSTEM AN UND FUER SICH) TENDS TO GROW AT 5-6% PER ANNUM Again, this Law is but a preliminary to the most general possible formulation, the Big-Bang Theorem of Systems Cosmology: SYSTEMS TEND TO EXPAND TO FILL THE KNOWN UNIVERSE.
- COMPLEX SYSTEMS EXHIBIT UNEXPECTED BEHAVIOR One is merely a pessimistic feeling; the other conveys the exhilaration that accompanies recognition of a Law of Nature. Because of its fundamental importance for all that follows, we have termed this Law the Generalized Uncertainty Principle.
- Note: they are by their nature unpredictable so don’t try to predict them.
- Most people would like to think of themselves as anticipating all contingencies. But this is a yearning, not an accomplished fact.
- Note: give up on thinking you can predict everything
- No matter what else it does, a System will act like a System. We are accustomed to thinking that a System acts like a machine and that if we only knew its mechanism, we could understand, even predict, its behavior. This is wrong.
The correct orientation is: A MACHINE ACTS LIKE A SYSTEM —and if the machine is large and complex enough, it will act like a large System. We simply have our metaphors backwards. With this deep insight tucked under our belt, or otherwise stashed, we can understand the philosophical error that has plagued astronomers and cosmologists since the eighteenth century: THE UNIVERSE IS NOT LIKE A MACHINE except in certain trivial ways. Rather: THE UNIVERSE IS LIKE A VERY LARGE SYSTEM.
- A LARGE SYSTEM, PRODUCED BY EXPANDING THE DIMENSIONS OF A SMALLER SYSTEM, DOES NOT BEHAVE LIKE THE SMALLER SYSTEM
- Note: rough rule of thumb, every time you 3x, then all the rules change and you have to kind of start from scratch.
- THE SYSTEM ALWAYS KICKS BACK SYSTEMS GET IN THE WAY —or, in slightly more elegant language: SYSTEMS TEND TO OPPOSE THEIR OWN PROPER FUNCTIONS.
- SYSTEMS TEND TO MALFUNCTION CONSPICUOUSLY JUST AFTER THEIR GREATEST TRIUMPH Toynbee explains this effect by pointing out the strong tendency to apply a previously-successful strategy to the new challenge: THE ARMY IS NOW FULLY PREPARED TO FIGHT THE PREVIOUS WAR For brevity, we shall in future refer to this Axiom as Fully Prepared for the Past (F.P.F.P.).
- PERFECTION OF PLANNING IS A SYMPTOM OF DECAY.
- The field of Architecture has given rise to a second major principle relating to the Life Cycle of Systems. This principle has emerged from the observation that temporary buildings erected to house Navy personnel in World War I continued to see yeoman service in World War II as well as in subsequent ventures, and are now a permanent, if fading, feature of Constitution Avenue in Washington, D.C. The construction of the Pentagon, a few short miles away across the Potomac River, never even threatened their tenure. We conclude: A TEMPORARY PATCH WILL VERY LIKELY BE PERMANENT. Since Systems generally Don’t Go Away, and since they occupy space, our landscape is now littered with the bleached bones and rotting carcasses of old attempted solutions to our problems: THE OLD SYSTEM IS NOW THE NEW PROBLEM or, as even more poignantly formulated by Parkinson: THE GHOST OF THE OLD SYSTEM CONTINUES TO HAUNT THE NEW.
- Note: careful what systems you set up in the first place.
- PEOPLE IN SYSTEMS DO NOT DO WHAT THE SYSTEM SAYS THEY ARE DOING.
- In general, the larger and more complex the System, the less the resemblance between a particular function and the name it bears. For brevity, we shall refer to this paradox as FUNCTIONARY’S FALSITY.
- What is the real-world function of a University Scholar? Answer: University Scholars are supposed to think and study deeply on basic intellectual problems of their own choosing. In fact they must teach assigned courses, do “research” on problems for which research money is available, and publish, or perish.
- Note: get clear on what the system does before you jump in.
- OPERATIONAL FALLACY in all its austere grandeur. Just as people in Systems do not do what the System says they are doing, so also: THE SYSTEM ITSELF DOES NOT DO WHAT IT SAYS IT IS DOING In slightly greater detail: The function performed by a System is not operationally identical to the function of the same name performed by a person. In general, a function performed by a larger System is not operationally identical to the function of the same name as performed by a smaller System.
- THE FUNCTION (OR PRODUCT) IS DEFINED BY THE SYSTEMS- OPERATIONS THAT OCCUR IN ITS PERFORMANCE OR MANUFACTURE.
- In Zen, there is a saying, “If you meet the Buddha on the road, kill him!” After the initial shock, one understands that someone who claims to have achieved Buddha-hood obviously has not done so. The claim refutes itself. We have already noted that a supermarket Apple is not like the Apple we had in mind, that what comes out of a Coffee-vending machine is not Coffee as we once knew it, and that a person who takes a course in Leadership Training is acting out a behavioral pattern better described as Following rather than Leading. We summarize in the Naming Fallacy: THE NAME IS MOST EMPHATICALLY NOT THE THING.
- The power of the Naming Effect should not be underestimated. It is literally the power to bring new “realities” into existence. Because of this power the various wars on “crime,” “poverty,” “addiction,” and the like not only continue in a state of chronic failure: they are doomed to be waged forever, so long as they continue to be framed in those terms. They perpetuate that which they have named. We are indebted to Mr. George Orwell for bringing home to us the real functions of a Ministry of Truth or a Ministry of Peace or of Justice; but Orwell was only updating an insight first enunciated by Lao-tzu at the dawn of written history, when he wrote: People, through finding something ‘beautiful,’ think something else ‘unbeautiful’ Hard experience has taught us the meaning of Orwellian Ministries of this and that; but the delicate nuance of Lao-Tzu still escapes us, until we realize that the thing called Education is not really education, just as the supermarket Apple is not the apple we once knew and loved. In that moment we escape from the Naming Delusion.
- Our study of the Operational Fallacy has made clear how and why it is that (1) large Systems really do not do what they purport to do and that (2) people in large Systems are not actually performing the function ascribed to them.
- Fundamental Law of Administrative Workings (F.L.A.W.): THINGS ARE WHAT THEY ARE REPORTED TO BE The observant reader has doubtless already noted various alternative formulations of this Axiom, all to the same general effect, for example: THE REAL WORLD IS WHAT IS REPORTED TO THE SYSTEM —or, in the world of Diplomacy: IF IT ISN’T OFFICIAL, IT HASN’T HAPPENED Amongst television journalists this Axiom takes the following form: IF IT DIDN’T HAPPEN ON CAMERA, IT DIDN’T HAPPEN.
- TO THOSE WITHIN A SYSTEM, THE OUTSIDE REALITY TENDS TO PALE AND DISAPPEAR.
- DESIGNERS OF SYSTEMS TEND TO DESIGN WAYS FOR THEMSELVES TO BYPASS THE SYSTEM We pause only briefly to validate it with a well-known example from Government: Example: The Congress of the United States specifically exempts itself from the Civil Rights Act, the Equal Pay Act, the Fair Labor Standards Act, the National Labor Relations Act, the Occupational Safety and Health Act, the Freedom of Information Act, and the Privacy Act. At a slightly less exalted level, we note that even so altruistic a person as Alex. G. Bell, having invented the telephone, rather promptly retired to a lonely Atlantic island where no one could ring him up. Having thoroughly digested this introduction, we should have no trouble understanding that: IF A SYSTEM CAN BE EXPLOITED, IT WILL BE Nor will we cavil at its twin: ANY SYSTEM CAN BE EXPLOITED.
- EVEN TRYING TO BE HELPFUL IS A DELICATE AND DANGEROUS UNDERTAKING.
- Note: you often can’t reform a broken system. you have to scrap it.
- A COMPLEX SYSTEM THAT WORKS IS INVARIABLY FOUND TO HAVE EVOLVED FROM A SIMPLE SYSTEM THAT WORKED The parallel proposition also appears to be true: A COMPLEX SYSTEM DESIGNED FROM SCRATCH NEVER WORKS AND CANNOT BE MADE TO WORK. YOU HAVE TO START OVER, BEGINNING WITH A WORKING SIMPLE SYSTEM.
- Less technically stated: WHATEVER THE SYSTEM HAS DONE BEFORE, YOU CAN BE SURE IT WILL DO IT AGAIN.
- SYSTEMS DON’T WORK FOR YOU OR FOR ME. THEY WORK FOR THEIR OWN GOALS.
- Success is largely a matter of Avoiding the Most Likely Ways to Fail, and since every Bug advances us significantly along that path, we may hearken back to the advice given in the Preface and urge the following Policy: CHERISH YOUR BUGS. STUDY THEM.
- THE INFORMATION YOU HAVE IS NOT THE INFORMATION YOU WANT. THE INFORMATION YOU WANT IS NOT THE INFORMATION YOU NEED. THE INFORMATION YOU NEED IS NOT THE INFORMATION YOU CAN OBTAIN.
- IN A CLOSED SYSTEM, INFORMATION TENDS TO DECREASE AND HALLUCINATION TENDS TO INCREASE.
- The person (or System) who has a problem and doesn’t realize it has two problems, the problem itself and the meta-problem of Unawareness: IF YOU’RE NOT AWARE THAT YOU HAVE A PROBLEM, HOW CAN YOU CALL FOR HELP?
- Problem Avoidance is in fact the most elegant form of Problem-solving, since it actively and responsibly avoids the entire Meta-problem of Dealing With the Problem.
- The decision to become involved with a particular System should be made carefully, on the basis of a balanced judgment of one’s interests. One need not drift (or sail, or barge) into Systems uncritically: CHOOSE YOUR SYSTEMS WITH CARE Remember: DESTINY IS LARGELY A SET OF UNQUESTIONED ASSUMPTIONS.
- IF AT FIRST YOU DON’T SUCCEED, TRY, TRY AGAIN —a dangerous, two-edged precept which, if wrongly understood, can become the basis for a lifetime career of Struggling-and-Failing. More in line with the spirit of the Creative Tack is the newer admonition: IF SOMETHING ISN’T WORKING, DON’T KEEP DOING IT. DO SOMETHING ELSE INSTEAD.
- ALMOST ANYTHING IS EASIER TO GET INTO THAN OUT OF More specifically: TAKING IT DOWN IS OFTEN MORE TEDIOUS THAN SETTING IT UP.
- SYSTEMS RUN BEST WHEN DESIGNED TO RUN DOWNHILL More briefly formulated: AVOID UPHILL CONFIGURATIONS —or, in the vernacular: GO WITH THE FLOW In human terms, this means working with human tendencies rather than against them.
- LOOSE SYSTEMS LAST LONGER AND FUNCTION BETTER Since most of modern life is lived in the interstices of large systems, it is of practical importance to note that LOOSE SYSTEMS HAVE LARGER INTERSTICES and are therefore generally somewhat less hostile to human life forms than tighter Systems. As an example of a System attuned to the principles of Systems-design enunciated thus far, consider the System of the Family. The Family has been around for a long time. Our close primate relatives, the gorillas, form family units consisting of husband and wife and one or more offspring. As Jane Goodall has shown, gorillas take naps after meals. (Every day is Sunday for large primates.) The youngsters wake up too soon, get bored and start monkeying around the nest. Father gorilla eventually wakes up, leans on one elbow, and fixes the errant youngster with a penetrating stare that speaks louder than words. The offending juvenile thereupon stops his irritating hyperactivity, at least for a few minutes. Clearly, this is a functioning family System. Its immense survival power is obvious. It has weathered vicissitudes compared to which the stresses of our own day are trivial. And what are the sources of its strength? In brief: extreme simplicity of structure; looseness in everyday functioning; “inefficiency” in the efficiency-expert’s sense of the term; and a strong alignment with basic primate motivations.
- PLAN TO SCRAP THE FIRST SYSTEM: YOU WILL ANYWAY.
- The Potemkin Village Effect. The P.V.E. is especially pronounced in Five-Year Plans, which typically report sensational overachievement during the first four and a half years, followed by a rash of criminal trials of top officials and the announcement of a new and more ambitious Five-Year Plan, starting from a baseline somewhat lower than that of the preceding Plan, but with higher goals.
- A SYSTEM THAT IGNORES FEEDBACK HAS ALREADY BEGUN THE PROCESS OF TERMINAL INSTABILITY.
- THE FUTURE IS NO MORE PREDICTABLE NOW THAN IT WAS IN THE PAST, BUT YOU CAN AT LEAST TAKE NOTE OF TRENDS.
- “There go my people. I must find out where they are going, so I can lead them.”
- UTILIZE THE PRINCIPLE OF UTILIZATION For those to whom this formula seems unduly abstract, we offer the following Rule, drawn from the Engineering Profession: (A) IF IT’S FOR DIGGING A HOLE IT SHOULD PROBABLY LOOK SOMETHING LIKE A SHOVEL (B) IF IT LOOKS LIKE A SHOVEL, TRY USING IT FOR DIGGING A HOLE.
- Note: use the system for what it can do not what you want it to do.
- Note: use the system for what it can do not what you want it to do.
- A System that is sufficiently large, complex, and ambitious can reduce output far below “random” levels, since everyone’s attention is now focused on making the existing System (i.e., the wrong answer) work. Thus, a Federal Program to Conquer Cancer may tie up all the competent researchers in the field, leaving the Problem to be solved by someone else, typically a graduate student from the University of Tasmania doing a little recreational entomology on her vacation. Solutions usually come from people who see in the Problem only an interesting puzzle, and whose qualifications would never satisfy a select committee.
- Great advances may be achieved by individuals working essentially alone or in small teams.
- GREAT ADVANCES DO NOT COME OUT OF SYSTEMS DESIGNED TO PRODUCE GREAT ADVANCES —and furthermore: COMPLICATED SYSTEMS PRODUCE COMPLICATED RESPONSES TO PROBLEMS —or, as stated by Ashby: COMPLEX SYSTEMS HAVE COMPLEX BEHAVIORS Indeed, it is clear from our discussion of Catalytic Managership that at best, and under ideal conditions: MAJOR ADVANCES TAKE PLACE BY FITS AND STARTS.
- The twin Limit Theorems: (A) YOU CAN’T CHANGE JUST ONE THING —and at the other extreme: (B) YOU CAN’T CHANGE EVERYTHING Pragmatically, it is generally easier to aim at changing one or a few things at a time and then work out the unexpected effects, than to go to the opposite extreme. Attempting to correct everything in one grand design is appropriately designated as Grandiosity. Without further apology we offer the following Rule: A LITTLE GRANDIOSITY GOES A LONG WAY.
- The diagnosis of Grandiosity is quite elegantly and strictly made on a purely quantitative basis: How many features of the present System, and at what level, are to be corrected at once? If more than three, the plan is grandiose and will fail.
- IF IT’S WORTH DOING AT ALL, IT’S WORTH DOING POORLY.
- IN ORDER TO SUCCEED IT IS NECESSARY TO KNOW HOW TO AVOID THE MOST LIKELY WAYS TO FAIL.
- Note: Chesterton’s fence.
- The belief that constructive change in a System is possible through direct intervention is an optimistic view that dies hard. The alert Systems-student will recognize this as the common psychological phenomenon of denial of unpleasant reality. In fact, it may be viewed as a classic example of a Systems-delusion. Even more insidious, however, is the implicit assumption that there is (somewhere) a science of Systems-intervention which any diligent pupil can master, thereby achieving Competence to intervene here and there in Systems large and small, with results objectively verifiable and of consistent benefit to mankind. Such a belief is clearly in conflict with the Generalized Uncertainty Principle.
- An Airplane has been defined as a collection of parts having an inherent tendency to fall to earth, and requiring constant effort and supervision to stave off that outcome. The System called “airplane” may have been designed to fly, but the parts don’t share that tendency. In fact, they share the opposite tendency. And the System will fly—if at all—only as a System.
- Note: good analogy
- IF YOUR PROBLEM SEEMS UNSOLVABLE, CONSIDER THAT YOU MAY HAVE A META-PROBLEM.
- We conclude, with Pooh and Heisenberg: THE SYSTEM IS ALTERED BY THE PROBE USED TO TEST IT —and, mindful of Pooh’s head, we add: THE PROBE IS ALTERED ALSO.
- Shortly after a rescue team had drilled a deep borehole to provide safe drinking water for a village in Ethiopia, the team was dismayed to learn that the borehole was being repeatedly vandalized by being filled with rocks. Previously, certain men of the village had made their livelihood by carrying water from a distant waterhole in skin containers on the backs of donkeys. Now they were out of a job. When those men were appointed as guardians of the new borehole, at a good salary, the vandalism ceased. In the old Model of the Universe, the men had been “criminals” and “vandals.” In the New Model, they were “policemen.”
- Note: on how changing the mental model is often a better solution than changing the system.
- We are now ready to consider the case of a purely mental restructuring of the System, in which the only change is in the Mental Model. We begin with a deliberately simplified Example; the case of a Jet Pilot whose plane breaks through the clouds on approach to a strange airport at an unfamiliar destination. The pilot is suddenly faced with the problem of putting down his plane on a runway fifty feet long and half a mile wide. As we have previously noted, until the pilot solves the Meta-problem of restating his Problem in solvable terms, he will experience some frustration.
- Creative reframing is the art of substituting useful metaphors for limiting metaphors.
- Bismarck, the ultraconservative Iron Chancellor of Germany, was unalterably opposed to anything that smacked of Socialism. But when someone pointed out to him that a million loyal Civil Servants represented a Standing Army in disguise, he bought the whole thing, including sickness benefits and pensions.
- Tempered, moderate pessimism is the hallmark of the seasoned Systems-student.
- In Hindu legend the Net of Indra is infinitely large. At every intersection of the meshes of the net is a precious jewel. The net consists of an infinity of precious jewels, each of which reflects the entire net, including the infinity of other jewels. We believe the Hindu sages were trying to give expression to a fundamental principle of Systems-thinking, as follows: ANY GIVEN ELEMENT OF ONE SYSTEM IS SIMULTANEOUSLY AN ELEMENT IN AN INFINITY OF OTHER SYSTEMS.
- Finding helpful systems is much like Reframing, with which, indeed, at some deeper level it may be identical. Both are chancy arts. But when one succeeds in finding or defining the right system, the results can be spectacular and, to the Systems-student accustomed to five-per-cent returns, soul-satisfying indeed.
- IN ORDER TO REMAIN UNCHANGED, THE SYSTEM MUST CHANGE
- Note: the red queen effect applies to all systems.
- SYSTEMS TEND TO MALFUNCTION CONSPICUOUSLY JUST AFTER THEIR GREATEST TRIUMPH.
- PERFECTION OF PLANNING IS A SYMPTOM OF DECAY.
- THE END RESULT OF EXTREME COMPETITION IS BIZARRENESS.
- IF A SYSTEM CAN BE EXPLOITED, IT WILL BE. ANY SYSTEM CAN BE EXPLOITED.
- A COMPLEX SYSTEM THAT WORKS IS INVARIABLY FOUND TO HAVE EVOLVED FROM A SIMPLE SYSTEM THAT WORKED.
- LARGE COMPLEX SYSTEMS ARE BEYOND HUMAN CAPACITY TO EVALUATE. (LARGE SYSTEMS KANT BE FULLY KNOWN).
- SYSTEMS DON’T WORK FOR YOU OR FOR ME. THEY WORK FOR THEIR OWN GOALS.
- Inaccessibility Theorem: THE INFORMATION YOU HAVE IS NOT THE INFORMATION YOU WANT. THE INFORMATION YOU WANT IS NOT THE INFORMATION YOU NEED. THE INFORMATION YOU NEED IS NOT THE INFORMATION YOU CAN OBTAIN.
- ALMOST ANYTHING IS EASIER TO GET INTO THAN OUT OF.
- The First Law of Systems-Survival: A SYSTEM THAT IGNORES FEEDBACK HAS ALREADY BEGUN THE PROCESS OF TERMINAL INSTABILITY.
- GREAT ADVANCES DO NOT COME OUT OF SYSTEMS DESIGNED TO PRODUCE GREAT ADVANCES.
- Perfectionist’s Paradox: IN DEALING WITH LARGE SYSTEMS, THE STRIVING FOR PERFECTION IS A SERIOUS IMPERFECTION.
- Rule of Thumb (Survivors’ Souffle): IF IT’S WORTH DOING AT ALL, IT’S WORTH DOING POORLY.
- Using a “sponge” (Ecological) metaphor rather than a “plumbing” (Engineering) metaphor.
- Note: metaphors we live by.
Last Updated on May 13, 2021 by Taylor Pearson