Leverage Points: Places to Intervene in a System
Metadata
- Author: [[Donella Meadows]]
- Full Title: Leverage Points: Places to Intervene in a System
- Category: #articles
- URL: https://donellameadows.org/archives/leverage-points-places-to-intervene-in-a-system/
Highlights
- PLACES TO INTERVENE IN A SYSTEM (in increasing order of effectiveness)
  * Constants, parameters, numbers (such as subsidies, taxes, standards).
  * The sizes of buffers and other stabilizing stocks, relative to their flows.
  * The structure of material stocks and flows (such as transport networks, population age structures).
  * The lengths of delays, relative to the rate of system change.
  * The strength of negative feedback loops, relative to the impacts they are trying to correct against.
  * The gain around driving positive feedback loops.
  * The structure of information flows (who does and does not have access to information).
  * The rules of the system (such as incentives, punishments, constraints).
  * The power to add, change, evolve, or self-organize system structure.
  * The goals of the system.
  * The mindset or paradigm out of which the system — its goals, structure, rules, delays, parameters — arises.
  * The power to transcend paradigms.
- Not that parameters aren’t important — they can be, especially in the short term and to the individual who’s standing directly in the flow. People care deeply about parameters and fight fierce battles over them. But they RARELY CHANGE BEHAVIOR. If the system is chronically stagnant, parameter changes rarely kick-start it. If it’s wildly variable, they don’t usually stabilize it. If it’s growing out of control, they don’t brake it.
- Since I’m about to get into some examples where parameters ARE leverage points, let me stick in a big caveat here. Parameters become leverage points when they go into ranges that kick off one of the items higher on this list. Interest rates, for example, or birth rates, control the gains around positive feedback loops. System goals are parameters that can make big differences. Sometimes a system gets onto a chaotic edge, where the tiniest change in a number can drive it from order to what appears to be wild disorder.
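The "chaotic edge" Meadows mentions can be made concrete with the textbook logistic map (my illustration, not from the article): one rule, one parameter `r`, and a small change in that number flips the system from an orderly cycle into apparent chaos.

```python
# Hypothetical illustration (not from the article): the logistic map
# x' = r * x * (1 - x) shows how a small change in a single parameter (r)
# pushes the same simple system from order into apparent chaos.

def logistic_trajectory(r, x0=0.5, warmup=500, keep=8):
    """Iterate the logistic map; return a few post-transient values."""
    x = x0
    for _ in range(warmup):          # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 4))
    return out

# r = 3.2: the system settles into a stable 2-cycle (order).
periodic = logistic_trajectory(3.2)
# r = 3.9: no cycle appears; nearby starting points diverge (chaos).
chaotic = logistic_trajectory(3.9)

print(periodic)
print(chaotic)
```

The values of `r` are the standard textbook choices; nothing else about the model comes from Meadows' text.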
- The sizes of buffers and other stabilizing stocks, relative to their flows. Consider a huge bathtub with slow in and outflows. Now think about a small one with very fast flows. That’s the difference between a lake and a river. You hear about catastrophic river floods much more often than catastrophic lake floods, because stocks that are big, relative to their flows, are more stable than small ones. In chemistry and other fields, a big, stabilizing stock is known as a buffer.
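The lake-versus-river contrast above can be sketched in a few lines (a toy calculation of my own; the stock and surge numbers are invented):

```python
# Toy sketch (numbers invented): the same surge of inflow disturbs a
# small, fast stock far more than a big, slow one -- which is why rivers
# flood catastrophically and lakes rarely do.

def relative_disturbance(stock, surge):
    """Fractional change in level caused by a one-time inflow surge."""
    return surge / stock

lake = 1_000_000   # big buffer: large stock relative to its flows
river = 1_000      # small buffer: tiny stock, fast flows
surge = 500        # the same storm's worth of extra water

print(relative_disturbance(lake, surge))   # 0.0005 -> barely a ripple
print(relative_disturbance(river, surge))  # 0.5    -> a flood
```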
- Physical structure is crucial in a system, but rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place. After the structure is built, the leverage is in understanding its limitations and bottlenecks, using it with maximum efficiency, and refraining from fluctuations or expansions that strain its capacity.
- The lengths of delays, relative to the rate of system change. Remember that bathtub on the fourth floor I mentioned, with the water heater in the basement? I actually experienced one of those once, in an old hotel in London. It wasn’t even a bathtub, it was a shower — no buffering capacity. The water temperature took at least a minute to respond to my faucet twists. Guess what my shower was like. Right, oscillations from hot to cold and back to hot, punctuated with expletives.
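The shower story is a classic delayed-feedback oscillation, and a toy simulation makes the mechanism visible (my construction, not Meadows'; the gain and delay values are arbitrary):

```python
from collections import deque

# Toy model (my construction): you adjust the faucet in proportion to the
# current temperature error, but the water only reflects faucet settings
# from `delay` steps ago -- the pipe between heater and shower head.

def shower(delay, steps=60, target=40.0, gain=0.4):
    """Return the temperature trace under delayed proportional control."""
    faucet = 0.0
    pipe = deque([0.0] * (delay + 1))   # settings in transit down the pipe
    temps = []
    for _ in range(steps):
        temp = pipe.popleft()           # water that left the heater earlier
        faucet += gain * (target - temp)
        pipe.append(faucet)
        temps.append(temp)
    return temps

no_delay = shower(delay=0)    # climbs smoothly toward the target
long_delay = shower(delay=3)  # overshoots hot, swings back cold: oscillation

print(max(no_delay))
print(max(long_delay), min(long_delay[8:]))
```

With no delay the same control rule converges without ever overshooting; add a three-step delay and the identical rule produces the hot-cold swings from the hotel shower.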
- A complex system usually has numerous negative feedback loops it can bring into play, so it can self-correct under different conditions and impacts. Some of those loops may be inactive much of the time — like the emergency cooling system in a nuclear power plant, or your ability to sweat or shiver to maintain your body temperature — but their presence is critical to the long-term welfare of the system. One of the big mistakes we make is to strip away these “emergency” response mechanisms because they aren’t often used and they appear to be costly. In the short term we see no effect from doing this. In the long term, we drastically narrow the range of conditions over which the system can survive. One of the most heartbreaking ways we do this is in encroaching on the habitats of endangered species. Another is in encroaching on our own time for rest, recreation, socialization, and meditation.
- Missing feedback is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure.
- It’s important that the missing feedback be restored to the right place and in compelling form. To take another tragedy of the commons, it’s not enough to inform all the users of an aquifer that the groundwater level is dropping. That could initiate a race to the bottom. It would be more effective to set a water price that rises steeply as the pumping rate begins to exceed the recharge rate.
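The steeply rising water price can be sketched as a simple function (the shape and parameters are invented for illustration; the article proposes the idea, not this formula):

```python
# Hedged sketch (parameters invented): price stays flat while pumping is
# within the recharge rate, then climbs steeply as pumping exceeds it,
# so the feedback lands where the pumping decisions are actually made.

def water_price(pumping, recharge, base=1.0, steepness=4.0):
    """Price per unit: flat up to the recharge rate, then steeply rising."""
    if pumping <= recharge:
        return base
    overdraft = (pumping - recharge) / recharge   # fractional overuse
    return base * (1.0 + overdraft) ** steepness

recharge = 100.0
print(water_price(80.0, recharge))    # within recharge: base price
print(water_price(150.0, recharge))   # 50% overdraft: price multiplies fast
```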
- There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops — and why this kind of leverage point is so often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen (or go around them and make it happen anyway).
- The rules of the system define its scope, its boundaries, its degrees of freedom. Thou shalt not kill. Everyone has the right of free speech. Contracts are to be honored. The president serves four-year terms and cannot serve more than two of them. Nine people on a team, you have to touch every base, three strikes and you’re out. If you get caught robbing a bank, you go to jail.
- As we try to imagine restructured rules like that and what our behavior would be under them, we come to understand the power of rules. They are high leverage points. Power over the rules is real power. That’s why lobbyists congregate when Congress writes laws, and why the Supreme Court, which interprets and delineates the Constitution — the rules for writing the rules — has even more power than Congress. If you want to understand the deepest malfunctions of systems, pay attention to the rules, and to who has power over them.
- Self-organization means changing any aspect of a system lower on this list — adding completely new physical structures, such as brains or wings or computers — adding new negative or positive loops, or new rules. The ability to self-organize is the strongest form of system resilience. A system that can evolve can survive almost any change, by changing itself. The human immune system has the power to develop new responses to (some kinds of) insults it has never before encountered. The human brain can take in new information and pop out completely new thoughts.
- When you understand the power of system self-organization, you begin to understand why biologists worship biodiversity even more than economists worship technology. The wildly varied stock of DNA, evolved and accumulated over billions of years, is the source of evolutionary potential, just as science libraries and labs and universities where scientists are trained are the source of technological potential. Allowing species to go extinct is a systems crime, just as randomly eliminating all copies of particular science journals, or particular kinds of scientists, would be.
- Insistence on a single culture shuts down learning. Cuts back resilience. Any system, biological, economic, or social, that gets so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on this highly variable planet.
- The intervention point here is obvious, but unpopular. Encouraging variability and experimentation and diversity means “losing control.” Let a thousand flowers bloom and ANYTHING could happen! Who wants that? Let’s play it safe and push this leverage point in the wrong direction by wiping out biological, cultural, social, and market diversity!
- Right there, the diversity-destroying consequence of the push for control, that demonstrates why the goal of a system is a leverage point superior to the self-organizing ability of a system. If the goal is to bring more and more of the world under the control of one particular central planning system (the empire of Genghis Khan, the world of Islam, the People’s Republic of China, Wal-Mart, Disney, whatever), then everything further down the list, physical stocks and flows, feedback loops, information flows, even self-organizing behavior, will be twisted to conform to that goal.
- The mindset or paradigm out of which the system — its goals, structure, rules, delays, parameters — arises. Another of Jay Forrester’s famous systems sayings goes: it doesn’t matter how the tax law of a country is written. There is a shared idea in the minds of the society about what a “fair” distribution of the tax load is. Whatever the rules say, by fair means or foul, by complications, cheating, exemptions or deductions, by constant sniping at the rules, actual tax payments will push right up against the accepted idea of “fairness.”
- Paradigms are the sources of systems. From them, from shared social agreements about the nature of reality, come system goals and information flows, feedbacks, stocks, flows and everything else about systems. No one has ever said that better than Ralph Waldo Emerson:
- The ancient Egyptians built pyramids because they believed in an afterlife. We build skyscrapers, because we believe that space in downtown cities is enormously valuable. (Except for blighted spaces, often near the skyscrapers, which we believe are worthless.) Whether it was Copernicus and Kepler showing that the earth is not the center of the universe, or Einstein hypothesizing that matter and energy are interchangeable, or Adam Smith postulating that the selfish actions of individual players in markets wonderfully accumulate to the common good, people who have managed to intervene in systems at the level of paradigm have hit a leverage point that totally transforms systems.
- So how do you change paradigms? Thomas Kuhn, who wrote the seminal book about the great paradigm shifts of science, has a lot to say about that. In a nutshell, you keep pointing at the anomalies and failures in the old paradigm, you keep coming yourself, and loudly and with assurance from the new one, you insert people with the new paradigm in places of public visibility and power. You don’t waste time with reactionaries; rather you work with active change agents and with the vast middle ground of people who are open-minded. Systems folks would say you change paradigms by modeling a system, which takes you outside the system and forces you to see it whole. We say that because our own paradigms have been changed that way.
- There is yet one leverage point that is even higher than changing a paradigm. That is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that NO paradigm is “true,” that every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension. It is to “get” at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny. It is to let go into Not Knowing, into what the Buddhists call enlightenment.