
Irreducible Uncertainty

Donella Meadows, or ‘Dana’, was one of the key early systems thinkers at MIT, working mostly in system dynamics before moving on to living systems theory later in life. She was at the forefront of the early movement on ‘sustainability’ and is probably best known today for the book she lead-authored, The Limits to Growth. Published in 1972, it stands the test of time and was remarkably prescient, setting the tone for the debates that arose over the following decades. The quote that follows comes from the archives of Myron Rogers. I’m grateful to Myron for pointing me to it; I have been unable to find the original source. Myron highlights one area he describes as brilliant: Dana’s idea that systems thinkers need to come to terms with ‘irreducible uncertainty’. Too often people attach themselves to a belief that systems thinking offers a certainty that mechanistic thinking no longer can. Dana punctures this, and we would be wise to take note.

'People who are raised in the industrial world and who get enthused about systems thinking are likely to make a terrible mistake. They are likely to assume that here, in systems analysis, in interconnection and complication, in the power of the computer, here at last, is the key to prediction and control. This mistake is likely because the mindset of the industrial world assumes that there is a key to prediction and control.

I assumed that at first too. We all assumed it, as eager systems students at the great institution called MIT. More or less innocently, enchanted by what we could see through our new lens, we did what many discoverers do. We exaggerated our own ability to change the world. We did so not with any intent to deceive others, but in the expression of our own expectations and hopes. Systems thinking for us was more than subtle, complicated mind play. It was going to Make Systems Work.

But self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best.

We can never fully understand our world, not in the way our reductionistic science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can't optimize; we don't even know what to optimize. We can't keep track of everything. We can't find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.'

