Holotopia: Collective mind
- 1 HOLOTOPIA: FIVE INSIGHTS
- 2 Collective mind
- 2.1 Stories
- 2.2 Our collective mind needs structural change
- 2.3 Democracy needs structural change
The printing press revolutionized communication and enabled the Enlightenment. The Internet and interactive digital media constitute a similar revolution. Hasn't the change we are proposing, from the 'candle' to the 'lightbulb', already been completed?
We look at the socio-technical system by which information is produced and handled in our society, which the new information technology helped us create; and we zoom in on its structure. We readily see that its principle of operation has remained broadcasting, which suited the printing press but which, applied to the new technology, exacerbates problems instead of enabling solutions.
We see, in other words, that we are using our wonderful new technology to do no better than create 'electrical candles'.
Our collective mind needs structural change
What it takes to be informed
Imagine a world where a correct understanding of one's situation is used as the basis for action.
In knowledge federation we use the keyword gestalt for such understanding. And we use this keyword to make the intuitive meaning of the word "informed" precise and clear: One is informed if one has an "appropriate gestalt", that is, a gestalt appropriate to the situation at hand. "Our house is on fire" is a canonical example of a gestalt. An appropriate gestalt correctly points to a course of action by which the situation needs to be handled.
Suppose, now, that we apply this idea to our very handling of information, and of knowledge. What gestalt would result? What course of action would it point to?
Knowledge work has a flat tire
In 2011, when the Knowledge Federation completed its self-organization as a transdiscipline, we decided to "go public" and propose knowledge federation to Silicon Valley, and to the world. In our workshop at the Triple Helix IX conference of international change-makers in knowledge work, at Stanford University, we used the flat tire metaphor to answer the above questions and motivate our proposal.
Knowledge Work Has a Flat Tire is a real-life event where two leading scientists contradicted one another while presenting to the public and the media the scientific standing of an urgent and complex policy issue: climate change. Our point was that a non-expert reader had no way to resolve this contradiction and decide who was right. That our present way of informing the public leads to confusion and inaction. And that our situation resembles the situation of people in a car that has a punctured tire: Pressing the gas pedal and surging forward (publishing, or broadcasting) will not take us to our destination. Our situation demands that we stop and take care of a structural defect.
Democracy needs structural change
Cybernetics of democracy
We are preparing a book series, to help us launch holotopia and knowledge federation. The second book in the series has the working title "Knowledge Federation", and subtitle "Cybernetics of Democracy".
What is "democracy"?
We tend to answer that question in the same way as we answer "What is science?" or "What is journalism?" We simply reify a certain practice as we've inherited it from the past. "Democracy" is distinctive in that we've inherited its mechanisms from a very distant point in history, when everything was different.
The word "democracy" is derived from the Greek words "demos", which means "people", and "kratos", which means power. So why not consider "democracy" to be a social system where the people have power; where the people are in control? We would then be able to ask "What instruments does a democracy need to have, if the people are to be in control?"
Cybernetics gave us a scientific basis for answering this question. "Cybernetics" is derived from the Greek "kybernetike", which means governance. Cybernetics is thus the scientific study of governance, or of governability. This study is transdisciplinary. Cybernetics shares its larger purpose with general systems science, and with the systems sciences more generally: to study systems of all kinds, both natural and human-made; to develop a terminology for expressing how the structure of a system drives or influences its behavior; and to use this knowledge to understand, create and govern systems of all kinds, social systems in particular.
All we'll need from cybernetics, to begin our quest, is the obvious insight that motivated its development: In a bus without a steering wheel and without proper headlights, which is speeding through uncharted terrain in the darkness of the night, nobody is in control. You might have seen someone sitting in the driver's seat (Donald Trump; the people who elected him), and believed that he was driving. But the moment you examine the structure of the bus, you understand that you were wrong, because driving simply isn't physically possible.
Democracy needs brakes
We intend to begin the Cybernetics of Democracy book by telling the story of Jørgen Randers, who in 1969, having just graduated from college, traveled from Oslo to Boston to do a doctorate in physics at MIT. Upon hearing a lecture by Jay Forrester, he decided that his study would instead be in the systems sciences, or more precisely in "system dynamics".
In 1972, Randers became one of the authors of The Club of Rome's first and most widely read report, "The Limits to Growth". What followed was an exhausting series of completely nonsensical debates. He and his three co-authors, whose average age was 25, were called "doomsday prophets" and severely attacked from all sides. The real issue was all but completely ignored. And their point about this real issue was hardly debatable: our 'bus' (the human system, growing at an accelerating speed on a finite planet) must have 'brakes' to avoid crashing.
Hear Randers summarize his forty years of experience at the 40th anniversary of The Limits to Growth at the Smithsonian, declaring:
"The horrible fact is that democracy, and capitalism, will not solve those problems. We do need a fundamental paradigm shift in the area of governance."
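The 'brakes' argument can be made concrete with a toy simulation. The sketch below is purely illustrative: it is not the World3 model used in "The Limits to Growth", and every parameter in it is invented. It shows the dynamic itself: a stock growing on a finite, slowly regenerating resource base stabilizes if its growth is throttled as the base is drawn down, and overshoots and collapses if it keeps growing blindly.

```python
def simulate(brakes, steps=150):
    """Toy overshoot model: growth on a finite resource base.

    Illustrative only; parameters are invented, and this is not
    the World3 model from The Limits to Growth.
    """
    population, resources = 1.0, 100.0
    collapsed = False
    for _ in range(steps):
        if brakes:
            # Throttle growth as the resource base is drawn down;
            # growth turns negative once resources fall below half.
            growth = 0.05 * (resources / 100.0) - 0.025
        else:
            growth = 0.05                      # grow blindly
        population *= 1.0 + growth
        resources += 1.0 - 0.02 * population   # regeneration minus consumption
        resources = min(resources, 100.0)      # the base is finite
        if resources <= 0.0:
            resources = 0.0
            collapsed = True
            population *= 0.7                  # overshoot dieback
    return population, resources, collapsed
```

Run both variants over the same horizon and the blind run exhausts the resource base and dies back, while the braked run never does. The point is structural: no amount of pressing the gas pedal changes the outcome; only the feedback loop (the 'brakes') does.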
Social systems behave counterintuitively
Jay Forrester was a computing pioneer. In 1956, holding several patents and an MIT professorship, he had the idea that this intelligent new machine could be put to a whole new purpose: to model social systems, and to experiment with the ways in which their structure influences their behavior.
A colleague who had earlier been the mayor of Boston moved to an office nearby, and told him that he often noticed how applying the obvious policy to solve a problem made the problem worse. So Forrester made models, and found that this "counterintuitive behavior of social systems" (the title of Forrester's 1971 paper) is the rule rather than the exception. Social systems are "complex" or "non-linear dynamic systems", which all share that property.
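The pattern Forrester described can be illustrated with a minimal loop of the "fixes that fail" kind. This is a hypothetical sketch, not one of Forrester's actual models, and all numbers are invented: an 'obvious' policy relieves a symptom quickly, but its side effect feeds the underlying condition, so pressing harder eventually makes the problem worse.

```python
def symptom_after(policy_strength, steps):
    """Toy 'fixes that fail' loop; every number here is invented."""
    symptom = 10.0      # the visible problem
    underlying = 1.0    # the condition that feeds it
    for _ in range(steps):
        fix = policy_strength * symptom   # the 'obvious' response
        symptom += underlying - fix       # relief now...
        underlying += 0.1 * fix           # ...but the fix feeds the condition
        symptom = max(symptom, 0.0)
    return symptom
```

Over a short horizon the policy looks like a success: the symptom is lower with the fix than without it. Over a long horizon the relation reverses, and the fix leaves the symptom far worse than doing nothing. This is exactly the counterintuitive behavior the mayor of Boston reported: the obvious intervention helps at first and harms in the end.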
Forrester lobbied to present his insight to the U.S. Congress.
Social systems must be "anticipatory"
Robert Rosen, a mathematical biologist who focused specifically on the issue of democracy while on sabbatical at the Center for the Study of Democratic Institutions in Santa Barbara, showed in 1972 that to be viable, social systems must share a property of all living systems: they must use predictions to govern their present behavior, instead of only reacting to stimuli from their environment. He later summarized his findings in the book titled "Anticipatory Systems", with the subtitle "Philosophical, Mathematical and Methodological Foundations".
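The distinction between reacting and anticipating can be sketched in a few lines. This is a hypothetical toy, not Rosen's mathematics; the vehicle, speeds and thresholds are all invented. A reactive controller responds only to where the system is now; an anticipatory one responds to where a model predicts it will be.

```python
def reaches_safety(anticipatory, wall=100.0, steps=200):
    """Toy steering loop: a vehicle heading toward a wall.

    Illustrative parameters only. The reactive controller brakes on its
    current distance; the anticipatory one brakes on a predicted position.
    """
    position, speed = 0.0, 10.0
    for _ in range(steps):
        # Anticipation: judge danger from the position predicted
        # ten steps ahead, instead of from the current position.
        judged = position + speed * 10.0 if anticipatory else position
        if wall - judged < 20.0:              # danger threshold
            speed = max(speed - 1.0, 0.0)     # brake (limited braking power)
        position += speed
        if position >= wall:
            return False                      # crashed into the wall
    return True                               # stopped in time
```

With limited braking power, the reactive controller starts braking too late and hits the wall, while the anticipatory one stops well short of it. The stimulus is the same in both cases; only the use of prediction differs.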
Democracy must evolve
A half-century after these insights, our "democracy" is notoriously still only reacting to contingencies. Our "policy makers" are experts in keeping the 'bus' on its present course; in keeping the economy growing for another four-year term.
The Club of Rome's 1972 simulation study did exactly what Forrester and Rosen found a democracy must be able to do: make predictions; see where it's headed. But neither our voters nor our politicians have even the faintest clue that those "doomsday prophets" might have been right in a fundamental way, that they were trying to add to "democracy" a capability it must have: the capability to 'steer'!
So the problem of democracy, and by extension our other problems as well, are instances of a single fundamental problem: that we are no longer using information, even when, or especially when, that information tells us what updates our social "machinery" needs if we are to become able to 'steer', in other words to take care of our problems.
To give this all-important issue visibility and citizenship rights, we have given it a name: Wiener's paradox.