Holotopia: Collective mind


H O L O T O P I A:    F I V E    I N S I G H T S



The printing press revolutionized communication, and enabled the Enlightenment. The Internet and the interactive digital media constitute a similar revolution. Hasn't the change we are proposing, from the 'candle' to the 'lightbulb', already been completed?

We look at the socio-technical system by which information is produced and handled in our society, which the new information technology helped us create; and we zoom in on its structure. We readily see that its principle of operation has remained broadcasting—which suited the printing press, but when applied to the new technology exacerbates problems, instead of enabling solutions.

We see, in other words, that we are using our wonderful new technology to do no better than create 'electrical candles'.


The real story will be told in the second book of the Holotopia series, whose tentative title is "Systemic Innovation", and subtitle "Cybernetics of Democracy".

While we wait for this book to be written, we offer the article "Bootstrapping Social-Systemic Evolution" as a placeholder.

The article has an appendix with a very short version of the two stories that are woven together to form the core texture of the book: "The Incredible History of Doug Engelbart", and a similarly incredible history of Erich Jantsch. The two stories are told to set the stage for the article's call to action: to combine systems science and knowledge federation (which those two frontier thinkers iconically represent) under a new paradigm (which is now modeled by the holoscope), and to foster a creative frontier with an uncommon potential for social and academic impact.

The article itself, and the situation where it was presented, are part of the "Cybernetics of Democracy" story. In 2013, in Haiphong, Vietnam, less than two weeks after Doug Engelbart passed away, Alexander Laszlo, as President of the International Society for the Systems Sciences, initiated a systemic change in this academic community, exactly along the lines of Engelbart's life-long dream and call to action. We began our presentation in Haiphong by saying "We are here to build a bridge—between two communities, and interests, and ways of working." See this summary.

We here supplement the Engelbart and Jantsch story with several side stories prepared for the book, which alone might be sufficient to make the point the book is intended to make.

Democracy must evolve

Cybernetics of democracy

What is "democracy"?

There are two ways to answer that question.

One of them is to answer it in the same way as we answer "What is science?", or "journalism", or any of our various professions or institutions: by reifying a practice we've inherited from the past. The other one is to define it as a role or a function within our society as a whole.

The word "democracy" is derived from Greek "demos", which means "people", and "kratos", which means power. So "democracy" is supposed to be a social system where the people have power; where the people are in control. But are people in control?

We added the word "cybernetics" to the subtitle to suggest the answer. "Cybernetics" is derived from the Greek "kybernetike", which means governance. So cybernetics is the scientific study of governance, or of governability. This study is transdisciplinary. Cybernetics shares its larger purpose with general systems science, and with the systems sciences more generally—which is to study systems of all kinds, both natural and human-made, in order to understand how a system's structure influences or "drives" its behavior; and then to use this understanding to create and handle systems of all kinds, and social systems in particular. So all we'll need from cybernetics, to answer our question, is the obvious insight that motivated its development.

In a bus that has no steering wheel and no proper headlights, speeding through uncharted terrain in the darkness of the night, nobody is in control!

We might see someone sitting in the driver's seat (Donald Trump; the people who elected him); we might believe that he's driving. But the moment we've examined the structure of the bus, we see that this just cannot be the case, because driving is physically impossible.

Does our society have a structure where "democracy" is possible?

Democracy needs brakes

We intend to begin the Cybernetics of Democracy book by telling the story of Jørgen Randers, who in 1969, having just graduated from college, traveled from Oslo to Boston to do a doctorate in physics at MIT; and who, upon hearing a lecture by Jay Forrester, decided that his doctorate would instead be in the systems sciences, or more precisely in "system dynamics".

In 1972, Randers became one of the authors of The Club of Rome's first and most widely read report, "The Limits to Growth". What followed was an exhausting series of completely nonsensical debates. He and his three co-authors, whose average age was 25, were called "doomsday prophets" and severely attacked. What they were really saying was, however, completely obvious (it didn't even require computer simulation)—and completely ignored. Their point was, namely, that a 'bus' (a human system growing at an accelerating speed on a finite planet) must have 'brakes' to avoid crashing.
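
To make the point concrete, here is a minimal sketch in Python (our own toy illustration, not the World3 model the authors used; the function name, the 5% growth rate, and the carrying capacity of 100 are all invented). It compares a quantity growing at a fixed rate on a 'planet' that can carry at most 100 units of it, with and without a 'brake' that slows growth as the limit is approached.

    # Toy sketch (not the World3 model; all numbers invented): a quantity growing
    # 5% per year on a 'planet' that can carry at most 100 units of it.

    LIMIT = 100.0   # the finite carrying capacity

    def grow(braked, years=200):
        x = 1.0
        for year in range(years):
            if braked:
                # 'Brake': growth slows as the limit is approached (logistic feedback).
                x += 0.05 * x * (1.0 - x / LIMIT)
            else:
                # No brake: full-speed growth until the limit is hit.
                x += 0.05 * x
                if x > LIMIT:
                    return f"overshoots the limit in year {year}"
        return f"levels off at {x:.1f}, below the limit"

    for braked in (False, True):
        print(f"braked={braked}: {grow(braked)}")

Run as written, the unbraked trajectory blows past the limit within about a century of simulated years, while the braked one levels off just below it; nothing more than this is needed to state the co-authors' point.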

Hear Randers summarize his forty years of experience (at the 40th anniversary of "The Limits to Growth", held at the Smithsonian) by declaring:

"The horrible fact is that democracy, and capitalism, will not solve those problems. We do need a fundamental paradigm shift in the area of governance."

Democracy cannot be reactive

Jay Forrester was a creative computing pioneer who contributed to the computer revolution. In 1956, with several patents and an MIT professorship to his name, he got the idea that this new machine could be used for a whole new purpose: to model social systems, and understand their behavior.

A colleague who had earlier been the mayor of Boston moved to an office nearby, and told him how he often noticed that applying the obvious policy to solve a recurring problem made the problem worse. Forrester made models, and found that this "counterintuitive behavior of social systems" (the title of his 1971 research article) was the rule rather than the exception. Social systems share that property with all "complex" or "nonlinear dynamic" systems.
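
What such "counterintuitive behavior" looks like can be sketched with a toy example of our own (in Python; the shower scenario, the four-minute pipe delay, and all temperatures are invented, and none of this comes from Forrester's models): adjusting a shower whose pipe delays the water you feel. The 'obvious policy' of correcting the felt error more aggressively produces larger swings, not a steadier temperature.

    # Toy sketch (not one of Forrester's models; all numbers invented): a shower
    # whose pipe delays the water by four minutes. Reacting harder to the felt
    # temperature error makes the swings larger instead of smaller.

    def shower(reaction, minutes=60):
        tap_temp = 15.0        # temperature of the water leaving the tap (deg C)
        pipe = [15.0] * 4      # water already on its way: a four-minute delay
        felt = []
        for _ in range(minutes):
            felt_temp = pipe.pop(0)                    # what you feel left the tap 4 minutes ago
            tap_temp += reaction * (38.0 - felt_temp)  # the 'obvious policy': correct the felt error
            tap_temp = min(max(tap_temp, 5.0), 60.0)   # the tap has physical limits
            pipe.append(tap_temp)
            felt.append(felt_temp)
        return max(felt) - min(felt)                   # size of the swings actually experienced

    for reaction in (0.1, 0.3, 0.6):
        print(f"reaction strength {reaction}: temperature swing {shower(reaction):.0f} deg C")

The delay between an action and its perceived effect is what defeats intuition here; social systems are full of such delays, which is why the obvious policy so often backfires.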

Forrester lobbied to present this insight to the American Congress.

Democracy must be "anticipatory"

As a mathematical biologist focusing specifically on the issue of democracy while on sabbatical at the Center for the Study of Democratic Institutions in Santa Barbara, Robert Rosen showed in 1972 that to be viable, social systems must have a property shared by all living systems—namely, that they must be "anticipatory" (make predictions of the future to govern their present behavior). He later summarized his findings in the book titled "Anticipatory Systems", with the subtitle "Philosophical, Mathematical and Methodological Foundations".
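
The distinction can be illustrated with a few lines of Python (a toy sketch of our own, not Rosen's formalism; the vehicle scenario and every number in it are invented): a purely reactive controller responds only to the present distance from an obstacle, while an anticipatory one acts on a model's prediction of what will happen if it stays on course.

    # Toy sketch (not Rosen's formalism; all numbers invented): a vehicle approaching
    # a wall 200 m away at 30 m/s. The reactive controller brakes only once the wall
    # is already close; the anticipatory one brakes as soon as its model predicts
    # that it could otherwise no longer stop in time.

    def drive(anticipatory, dt=0.1):
        position, speed = 0.0, 30.0        # metres, metres per second
        wall, braking_power = 200.0, 5.0   # wall position (m), deceleration (m/s^2)
        while position < wall and speed > 0.0:
            if anticipatory:
                stopping_distance = speed ** 2 / (2.0 * braking_power)  # model of the future
                braking = position + stopping_distance + 5.0 >= wall    # act on the prediction
            else:
                braking = (wall - position) < 30.0                      # react to the present only
            if braking:
                speed = max(0.0, speed - braking_power * dt)
            position += speed * dt
        return speed   # speed on reaching the wall; 0.0 means it stopped in time

    for anticipatory in (False, True):
        print(f"anticipatory={anticipatory}: speed at the wall {drive(anticipatory):.1f} m/s")

The reactive controller lacks nothing in present information; what it lacks is a model with which to compute the consequences of staying on course, which is precisely the capability Rosen argued a viable system must have.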

The root of our problems is a paradox

A half-century after these insights were reached, our "democracy" is still only reacting to contingencies. Our "policy makers" are experts in doing no more than keeping the 'bus' on its present course (keeping the economy growing—for another four-year term).

You might have noticed that The Club of Rome's 1972 simulation study did exactly what Forrester and Rosen found a democracy must be able to do—make models and predictions, to see what sort of condition its present course is leading to. But neither the voters nor the politicians have, even today, the faintest clue that those "doomsday prophets" were just trying to add to our "democracy" a capability that any system of control deserving that name must have—the capability to 'steer'!

And so what lies at the root of our democracy's problems, and by extension of our various other problems as well, is not at all a problem but a paradox: we are not using information to understand our world and modify our behavior!

And we are not doing that even when this information is telling us what our systems must be like if we are to become capable of using information to see where we are headed, and what's going on!

To point to this most intriguing and no less alarming issue, and to give it visibility and citizenship rights, we have given it a name: the Wiener's paradox.

David Bohm left us this clue about how we may, and may not, be able to handle it:

As long as a paradox is treated as a problem, it can never be dissolved.


Wiener's paradox

We are back to 'square one'

What we have seen so far is what we pointed out to begin with—that "knowledge work has a flat tire". Before we tell you how this issue needs to be handled—a solution that was in principle proposed already in 1945 (yes, this too has been ignored), and developed with audacious novelty and in profound detail well beyond that early vision by 1968 (we are calling this solution collective mind)—we will remain a moment longer with the paradox. We want to tell you why exactly we are calling it Wiener's paradox. And by doing that, share a story whose points should not be missed.

We've already explained the Wiener's paradox by sharing the Wiener-Jantsch-Reagan thread (on this website, and then again in a blog post); here we'll only mention a couple of important points we omitted there, and highlight the conclusions.

It seems rather obvious that the natural "systemic leverage point", or place to begin "a great cultural revival", is to provide (a way to) information that can show the way (as we submitted at the Visions of Possible Worlds conference at the Triennale di Milano in 2003; see the transcript here).

But that is also the key insight Wiener intended to communicate in the mentioned last chapter of his 1948 Cybernetics (a copy of which we provided here). The most elementary fact reaching us from cybernetics is that a system needs "communication (or feedback) and control" ('headlights' and 'steering') to be governable or viable. Communication, Wiener observed, is the system (being what enables a collection of disparate entities to function together as an entity).
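
That most elementary fact, too, fits in a few lines (a toy sketch of our own, using only the Python standard library; the drift sizes and the 0.5 correction factor are invented): a 'bus' pushed off course by random disturbances, with and without feedback about where it actually is.

    # Toy sketch (our own illustration; all numbers invented): a system pushed off
    # course by random disturbances. Without communication (feedback about where it
    # actually is) there is nothing to steer by, and the deviation simply accumulates.

    import random

    def travel(feedback, steps=1000, seed=1):
        rng = random.Random(seed)
        deviation, worst = 0.0, 0.0
        for _ in range(steps):
            deviation += rng.uniform(-1.0, 1.0)   # disturbances push the system off course
            if feedback:
                deviation -= 0.5 * deviation      # steer against the observed deviation
            worst = max(worst, abs(deviation))
        return worst

    for feedback in (False, True):
        print(f"feedback={feedback}: worst deviation from course {travel(feedback):.1f}")

In this sketch the feedback keeps the deviation small throughout, while without it the deviation just keeps growing as the trip goes on.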

We need "evolutionary guidance"

[Image: Jantsch-university.jpeg]

We have introduced Erich Jantsch as a link between the universe of the systems sciences and the universe where The Club of Rome belongs, where the goal is to secure our civilization's future. Erich Jantsch's final message, however, may be summarized by the formula that intervening in (or "designing for") evolution is the key to our contemporary situation (see our summaries here and here). That is, once again, what the bus with 'candle' headlights points to (the 'way' that the 'bus' is following is our society's evolution).

"The invisible hand" won the argument

Coincidentally, Erich Jantsch passed away in the same year that Ronald Reagan became the US president—on an agenda opposite to his and Wiener's.

Reagan did not win by the force of the argument, but by having incomparably more "air time" than our two academic heroes.

We may now see who 'keeps Galilei in house arrest', and how. Well before the advent of the Internet, Umberto Eco compared, in an interview, the New York Times with Pravda (then the main Soviet communist newspaper), to argue that while in the latter censorship was achieved directly, in the former an overabundance of information had the same effect.


[Image: KFvision.jpeg]

Our civilization is like an organism that has recently grown beyond bounds ("exponentially")—and now represents a threat to its environment, and to itself. By a most fortunate mutation, this creature has developed a nervous system, which could allow it to comprehend the world and coordinate its actions. But it uses it only to amplify its most primitive, limbic impulses.



Collective mind

We use this keyword to point to the core of Doug Engelbart's vision, as rendered in the four slides shown here. When each of us is connected through an interactive interface to a digital computer, and when those computers are linked together into a network, we are in effect connected as the cells in a single nervous system are. Imagine your own cells using your nervous system to broadcast messages, and you will see why broadcasting on a collective mind leads to collective insanity, not to "collective intelligence" (the capability to cope with the complexity and urgency of our problems) as Engelbart intended.


Knowledge federation

Knowledge federation can now be understood, simply, as the activity of a well-functioning or 'sane' collective mind.

A core task of the proposed knowledge federation transdiscipline is to draw insights from relevant fields and weave them into structural changes of academic and other institutions, to give them vision.


Bootstrapping

The key to the solution is what Engelbart called bootstrapping—which we have adopted and adapted here as a keyword. The point is that, in a situation where using the old system to achieve the result is useless, we must create new systems with our own bodies, and/or help others do that in a way that can scale.

The last decades of Engelbart's career were about bootstrapping—see this brief video excerpt.

Knowledge Federation was created by an act of bootstrapping—to enable bootstrapping; see the summary here.



The Lighthouse

The Lighthouse prototype, created in collaboration with the International Society for the Systems Sciences and for that society, was developed as a remedy for dissolving the Wiener's paradox. See it described here.

BCN2011

The Barcelona Innovation Ecosystem for Good Journalism (BCN2011) is a complete prototype showing how public informing can be reconstructed, to federate the most relevant information according to the contemporary needs of people and society. A description with links is provided here.


TNC2015

Tesla and the Nature of Creativity (TNC2015) is a complete example of knowledge federation in academic communication, which shows how an academic result has been federated. See it described in the Tesla and the Nature of Creativity and A Collective Mind – Part One blog posts.