Nov 6, 2011

In his delightful collection of robot stories, The Cyberiad, Polish science-fiction author Stanislaw Lem tells us how to build a computer (a sentient one, no less): the most important step is to pour a large number of transistors into a vat and stir.

This mental image popped into my mind as I was reading the last few pages of Andrew Pickering’s The Cybernetic Brain, subtitled Sketches of Another Future.

Beyond presenting a history of (chiefly British) cybernetics and cyberneticians, the book's main point is that cybernetics should be resurrected from the dead fringes as a nonmodern (the author's word) alternative to the hegemony of modern science, and that the cybernetic approach of embracing unknowability is sometimes preferable to the notion that everything can be known and controlled. The author even names specific disasters (global warming, Hurricane Katrina, the war in Iraq) as examples, consequences of the "high modernist" approach to the world.

Well, this is, I take it, the intended message of the book. But what I read in the book is a feel-good New Age rant against rational (that is, fact- and logic-based) thinking, characterized by phrases like "nonmodern" and "ontological theater". The "high modernist" attitude that the author (rightfully) criticizes is more characteristic of 19th-century science than of the late 20th or early 21st centuries. And to be sure, the cyberneticians featured in the book are just as guilty of arrogance as the worst of the "modernists": after all, who but a true "mad scientist" would use an unproven philosophy as justification for electroshock therapy, or to build a futuristic control center for an entire national economy?

More importantly, the cyberneticians and Pickering never appear to go beyond the most superficial aspects of complexity. They conceptualize a control system for a cybernetic factory with a set of inputs, a set of outputs, and a nondescript blob in the middle that does the thinking; then, they go off and collect puddle water (!) that is supposed to be trained by, and eventually replace, the factory manager. The thinking goes something like this: the skills and experience of a manager form an “exceedingly complex” system. The set of biological and biochemical reactions in a puddle form another “exceedingly complex” system. So, we replace one with the other, do a bit of training, and presto! Problem solved.

These and similar ideas of course only reveal their proponents' ignorance. Many systems appear exceedingly complex not because they are, but simply because they are governed by simple rules, of the kind a mathematician immediately recognizes as higher-order differential equations, whose solutions can be chaotic. The behavior of the cybernetic tortoise described in Pickering's book appears complex only because it is unpredictable and chaotic. Its reaction in front of a mirror may superficially resemble the reaction of a cat, say, but that is where the analogy ends.
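The point about simple rules masquerading as complexity is easy to demonstrate. A sketch of my own (not an example from the book): the logistic map, x → rx(1−x), is a one-line rule, yet for values of r near 4 it is chaotic, and two nearly identical starting points diverge beyond recognition within a few dozen steps.

```python
def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map x -> r*x*(1-x) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points differing by one part in a million.
a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)

# Sensitive dependence on initial conditions: the gap is tiny at
# first, then grows until the trajectories are unrecognizable.
divergence = [abs(x - y) for x, y in zip(a, b)]
print(divergence[1])    # still minuscule after one step
print(max(divergence))  # grows to order one along the way
```

A tortoise-like robot wired with a couple of feedback loops exhibits the same kind of unpredictability, which looks lifelike without implying anything remotely like thought.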

In the end, the author laments that cybernetics has been marginalized by the hegemony of modernist science. I say no; I say cybernetics has been marginalized by its own failure to be useful. Much as cyberneticians would have preferred otherwise, you cannot build a sentient computer by pouring puddle water or a bag of transistors into a vat. The sentient machines of the future may be unknowable in the sense that their actions will be unpredictable, but it will be knowledge that builds them, not New Age ignorance.

Posted at 3:00 pm