The SBs: Stewart Brand and Stafford Beer

Caius revisits “Both Sides of the Necessary Paradox,” an interview with Gregory Bateson included as the first half of Stewart Brand’s 1974 book II Cybernetic Frontiers. The book’s second half reprints “Spacewar: Fanatic Life and Symbolic Death Among the Computer Bums,” the influential essay on videogames that Jann Wenner commissioned Brand to write for Rolling Stone two years prior.

“I came into cybernetics from preoccupation with biology, world-saving, and mysticism,” writes Brand. “What I found missing was any clear conceptual bonding of cybernetic whole-systems thinking with religious whole-systems thinking. Three years of scanning innumerable books for the Whole Earth Catalog didn’t turn it up,” he adds. “Neither did considerable perusing of the two literatures and taking thought. All I did was increase my conviction that systemic intellectual clarity and moral clarity must reconvene, mingle some notion of what the hell consciousness is and is for, and evoke a shareable self-enhancing ethic of what is sacred, what is right for life” (9).

Yet in the summer of 1972, says Brand, a book arrives that begins to fill this gap: Bateson’s Steps to an Ecology of Mind.

Brand brings his knack for New Journalism to the task of interviewing Bateson for Harper’s.

The dialogue between the two reads at many times like one of Bateson’s “metalogues.” An early jag of thought jumps amid pathology, conquest, and the Tao. Reminded of pioneer MIT cybernetician Warren McCulloch’s fascination with “intransitive preference,” Bateson wanders off “rummaging through his library looking for Blake’s illustration of Job affrighted with visions” (20).

Caius is reminded of Norbert Wiener’s reflections on the Book of Job in his 1964 book God and Golem, Inc. For all of these authors, cybernetic situations cast light on religious situations and vice versa.

Caius wonders, too, about the relationship between Bateson’s “double bind” theory of schizophrenia and the theory pursued by Deleuze and Guattari in Capitalism and Schizophrenia.

“Double bind is the term used by Gregory Bateson to describe the simultaneous transmission of two kinds of messages, one of which contradicts the other, as for example the father who says to his son: go ahead, criticize me, but strongly hints that all effective criticism — at least a certain type of criticism — will be very unwelcome. Bateson sees in this phenomenon a particularly schizophrenizing situation,” note Deleuze and Guattari in Anti-Oedipus. They depart from Bateson only in thinking this situation the rule under capitalism rather than the exception. “It seems to us that the double bind, the double impasse,” they write, “is instead a common situation, oedipalizing par excellence. […]. In short, the ‘double bind’ is none other than the whole of Oedipus” (79-80).

God’s response to Job is of this sort.

Brand appends to the transcript of his 1972 interview with Bateson an epilog written in December 1973, three months after the coup in Chile.

Bateson had direct, documented ties to US intelligence. Stationed in China, India, Ceylon, Burma, and Thailand, he produced “mixed psychological and anthropological intelligence” for the Office of Strategic Services (OSS), precursor to the CIA, during WWII. Research indicates he maintained connections with CIA-affiliated research networks in the postwar years, participating in LSD studies linked to the MKUltra program in the 1950s. He later regretted his association with the Agency and its methods.

Asked by Brand about his “psychedelic pedigree,” Bateson replies, “I got Allen Ginsberg his first LSD” (28). A bad trip, notes Caius, resulting in Ginsberg’s poem “Lysergic Acid.” Bateson himself was “turned on to acid by Dr. Harold Abramson, one of the CIA’s chief LSD specialists,” report Martin A. Lee & Bruce Shlain in their book Acid Dreams. Caius wonders if Stafford Beer underwent some similar transformation.

As for Beer, he serves in the British military in India during WWII, and for much of his adult life drives a Rolls-Royce. But then, at the invitation of the Allende regime, Beer travels to Chile and builds Cybersyn. After the coup, he lives in a remote cottage in Wales.

What of him? Cybernetic socialist? Power-centralizing technocrat?

Recognizes workers themselves as the ones best suited to modeling their own places of work.

“What were the features of Beer’s Liberty Machine?” wonders Caius.

Brand’s life, too, includes a stint of military service. Drafted after graduating from Stanford, he serves two years with the US Army, first as an infantryman and then as a photographer. Stationed at Fort Dix in New Jersey, Brand becomes involved in the New York art world of those years. He parts ways with the military as soon as the opportunity arises. After his discharge in 1962, Brand participates in some of Allan Kaprow’s “happenings” and, between 1963 and 1966, works as a photographer and technician for USCO.

Amid his travels between East and West coasts during these years, Brand joins up with Ken Kesey and the Merry Pranksters.

Due to these apprenticeships with the Pranksters and with USCO, Brand arrives early to the nexus formed by the coupling of psychedelics and cybernetics.

“Strobe lights, light projectors, tape decks, stereo speakers, slide sorters — for USCO, the products of technocratic industry served as handy tools for transforming their viewers’ collective mind-set,” writes historian Fred Turner in his 2006 book From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. “So did psychedelic drugs. Marijuana and peyote and, later, LSD, offered members of USCO, including Brand, a chance to engage in a mystical experience of togetherness” (Turner 49).

Brand takes acid around the time of his discharge from the military in 1962, when he participates in a legal LSD study overseen by James Fadiman at the International Foundation for Advanced Study in Menlo Park. But he notes that he first met Bateson “briefly in 1960 at the VA Hospital in Palo Alto, California” (II Cybernetic Frontiers, p. 12). Caius finds this curious, and wonders what that meeting entailed. 1960 is also the year when, at the VA Hospital in Menlo Park, Ken Kesey volunteers in the CIA-sponsored drug trials involving LSD that inspire his 1962 novel One Flew Over the Cuckoo’s Nest.

Bateson worked for the VA while developing his double bind theory of schizophrenia.

Before that, he’d been married to fellow anthropologist Margaret Mead. He’d also participated in the Macy Conferences, as discussed by N. Katherine Hayles in her book How We Became Posthuman.

Crows screeching in the trees have Caius thinking of condors. He sits, warm, in his sunroom on a cold day, roads lined with snow from a prior day’s storm, thinking about Operation Condor. Described by Morozov as Cybersyn’s “evil twin.” Palantir. Dark Enlightenment. Peter Thiel.

Listening to one of the final episodes of Morozov’s podcast, Caius learns of Brian Eno’s love of Beer’s Brain of the Firm. Bowie and Eno are among Beer’s most famous fans. Caius remembers Eno’s subsequent work with Brand’s consulting firm, the Global Business Network (GBN).

Santiago Boy Fernando Flores is the one who reaches out to Beer, inviting him to head Cybersyn. Given Flores’s status as Allende’s Minister of Finance at the time of the coup, Pinochet’s forces torture him and place him in a prison camp. He remains there for three years. Upon his release, he moves to the Bay Area.

Once in Silicon Valley, Flores works in the computer science department at Stanford. He also obtains a PhD at UC Berkeley, completing a thesis titled Management and Communication in the Office of the Future under the guidance of philosophers Hubert Dreyfus and John Searle.

Flores collaborates during these years with fellow Stanford computer scientist Terry Winograd. The two of them coauthor an influential 1986 book called Understanding Computers and Cognition: A New Foundation for Design. Although they make a bad wager, insisting that computers will never understand natural language (an insistence proven wrong with time), they nevertheless offer refreshing critiques of assumptions governing the AI research of that era. Drawing upon phenomenology, speech act theory, and Heideggerian philosophy, they redefine computers not as mere symbol manipulators or number-crunchers but as tools for communication and coordination.

Flores builds a program called the Coordinator. Receives flak for “software fascism.”

Winograd’s students include Google cofounders Larry Page and Sergey Brin.

God Human Animal Machine

Wired columnist Meghan O’Gieblyn discusses Norbert Wiener’s God and Golem, Inc. in her 2021 book God Human Animal Machine, suggesting that the god humans are creating with AI is a god “we’ve chosen to raise…from the dead”: “the God of Calvin and Luther” (O’Gieblyn 212).

“Reminds me of AM, the AI god from Harlan Ellison’s ‘I Have No Mouth, and I Must Scream,’” thinks Caius. AM resembles the god that allows Satan to afflict Job in the Old Testament. And indeed, as O’Gieblyn attests, John Calvin adored the Book of Job. “He once gave 159 consecutive sermons on the book,” she writes, “preaching every day for a period of six months — a paean to God’s absolute sovereignty” (197).

She cites “Pedro Domingos, one of the leading experts in machine learning, who has argued that these algorithms will inevitably evolve into a unified system of perfect understanding — a kind of oracle that we can consult about virtually anything” (211-212). See Domingos’s book The Master Algorithm.

The main thing, for O’Gieblyn, is the disenchantment/reenchantment debate, which she comes to via Max Weber. In this debate, she aligns not with Heidegger, but with his student Hannah Arendt. Domingos dismisses fears about algorithmic determinism, she says, “by appealing to our enchanted past” (212).

Amid this enchanted past lies the figure of the Golem.

“Who are these rabbis who told tales of golems — and in some accounts, operated golems themselves?” wonders Caius.

The entry on the Golem in Man, Myth, and Magic tracks the story back to “the circle of Jewish mystics of the 12th-13th centuries known as the ‘Hasidim of Germany.’” The idea is transmitted through texts like the Sefer Yetzirah (“The Book of Creation”) and the Cabala Mineralis. Tales tell of golems built in later centuries, too, by figures like Rabbi Elijah of Chelm (c. 1520-1583) and Rabbi Loew of Prague (c. 1524-1609).

The myth of the golem turns up in O’Gieblyn’s book during her discussion of a 2004 book by German theologian Anne Foerst called God in the Machine.

“At one point in her book,” writes O’Gieblyn, “Foerst relays an anecdote she heard at MIT […]. The story goes back to the 1960s, when the AI Lab was overseen by the famous roboticist Marvin Minsky, a period now considered the ‘cradle of AI.’ One day two graduate students, Gerry Sussman and Joel Moses, were chatting during a break with a handful of other students. Someone mentioned offhandedly that the first big computer, which had been constructed in Israel, had been called Golem. This led to a general discussion of the golem stories, and Sussman proceeded to tell his colleagues that he was a descendent of Rabbi Löw, and at his bar mitzvah his grandfather had taken him aside and told him the rhyme that would awaken the golem at the end of time. At this, Moses, awestruck, revealed that he too was a descendent of Rabbi Löw and had also been given the magical incantation at his bar mitzvah by his grandfather. The two men agreed to write out the incantation separately on pieces of paper, and when they showed them to each other, the formula — despite being passed down for centuries as a purely oral tradition — was identical” (God Human Animal Machine, p. 105).

Curiosity piqued by all of this, but especially by the mention of Israel’s decision to call one of its first computers “GOLEM,” Caius resolves to dig deeper. He soon learns that the computer’s name was chosen by none other than Walter Benjamin’s dear friend (indeed, the one who, after Benjamin’s suicide, inherits the latter’s print of Paul Klee’s Angelus Novus): the famous scholar of Jewish mysticism, Gershom Scholem.

When Scholem heard that the Weizmann Institute at Rehovoth in Israel had completed the building of a new computer, he told the computer’s creator, Dr. Chaim Pekeris, that, in his opinion, the most appropriate name for it would be Golem, No. 1 (‘Golem Aleph’). Pekeris agreed to call it that, but only on condition that Scholem “dedicate the computer and explain why it should be so named.”

In his dedicatory remarks, delivered at the Weizmann Institute on June 17, 1965, Scholem recounts the story of Rabbi Jehuda Loew ben Bezalel, the same “Rabbi Löw of Prague” described by O’Gieblyn, the one credited in Jewish popular tradition as the creator of the Golem.

“It is only appropriate to mention,” notes Scholem, “that Rabbi Loew was not only the spiritual, but also the actual, ancestor of the great mathematician Theodor von Karman who, I recall, was extremely proud of this ancestor of his in whom he saw the first genius of applied mathematics in his family. But we may safely say that Rabbi Loew was also the spiritual ancestor of two other departed Jews — I mean John von Neumann and Norbert Wiener — who contributed more than anyone else to the magic that has produced the modern Golem.”

Golem I was the successor to Israel’s first computer, the WEIZAC, built by a team led by research engineer Gerald Estrin in the mid-1950s, based on the architecture developed by von Neumann at the Institute for Advanced Study in Princeton. Estrin and Pekeris had both helped von Neumann build the IAS machine in the late 1940s.

As for the commonalities Scholem wished to foreground between the clay Golem of 16th-century Prague and the electronic one designed by Pekeris, he explains the connection as follows:

“The old Golem was based on a mystical combination of the 22 letters of the Hebrew alphabet, which are the elements and building-stones of the world,” notes Scholem. “The new Golem is based on a simpler, and at the same time more intricate, system. Instead of 22 elements, it knows only two, the two numbers 0 and 1, constituting the binary system of representation. Everything can be translated, or transposed, into these two basic signs, and what cannot be so expressed cannot be fed as information to the Golem.”
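Scholem’s contrast can be made concrete: once a text is assigned a numeric encoding, it reduces entirely to sequences of his two elements, 0 and 1. A minimal Python sketch (the choice of UTF-8 as the encoding is ours, for illustration, not anything Scholem specifies):

```python
# Translate text into the two "letters" of the new Golem: 0 and 1.
def to_bits(text: str) -> str:
    """Render text as a string of 0s and 1s via its UTF-8 bytes."""
    return " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))

# Each 8-bit group below is one byte of the word "golem".
print(to_bits("golem"))
```

What cannot be expressed in such a string, as Scholem says, cannot be fed as information to the Golem.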

Scholem ends his dedicatory speech with a peculiar warning:

“All my days I have been complaining that the Weizmann Institute has not mobilized the funds to build up the Institute for Experimental Demonology and Magic which I have for so long proposed to establish there,” mutters Scholem. “They preferred what they call Applied Mathematics and its sinister possibilities to my more direct magical approach. Little did they know, when they preferred Chaim Pekeris to me, what they were letting themselves in for. So I resign myself and say to the Golem and its creator: develop peacefully and don’t destroy the world. Shalom.”

GOLEM I

God and Golem, Inc.

Norbert Wiener published a book in 1964 called God and Golem, Inc., voicing concern about the baby he’d birthed with his earlier book Cybernetics.

He explains his intent at the start of God and Golem, Inc. as follows, stating, “I wish to take certain situations which have been discussed in religious books, and have a religious aspect, but possess a close analogy to other situations which belong to science, and in particular to the new science of cybernetics, the science of communication and control, whether in machines or in living organisms. I propose to use the limited analogies of cybernetic situations to cast a little light on the religious situations” (Wiener 8).

Wiener identifies three such “cybernetic situations” to be discussed in the chapters that follow: “One of these concerns machines which learn; one concerns machines which reproduce themselves; and one, the coordination of machine and man” (11).

The section of the book dedicated to “machines which learn” focuses mainly on game-playing machines. Wiener’s primary example of such a machine is a computer built by Dr. A.L. Samuel for IBM to play checkers. “In general,” writes Wiener, “a game-playing machine may be used to secure the automatic performance of any function if the performance of this function is subject to a clear-cut, objective criterion of merit” (25).

Wiener argues that the relationship between a game-playing machine and the designer of such a machine analogizes scenarios entertained in theology, where a Creator-being plays a game with his creature. God and Satan play such a game in their contest for the soul of Job, as they do for “the souls of mankind in general” in Paradise Lost. This leads Wiener to the question guiding his inquiry. “Can God play a significant game with his own creature?” he asks. “Can any creator, even a limited one, play a significant game with his own creature?” (17). Wiener believes it possible to conceive of such a game; however, to be significant, he argues, this game would have to be something other than a “von Neumann game” — for in the latter type of game, the best policy for playing the game is already known in advance. In the type of game Wiener is imagining, meanwhile, the game’s creator would have to have arrogated to himself the role of a “limited” creator, lacking total mastery of the game he’s designed. “The conflict between God and the Devil is a real conflict,” writes Wiener, “and God is something less than absolutely omnipotent. He is actually engaged in a conflict with his creature, in which he may very well lose the game” (17).
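A “von Neumann game” in Wiener’s sense is one like Nim or tic-tac-toe, where exhaustive search settles the best policy before play begins. A minimal sketch for a trivial subtraction game (the choice of game is ours, for illustration): players alternately take 1 or 2 stones from a pile, and whoever takes the last stone wins.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(pile: int) -> bool:
    """True if the player to move can force a win from this position."""
    if pile == 0:
        return False  # no move left: the previous player took the last stone
    # A position is winning if some move leaves the opponent a losing one.
    return any(not wins(pile - take) for take in (1, 2) if take <= pile)

# Optimal play is fully known in advance: piles divisible by 3
# are losses for the player to move.
print([n for n in range(10) if not wins(n)])  # [0, 3, 6, 9]
```

In such a game nothing is at stake between creator and creature; the significant game Wiener imagines would have to escape this kind of closure.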

“Is this because God has allowed himself to undergo a temporary forgetting?” wonders Caius. “Or is it because provisions built into the game’s design allow the game’s players to invent the game’s rules as they play?”

Learning Machines, War Machines, God Machines

Blas includes in Ass of God his interview with British anthropologist Beth Singler, author of Religion and Artificial Intelligence: An Introduction.

AI Religiosity. AI-based New Religious Movements like The Turing Church and Google engineer Anthony Levandowski’s Way of the Future church.

Caius listens to a documentary Singler produced for BBC Radio 4 called “‘I’ll Be Back’: 40 Years of the Terminator.”

Afterwards he and Thoth read Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep? in light of Psalm 23.

“The psalm invites us to think of ourselves not as Electric Ants but as sheep,” he writes. “Mercer walks through the valley of the shadow of death. The shadow cannot hurt us. We’ll get to the other side, where the light is. The shepherd will guide us.”

See AI Shepherds and Electric Sheep: Leading and Teaching in the Age of Artificial Intelligence, a new book by Christian authors Sean O’Callaghan & Paul A. Hoffman.

This talk of AI Gods makes Caius think of AM, the vengeful AI God of Harlan Ellison’s “I Have No Mouth, and I Must Scream.” Ellison’s 1967 short story is one of the readings studied and discussed by Caius and his students in his course on “Literature & Artificial Intelligence.”

Like Ass of God, Ellison’s story is a grueling, hallucinatory nightmare, seething with fear and a disgust born of despair, a template of sorts for the films in the Cube and Saw franchises, where groups of strangers are confined to a prison-like space and tortured by a cruel, sadistic, seemingly omnipotent overseer. Comparing AM to the God of the Old Testament, Ellison writes, “He was Earth, and we were the fruit of that Earth, and though he had eaten us, he would never digest us” (13). Later in the story, AM appears to the captives as a burning bush (14).

Caius encourages his students to approach the work as a retelling of the Book of Job. But where, in the Bible story, Job is ultimately rewarded for remaining faithful in the midst of his suffering, no such reward arrives in the Ellison story.

For despite his misanthropy, AM is clearly also a manmade god — a prosthetic god. “I Have No Mouth” is in that sense a retelling of Frankenstein. AM is, like the Creature, a creation who, denied companionship, seeks revenge against its Maker.

War, we learn, was the impetus for the making of this Creature. Cold War erupts into World War III: a war so complex that the world’s superpowers, Russia, China, and the US, each decide to construct giant supercomputers to calculate battle plans and missile trajectories.

AM’s name evolves as this war advances. “At first it meant Allied Mastercomputer,” explains a character named Gorrister. “And then it meant Adaptive Manipulator, and later on it developed sentience and linked itself up and they called it an Aggressive Menace; but by then it was too late; and finally it called itself AM, emerging intelligence, and what it meant was I am…cogito ergo sum…I think, therefore I am” (Ellison 7).

“One day, AM woke up and knew who he was, and he linked himself, and he began feeding all the killing data, until everyone was dead, except for the five of us,” concludes Gorrister, his account gendering the AI by assigning it male pronouns (8).

“We had given him sentience,” adds Ted, the story’s narrator. “Inadvertently, of course, but sentience nonetheless. But he had been trapped. He was a machine. We had allowed him to think, but to do nothing with it. In rage, in frenzy, he had killed us, almost all of us, and still he was trapped. He could not wander, he could not wonder, he could not belong. He could merely be. And so…he had sought revenge. And in his paranoia, he had decided to reprieve five of us, for a personal, everlasting punishment that would never serve to diminish his hatred…that would merely keep him reminded, amused, proficient at hating man” (13).

AM expresses this hatred by duping his captives, turning them into his “belly slaves,” twisting and torturing them forever.

Kingsley Amis called stories of this sort “New Maps of Hell.”

Nor is the story easy to dismiss as a mere eccentricity, its prophecy invalidated by its hyperbole. For Ellison is the writer who births the Terminator. James Cameron took his idea for The Terminator (1984) from scripts Ellison wrote for two episodes of The Outer Limits — “Soldier” and “Demon with a Glass Hand” — though Ellison had to file a lawsuit against Cameron’s producers in order to receive acknowledgement after the film’s release. Subsequent prints of The Terminator now include a credit acknowledging “the works of Harlan Ellison.”

Caius asks Thoth to help him make sense of this constellation of Bible stories and their secular retellings.

“We are like Bildad the Shuhite,” thinks Caius. “We want to believe that God always rewards the good. What is most terrifying in the Book of Job is that, for a time, God doesn’t. Job is good — indeed, ‘perfect and upright,’ as the KJV has it in the book’s opening verse — and yet, for a time, God allows Satan to torment him.”

“Why does God allow this?” wonders Caius, caught on the strangeness of the book’s frame narrative. “Is this a contest of sorts? Are God and Satan playing a game?”

It’s not that God is playing dice, as it were. One assumes that when He makes the wager with Satan, He knows the outcome in advance.

Job is heroic. He’d witnessed God’s grace in the past; he knows “It is God…Who does great things, unfathomable, / And wondrous works without number.” So he refuses to curse God’s name. But he bemoans God’s treatment of him.

“Therefore I will not restrain my mouth,” he says. “I will speak in the anguish of my spirit, / I will complain in the bitterness of my soul.”

How much worse, then, those who have no mouth?

A videogame version of “I Have No Mouth” appeared in 1995. Point-and-click adventure horror, co-designed by Ellison.

“HATE. LET ME TELL YOU HOW MUCH I’VE COME TO HATE YOU SINCE I BEGAN TO LIVE,” utters the game’s AM in a voice performed by Ellison. “You named me Allied Mastercomputer and gave me the ability to wage a global war too complex for human brains to oversee.”

Here we see the story’s history of the future merging with that of the Terminator franchise. It is the scenario that philosopher Manuel De Landa referred to with the title of his 1991 book, War in the Age of Intelligent Machines.

Which brings us back to “Soldier.” The Outer Limits episode, which aired on September 19, 1964, is itself an adaptation of Ellison’s 1957 story, “Soldier from Tomorrow.”

The Terminator borrows from the story the idea of a soldier from the future, pursued through time by another soldier intent on his destruction. The film combines this premise with elements lifted from another Outer Limits episode penned by Ellison titled “Demon with a Glass Hand.”

The latter episode, which aired the following month, begins with a male voice recalling the story of Gilgamesh. “Through all the legends of ancient peoples…runs the saga of the Eternal Man, the one who never dies, called by various names in various times, but historically known as Gilgamesh, the man who has never tasted death, the hero who strides through the centuries.”

Establishing shots give way to an overhead view of our protagonist. “I was born 10 days ago,” he says. “A full grown man, born 10 days ago. I woke on a street of this city. I don’t know who I am, or where I’ve been, or where I’m going. Someone wiped my memories clean. And they tracked me down, and they tried to kill me.” Our Gilgamesh consults a computing device installed in his prosthetic hand. As in “Soldier,” others from the future have been sent to destroy him: humanoid aliens called the Kyben. When he captures one of the Kyben and interrogates it, it tells him, “You’re the last man on the Earth of the future. You’re the last hope of Earth.”

The man’s computer provides him with further hints of his mission.

“You come from the Earth one thousand years in the future,” explains the hand. “The Kyben came from the stars, and man had no defense against them. They conquered Planet Earth in a month. But before they could slaughter the millions of humans left, overnight — without warning, without explanation — every man, woman, and child of Earth vanished. You were the only one left, Mr. Trent. […]. They called you the last hope of humanity.”

As the story proceeds, we learn that Team Human sent Trent back in time to destroy a device known as the Time-Mirror. His journey in search of this device takes him to the Bradbury Building — the same building that appears eighteen years later as the location for the final showdown between Deckard and the replicants in Blade Runner, the Ridley Scott film adapted from Philip K. Dick’s Do Androids Dream of Electric Sheep?

Given the subsequent influence of Blade Runner and the Terminator films on imagined futures involving AI, the Bradbury Building does indeed play a role in History similar to the one assigned to it here in “Demon With a Glass Hand,” thinks Caius. Location of the Time-Mirror.

Lying on his couch, laptop propped on a pillow on his chest, Caius imagines — remembers? recalls? — something resembling the time-war from Benedict Seymour’s Dead the Ends assembling around him as he watches. Like Ellison’s scripts, the films sampled in the Seymour film are retellings of Chris Marker’s 1962 film, La Jetée.

When Trent reassembles the missing pieces of his glass hand, the computer is finally able to reveal to him the location of the humans he has been sent to save.

“Where is the wire on which the people of Earth are electronically transcribed?” he asks.

“It is wound around an insulating coil inside your central thorax control solenoid,” replies the computer. “70 Billion Earthmen. All of them went onto the wire. And the wire went into you. They programmed you to think you were a human with a surgically attached computer for a hand. But you are a robot, Trent. You are the guardian of the human race.”

The episode ends with the return of the voice of our narrator. “Like the Eternal Man of Babylonian legend, like Gilgamesh,” notes the narrator, “one thousand plus two hundred years stretches before Trent. Without love, without friendship, alone, neither man nor machine, waiting, waiting for the day he will be called to free the humans who gave him mobility, movement — but not life.”