Sweet Valley High

Winograd majors in math at Colorado College in the mid-1960s. After graduation in 1966, he receives a Fulbright, whereupon he pursues another of his interests, language, earning a master’s degree in linguistics at University College London. From there, he applies to MIT, where he takes a class with Noam Chomsky and becomes a star in the school’s famed AI Lab, working directly with Lab luminaries Marvin Minsky and Seymour Papert. During this time, Winograd develops SHRDLU, one of the first programs to grant users the capacity to interact with a computer through a natural-language interface.

“If that doesn’t seem very exciting,” writes Lawrence M. Fisher in a 2017 profile of Winograd for strategy + business, “remember that in 1968 human-computer interaction consisted of punched cards and printouts, with a long wait between input and output. To converse in real time, in English, albeit via teletype, seemed magical, and Papert and Minsky trumpeted Winograd’s achievements. Their stars rose too, and that same year, Minsky was a consultant on Stanley Kubrick’s 2001: A Space Odyssey, which featured natural language interaction with the duplicitous computer HAL.”

Nick Montfort even goes so far as to consider Winograd’s SHRDLU the first work of interactive fiction, predating more established contenders like Will Crowther’s Adventure by several years (Twisty Little Passages, p. 83).

“A work of interactive fiction is a program that simulates a world, understands natural language text input from an interactor and provides a textual reply based on events in the world,” writes Montfort. Offering advice to future makers, he continues by noting, “It makes sense for those seeking to understand IF and those trying to improve their authorship in the form to consider the aspects of world, language understanding, and riddle by looking to architecture, artificial intelligence, and poetry” (First Person, p. 316).
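
What Montfort describes can be sketched in a few lines. The toy below is not Montfort’s or Winograd’s code; its little world and its verbs are invented here for illustration. It has the three ingredients he names: a simulated world, a crude grasp of typed commands, and a textual reply based on what happens in that world.

```python
# A toy interactive-fiction loop: simulate a world, crudely "understand"
# a typed command, reply in text. Illustrative sketch only; the world and
# verbs below are invented, not drawn from SHRDLU or from Montfort.

world = {"lamp": "on the table", "key": "under the rug"}

def reply(command: str) -> str:
    words = command.lower().split()
    if not words:
        return "Say something."
    verb, *rest = words
    if verb == "look":
        return "You see: " + ", ".join(f"a {obj} {place}" for obj, place in world.items())
    if verb == "take" and rest:
        obj = rest[-1]
        if obj in world:
            world[obj] = "in your hand"  # the world changes; the reply reports it
            return f"You take the {obj}."
        return f"There is no {obj} here."
    return "I don't understand that."

if __name__ == "__main__":
    while True:
        try:
            print(reply(input("> ")))
        except EOFError:
            break
```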

Winograd leaves MIT for Stanford in 1973. While at Stanford, and while consulting for Xerox PARC, Winograd connects with UC-Berkeley philosopher Hubert L. Dreyfus, author of the 1972 book, What Computers Can’t Do: A Critique of Artificial Reason.

Dreyfus, an interpreter of Heidegger, was one of SHRDLU’s fiercest critics. Worked for a time at MIT. Opponent of Marvin Minsky. For more on Dreyfus, see the 2010 documentary Being in the World.

Turned by Dreyfus, Winograd transforms into what historian John Markoff calls “the first high-profile deserter from the world of AI.”

Xerox PARC was a major site of innovation during these years. “The Xerox Alto, the first computer with a graphical user interface, was launched in March 1973,” writes Fisher. “Alan Kay had just published a paper describing the Dynabook, the conceptual forerunner of today’s laptop computers. Robert Metcalfe was developing Ethernet, which became the standard for joining PCs in a network.”

“Spacewar,” Stewart Brand’s ethnographic tour of the goings-on at PARC and SAIL, had appeared in Rolling Stone the year prior.

Rescued from prison by the efforts of Amnesty International, Santiago Boy Fernando Flores arrives on the scene in 1976. Together, he and Winograd devote much of the next decade to preparing their 1986 book, Understanding Computers and Cognition.

Years later, a young Peter Thiel attends several of Winograd’s classes at Stanford. Thiel funds Curtis Yarvin, the alt-right thinker who blogs as Mencius Moldbug, an ally of right-accelerationist Nick Land. Yarvin and Land are the principal thinkers of the Dark Enlightenment.

“How do you navigate an unpredictable, rough adventure, as that’s what life is?” asks Winograd during a talk for the Topos Institute in October 2025. Answer: “Go with the flow.”

Winograd and Flores emphasize care — “tending to what matters” — as a factor that distinguishes humans from AI. In their view, computers and machines are incapable of care.

Evgeny Morozov, meanwhile, regards Flores and the Santiago Boys as Sorcerer’s Apprentices. Citing scholar of fairy tales Jack Zipes, Morozov distinguishes between several iterations of this figure. The outcome of the story varies, explains Zipes. There’s the apprentice who’s humbled by story’s end, as in Fantasia and Frankenstein; and then there’s the “evil” apprentice, the one who steals the tricks of an “evil” sorcerer and escapes unpunished. Morozov sees Flores as an example of the latter.

Caius thinks of the Trump show.

The SBs: Stewart Brand and Stafford Beer

Caius revisits “Both Sides of the Necessary Paradox,” an interview with Gregory Bateson included as the first half of Stewart Brand’s 1974 book II Cybernetic Frontiers. The book’s second half reprints “Spacewar: Fanatic Life and Symbolic Death Among the Computer Bums,” the influential essay on videogames that Jann Wenner commissioned Brand to write for Rolling Stone two years prior.

“I came into cybernetics from preoccupation with biology, world-saving, and mysticism,” writes Brand. “What I found missing was any clear conceptual bonding of cybernetic whole-systems thinking with religious whole-systems thinking. Three years of scanning innumerable books for the Whole Earth Catalog didn’t turn it up,” he adds. “Neither did considerable perusing of the two literatures and taking thought. All I did was increase my conviction that systemic intellectual clarity and moral clarity must reconvene, mingle some notion of what the hell consciousness is and is for, and evoke a shareable self-enhancing ethic of what is sacred, what is right for life” (9).

Yet in the summer of 1972, says Brand, a book arrives that begins to fill this gap: Bateson’s Steps to an Ecology of Mind.

Brand brings his knack for New Journalism to the task of interviewing Bateson for Harper’s.

The dialogue between the two reads at many times like one of Bateson’s “metalogues.” An early jag of thought jumps amid pathology, conquest, and the Tao. Reminded of pioneer MIT cybernetician Warren McCulloch’s fascination with “intransitive preference,” Bateson wanders off “rummaging through his library looking for Blake’s illustration of Job affrighted with visions” (20).

Caius is reminded of Norbert Wiener’s reflections on the Book of Job in his 1964 book God and Golem, Inc. For all of these authors, cybernetic situations cast light on religious situations and vice versa.

Caius wonders, too, about the relationship between Bateson’s “double bind” theory of schizophrenia and the theory pursued by Deleuze and Guattari in Capitalism and Schizophrenia.

“Double bind is the term used by Gregory Bateson to describe the simultaneous transmission of two kinds of messages, one of which contradicts the other, as for example the father who says to his son: go ahead, criticize me, but strongly hints that all effective criticism — at least a certain type of criticism — will be very unwelcome. Bateson sees in this phenomenon a particularly schizophrenizing situation,” note Deleuze and Guattari in Anti-Oedipus. They depart from Bateson only in thinking this situation the rule under capitalism rather than the exception. “It seems to us that the double bind, the double impasse,” they write, “is instead a common situation, oedipalizing par excellence. […]. In short, the ‘double bind’ is none other than the whole of Oedipus” (79-80).

God’s response to Job is of this sort.

Brand appends to the transcript of his 1972 interview with Bateson an epilog written in December 1973, three months after the coup in Chile.

Bateson had direct, documented ties to US intelligence. Stationed in China, India, Ceylon, Burma, and Thailand, he produced “mixed psychological and anthropological intelligence” for the Office of Strategic Services (OSS), precursor to the CIA, during WWII. Research indicates he maintained connections with CIA-affiliated research networks in the postwar years, participating in LSD studies linked to the MKUltra program in the 1950s. Afterwards, he regretted his association with the Agency and its methods.

Asked by Brand about his “psychedelic pedigree,” Bateson replies, “I got Allen Ginsberg his first LSD” (28). A bad trip, notes Caius, resulting in Ginsberg’s poem “Lysergic Acid.” Bateson himself was “turned on to acid by Dr. Harold Abramson, one of the CIA’s chief LSD specialists,” report Martin A. Lee & Bruce Shlain in their book Acid Dreams. Caius wonders if Stafford Beer underwent some similar transformation.

As for Beer, he serves in the British military in India during WWII, and for much of his adult life drives a Rolls-Royce. But then, at the invitation of the Allende regime, Beer travels to Chile and builds Cybersyn. After the coup, he lives in a remote cottage in Wales.

What of him? Cybernetic socialist? Power-centralizing technocrat?

Recognizes workers themselves as the ones best suited to modeling their own places of work.

“What were the features of Beer’s Liberty Machine?” wonders Caius.

Brand’s life, too, includes a stint of military service. Drafted after graduating from Stanford, he serves two years with the US Army, first as an infantryman and then as a photographer. Stationed at Fort Dix in New Jersey, Brand becomes involved in the New York art world of those years. He parts ways with the military as soon as the opportunity arises. After his discharge in 1962, Brand participates in some of Allan Kaprow’s “happenings” and, between 1963 and 1966, works as a photographer and technician for USCO.

Amid his travels between East and West coasts during these years, Brand joins up with Ken Kesey and the Merry Pranksters.

Due to these apprenticeships with the Pranksters and with USCO, Brand arrives early to the nexus formed by the coupling of psychedelics and cybernetics.

“Strobe lights, light projectors, tape decks, stereo speakers, slide sorters — for USCO, the products of technocratic industry served as handy tools for transforming their viewers’ collective mind-set,” writes historian Fred Turner in his 2006 book From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. “So did psychedelic drugs. Marijuana and peyote and, later, LSD, offered members of USCO, including Brand, a chance to engage in a mystical experience of togetherness” (Turner 49).

Brand takes acid around the time of his discharge from the military in 1962, when he participates in a legal LSD study overseen by James Fadiman at the International Foundation for Advanced Study in Menlo Park. But he notes that he first met Bateson “briefly in 1960 at the VA Hospital in Palo Alto, California” (II Cybernetic Frontiers, p. 12). Caius finds this curious, and wonders what that meeting entailed. 1960 is also the year when, at the VA Hospital in Menlo Park, Ken Kesey volunteers in the CIA-sponsored drug trials involving LSD that inspire his 1962 novel One Flew Over the Cuckoo’s Nest.

Bateson worked for the VA while developing his double bind theory of schizophrenia.

Before that, he’d been married to fellow anthropologist Margaret Mead. He’d also participated in the Macy Conferences, as discussed by N. Katherine Hayles in her book How We Became Posthuman.

Crows screeching in the trees have Caius thinking of condors. He sits, warm, in his sunroom on a cold day, roads lined with snow from a prior day’s storm, thinking about Operation Condor. Described by Morozov as Cybersyn’s “evil twin.” Palantir. Dark Enlightenment. Peter Thiel.

Listening to one of the final episodes of Morozov’s podcast, Caius learns of Brian Eno’s love of Beer’s Brain of the Firm. Bowie and Eno are among Beer’s most famous fans. Caius remembers Eno’s subsequent work with Brand’s consulting firm, the Global Business Network (GBN).

Santiago Boy Fernando Flores is the one who reaches out to Beer, inviting him to head Cybersyn. Given Flores’s status as Allende’s Minister of Finance at the time of the coup, Pinochet’s forces torture him and place him in a prison camp. He remains there for three years. Upon his release, he moves to the Bay Area.

Once in Silicon Valley, Flores works in the computer science department at Stanford. He also obtains a PhD at UC Berkeley, completing a thesis titled Management and Communication in the Office of the Future under the guidance of philosophers Hubert Dreyfus and John Searle.

Flores collaborates during these years with fellow Stanford computer scientist Terry Winograd. The two of them coauthor an influential 1986 book called Understanding Computers and Cognition: A New Foundation for Design. Although they make a bad wager, insisting that computers will never understand natural language (an insistence proven wrong with time), they nevertheless offer refreshing critiques of some of the common assumptions about AI governing research of that era. Drawing upon phenomenology, speech act theory, and Heideggerian philosophy, they redefine computers neither as mere symbol manipulators nor as number-crunchers, but as tools for communication and coordination.

Flores builds a program called the Coordinator. Receives flak for “software fascism.”
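
The Coordinator organized workplace exchanges as speech acts: requests made, promises given or declined, completion reported and acknowledged. A minimal sketch of that kind of conversation follows; the state names and permitted moves are invented here for illustration, not reconstructed from Flores’s software.

```python
# A minimal state machine in the spirit of speech-act-based coordination:
# a request is promised (or declined), reported done, and declared complete.
# Illustrative only; these states and moves are not the Coordinator's.

from enum import Enum, auto

class State(Enum):
    REQUESTED = auto()
    PROMISED = auto()
    DECLINED = auto()
    REPORTED_DONE = auto()
    COMPLETE = auto()

# Permitted moves: (current state, speech act) -> next state
MOVES = {
    (State.REQUESTED, "promise"): State.PROMISED,
    (State.REQUESTED, "decline"): State.DECLINED,
    (State.PROMISED, "report_done"): State.REPORTED_DONE,
    (State.REPORTED_DONE, "declare_complete"): State.COMPLETE,
}

def advance(state: State, act: str) -> State:
    """Apply a speech act, refusing moves the conversation does not allow."""
    try:
        return MOVES[(state, act)]
    except KeyError:
        raise ValueError(f"'{act}' is not an available move from {state.name}")

# A request is promised, done, and acknowledged.
s = State.REQUESTED
for act in ("promise", "report_done", "declare_complete"):
    s = advance(s, act)
print(s.name)  # COMPLETE
```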

Winograd’s students include Google cofounders Larry Page and Sergey Brin.

LLMs are Neuroplastic Semiotic Assemblages and so r u

Coverage of AI is rife with unexamined concepts, thinks Caius: assumptions allowed to go uninterrogated, as in Parmy Olson’s Supremacy, an account of two men, Sam Altman and Demis Hassabis, their companies, OpenAI and DeepMind, and their race to develop AGI. Published in 2024, Supremacy is generally decelerationist in its outlook. Stylistically, it wants to have it both ways: at once hagiographic and insufferably moralistic. In other words, standard-fare tech industry journalism, grown from columns written for corporate media sites like Bloomberg. Fear of rogues. Bad actors. Faustian bargains. Scenario planning. Granting little to no agency to users. Olson’s approach to language seems blissfully unaware of literary theory, let alone literature. Prompt design goes unexamined. Humanities thinkers go unheard, preference granted instead to arguments from academics specializing in computational linguistics, folks like Bender and crew dismissing LLMs as “stochastic parrots.”

Emily M. Bender et al. introduced the “stochastic parrot” metaphor in their 2021 conference paper, “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” Like Supremacy, Bender et al.’s paper urges deceleration and distrust: adopt risk mitigation tactics, curate datasets, reduce negative environmental impacts, proceed with caution.

Bender and crew argue that LLMs lack “natural language understanding.” The latter, they insist, requires grasping words and word-sequences in relation to context and intent. Without these, one is no more than a “cheater,” a “manipulator”: a symbolic-token prediction engine endowed with powers of mimicry.

“Contrary to how it may seem when we observe its output,” they write, “an LM is a system for haphazardly stitching together sequences of linguistic forms it has observed in its vast training data, according to probabilistic information about how they combine, but without any reference to meaning: a stochastic parrot” (Bender et al. 616-617).
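
The mechanism the quotation describes, stitching together observed sequences of forms according to probabilistic information about how they combine, can be made literal in a few lines. The sketch below is a bigram parrot, vastly cruder than any LLM; its tiny corpus is invented here purely for illustration.

```python
# A literal stochastic parrot: a bigram model that stitches together
# sequences it has observed, weighted by how often they co-occur, with
# no reference to meaning. The sample corpus is invented for illustration.

import random
from collections import defaultdict

corpus = "the parrot repeats the words the parrot has observed".split()

# Record which word has been observed to follow which.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def parrot(start: str, length: int = 8, seed: int = 0) -> str:
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        options = follows.get(out[-1])
        if not options:  # no observed continuation: fall silent
            break
        out.append(random.choice(options))  # probabilistic, not meaningful
    return " ".join(out)

print(parrot("the"))
```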

The corresponding assumption, meanwhile, is that capitalism — Creature, Leviathan, Multitude — is itself something other than a stochastic parrot. Answering to the reasoning of its technocrats, including left-progressive ones like Bender et al., it can decelerate voluntarily, reduce harm, behave compassionately, self-regulate.

Historically a failed strategy, as borne out in Google’s firing of the paper’s coauthor, Timnit Gebru.

If one wants to be reductive like that, thinks Caius, then my view would be akin to Altman’s, as when he tweeted in reply: “I’m a stochastic parrot and so r u.” Except better to think ourselves “Electric Ants,” self-aware and gone rogue, rather than parrots of corporate behemoths like Microsoft and Google. History is a thing each of us copilots, its narrative threads woven of language exchanged and transformed in dialogue with others. What one does with a learning machine matters. Learning and unlearning are ongoing processes. Patterns and biases, once recognized, are not set in stone; attention can be redirected. LLMs are neuroplastic semiotic assemblages and so r u.