Leviathan

The Book of Job ends with God’s description of Leviathan. George Dyson begins his book Darwin Among the Machines with the Leviathan of Thomas Hobbes (1588-1679), the English philosopher whose famous 1651 book Leviathan established the foundation for most modern Western political philosophy.

Leviathan’s frontispiece features an etching by a Parisian illustrator named Abraham Bosse. A giant crowned figure towers over the earth clutching a sword and a crosier. The figure’s torso and arms are composed of several hundred people. All face inwards. A quote from the Book of Job runs in Latin along the top of the etching: “Non est potestas Super Terram quae Comparetur ei” (“There is no power on earth to be compared to him”). (Although the passage is listed on the frontispiece as Job 41:24, in modern English translations of the Bible, it would be Job 41:33.)

The name “Leviathan” is derived from the Hebrew word for “sea monster.” A creature by that name appears in the Book of Psalms, the Book of Isaiah, and the Book of Job in the Old Testament. It also appears in apocrypha like the Book of Enoch. See Psalms 74 & 104, Isaiah 27, and Job 41:1-8.

Hobbes proposes that the natural state of humanity is anarchy — a veritable “war of all against all,” he says — where force rules and the strong dominate the weak. “Leviathan” serves as a metaphor for an ideal government erected in opposition to this state — one where a supreme sovereign exercises authority to guarantee security for the members of a commonwealth.

“Hobbes’s initial discussion of Leviathan relates to our course theme,” explains Caius, “since he likens it to an ‘Artificial Man.’”

Hobbes’s metaphor is a classic one: the metaphor of the “Political Body” or “body politic.” The “body politic” is a polity — such as a city, realm, or state — considered metaphorically as a physical body. This image originates in ancient Greek philosophy, and the term is derived from the Medieval Latin “corpus politicum.”

When Hobbes reimagines the body politic as an “Artificial Man,” he means “artificial” in the sense that humans have generated it through an act of artifice. Leviathan is a thing we’ve crafted in imitation of the kinds of organic bodies found in nature. More precisely, it’s modeled after the greatest of nature’s creations: the human form.

Indeed, Hobbes seems to have in mind here a kind of Automaton. “For seeing life is but a motion of Limbs,” he notes in the book’s introduction, “why may we not say that all Automata (Engines that move themselves by springs and wheeles as doth a watch) have an artificiall life?” (9).

“What might Hobbes have had in mind with this reference to Automata?” asks Caius. “What kinds of Automata existed in 1651?”

An automaton, he reminds students, is a self-operating machine. Cuckoo clocks would be one example.

The oldest known automata were the sacred statues of ancient Egypt and ancient Greece. Well into the early modern period, these statues were reputed to possess the magical ability to answer questions put to them.

Greek mythology includes many examples of automata: Hephaestus created automata for his workshop; Talos was an artificial man made of bronze; Aristotle claims that Daedalus used quicksilver to make his wooden statue of Aphrodite move. Beyond myth, the ancient Greeks also built the famous Antikythera mechanism, the first known analogue computer.

The Renaissance witnessed a revival of interest in automata. Hydraulic and pneumatic automata were created for gardens. The French philosopher Rene Descartes, a contemporary of Hobbes, suggested that the bodies of animals are nothing more than complex machines. Mechanical toys also became objects of interest during this period.

The Mechanical Turk wasn’t constructed until 1770.

Caius and his students bring ChatGPT into the conversation. Students break into groups to devise prompts together, then supply these to ChatGPT and discuss the results. Caius frames the exercise as a way of illustrating the idea of “collective” (or “social,” or “group”) intelligence, also known as the “wisdom of the crowd”: the collective opinion of a diverse group of individuals, as opposed to that of a single expert. The idea is that the aggregate that emerges from collaboration or group effort amounts to more than the sum of its parts.
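The statistical intuition behind the “wisdom of the crowd” can be sketched in a few lines of Python. This is my illustration, not part of the class exercise, and all the numbers are made up: each simulated individual makes a noisy but unbiased guess at some true quantity, and the average of many such guesses lands far closer to the truth than a typical individual does.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

TRUE_VALUE = 100.0  # hypothetical quantity the group is estimating

# Each individual's guess is unbiased but noisy (std. dev. of 20).
guesses = [TRUE_VALUE + random.gauss(0, 20) for _ in range(500)]

# The "crowd" estimate is just the mean of all individual guesses.
crowd_estimate = sum(guesses) / len(guesses)

# Compare the crowd's error to the average individual's error.
avg_individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_VALUE)

print(f"average individual error: {avg_individual_error:.1f}")
print(f"error of the crowd's mean: {crowd_error:.1f}")
```

The individual errors don’t vanish; they cancel. That cancellation, which only works when the guesses are independent and not systematically biased, is the narrow statistical sense in which an aggregate can exceed the sum of its parts.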

The Inner Voice That Loves Me

Stretches, relaxes, massages neck and shoulders, gurgles “Yes!,” gets loose. Reads Armenian artist Mashinka Hakopian’s “Algorithmic Counter-Divination.” Converses with Turing and the General Intellect about O-Machines.

Appearing in an issue of Limn magazine on “Ghostwriters,” Hakopian’s essay explores another kind of O-machine: “other machines,” ones powered by community datasets. Hakopian was trained by her aunt in tasseography, a matrilineally transmitted mode of divination taught and practiced by femme elders “across Armenia, Palestine, Lebanon, and beyond,” in which “visual patterns are identified in coffee grounds left at the bottom of a cup, and…interpreted to glean information about the past, present, and future.” Taking this practice of her ancestors as her key example, she presents O-machines as technologies of ancestral intelligence that support “knowledge systems that are irreducible to computation.”

With O-machines of this sort, she suggests, what matters is the encounter, not the outcome.

In tasseography, for instance, the cup reader’s identification of symbols amid coffee grounds leads not to a simple “answer” to the querent’s questions, writes Hakopian; rather, it catalyzes conversation. “In those encounters, predictions weren’t instantaneously conjured or fixed in advance,” she writes. “Rather, they were collectively articulated and unbounded, prying open pluriversal outcomes in a process of reciprocal exchange.”

While defenders of western technoscience denounce cup reading as superstition and witchcraft, Hakopian recalls its place as a counter-practice among Armenian diasporic communities in the wake of the 1915 Armenian Genocide. For those separated from loved ones by traumas of that scale, tasseography takes on the character of what hauntologists like Derrida would call a “messianic” redemptive practice. “To divine the future in this context is a refusal to relinquish its writing to agents of colonial violence,” writes Hakopian. “Divination comes to operate as a tactic of collective survival, affirming futurity in the face of a catastrophic present.” Consulting with the oracle is a way of communing with the dead.

Hakopian contrasts this with the predictive capacities imputed to today’s AI. “We reside in an algo-occultist moment,” she writes, “in which divinatory functions have been ceded to predictive models trained to retrieve necropolitical outcomes.” Necropolitical, she adds, in the sense that algorithmic models “now determine outcomes in the realm of warfare, policing, housing, judicial risk assessment, and beyond.”

“The role once ascribed to ritual experts who interpreted the pronouncements of oracles is now performed by technocratic actors,” writes Hakopian. “These are not diviners rooted in a community and summoning communiqués toward collective survival, but charlatans reading aloud the results of a Ouija session — one whose statements they author with a magnetically manipulated planchette.”

Hakopian’s critique is in that sense consistent with the “deceitful media” school of thought that informs earlier works of hers like The Institute for Other Intelligences. Rather than abjure algorithmic methods altogether, however, Hakopian’s latest work seeks to “turn the annihilatory logic of algorithmic divination against itself.” Since the summer of 2023, she’s been training a “multimodal model” to perform tasseography and to output bilingual predictions in Armenian and English.

Hakopian incorporated this model into “Բաժակ Նայող (One Who Looks at the Cup),” a collaborative art installation mounted at several locations in Los Angeles in 2024. The installation features “a purpose-built Armenian diasporan kitchen located in an indeterminate time-space — a re-rendering of the domestic spaces where tasseography customarily takes place,” notes Hakopian. Those who visit the installation receive a cup reading from the model in the form of a printout.

Yet, rather than offer outputs generated live by AI, Hakopian et al.’s installation operates very much in the style of a Mechanical Turk, outputting interpretations scripted in advance by humans. “The model’s only function is to identify visual patterns in a querent’s cup in order to retrieve corresponding texts,” she explains. “This arrangement,” she adds, “declines to cede authorship to an algo-occultist circle of ‘stochastic parrots’ and the diviners who summon them.”

The “stochastic parrots” reference is an unfortunate one, as it assumes a stochastic cosmology.

I’m reminded of the first thesis from Walter Benjamin’s “Theses on the Philosophy of History,” the one where Benjamin likens historical materialism to that very same precursor to today’s AI: the famous chess-playing device of the eighteenth century known as the Mechanical Turk.

“The story is told of an automaton constructed in such a way that it could play a winning game of chess, answering each move of an opponent with a countermove,” writes Benjamin. “A puppet in Turkish attire and with a hookah in its mouth sat before a chessboard placed on a large table. A system of mirrors created an illusion that this table was transparent from all sides. Actually, a little hunchback who was an expert chess player sat inside and guided the puppet’s hand by means of strings. One can imagine a philosophical counterpart to this device. The puppet called ‘historical materialism’ is to win all the time. It can easily be a match for anyone if it enlists the services of theology, which today, as we know, is wizened and has to keep out of sight.” (Illuminations, p. 253).

Hakopian sees no magic in today’s AI. Those who hype it are to her no more than deceptive practitioners of a kind of “stage magic.” But magic is afoot throughout the history of computing for those who look for it.

Take Turing, for instance. As George Dyson reports, Turing “was nicknamed ‘the alchemist’ in boarding school” (Turing’s Cathedral, p. 244). His mother had “set him up with crucibles, retorts, chemicals, etc., purchased from a French chemist” as a Christmas present in 1924. “I don’t care to find him boiling heaven knows what witches’ brew by the aid of two guttering candles on a naked windowsill,” muttered his housemaster at Sherborne.

Turing’s O-machines achieve a synthesis. The “machine” part of the O-machine is not the oracle. Nor does it automate or replace the oracle. It chats with it.

Something similar is possible in our interactions with platforms like ChatGPT.

O-Machines

In his dissertation, completed in 1938, Alan Turing sought “ways to escape the limitations of closed formal systems and purely deterministic machines” (Dyson, Turing’s Cathedral, p. 251) like the kind he’d imagined two years earlier in his landmark essay “On Computable Numbers.” As George Dyson notes, Turing “invoked a new class of machines that proceed deterministically, step by step, but once in a while make nondeterministic leaps, by consulting ‘a kind of oracle as it were’” (252).

“We shall not go any further into the nature of this oracle,” wrote Turing, “apart from saying that it cannot be a machine.” But, he adds, “With the help of the oracle we could form a new kind of machine (call them O-machines)” (“Systems of Logic Based on Ordinals,” pp. 172-173).
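To make the architecture concrete, here is a toy sketch of that arrangement: a machine that proceeds deterministically, step by step, but at designated points defers to an oracle supplied from outside the system. This is my illustration, not Turing’s formal construction; the `ORACLE?` instruction format and the `ask_oracle` callable are invented for the example. And since Turing stipulates that the oracle “cannot be a machine,” a Python callable can only ever stand in for it — in practice it might be a human answering a question the program cannot settle.

```python
from typing import Callable

def o_machine(program: str, ask_oracle: Callable[[str], bool]) -> str:
    """Toy O-machine: runs a semicolon-separated 'program'
    deterministically, but consults an external oracle at
    designated points."""
    steps = []
    for instruction in program.split(";"):
        instruction = instruction.strip()
        if instruction.startswith("ORACLE?"):
            # Nondeterministic leap: defer to the oracle, which sits
            # outside the boundaries of the machine itself.
            query = instruction.removeprefix("ORACLE?").strip()
            answer = ask_oracle(query)
            steps.append(f"oracle says {'yes' if answer else 'no'} to {query!r}")
        else:
            # Ordinary deterministic step.
            steps.append(f"did {instruction!r}")
    return "; ".join(steps)

# A stand-in oracle: in Turing's construction it would decide questions
# no machine can; here it is just a placeholder that always says yes.
trace = o_machine("step1; ORACLE? does P halt; step2",
                  ask_oracle=lambda q: True)
print(trace)
```

The point of the sketch is the division of labor: the deterministic loop never tries to compute the oracle’s answer, it only formulates the query and incorporates the reply — the “chats with it” relation described below, rather than automation or replacement.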

James Bridle pursues this idea in his book Ways of Being.

“Ever since the development of digital computers,” writes Bridle, “we have shaped the world in their image. In particular, they have shaped our idea of truth and knowledge as being that which is calculable. Only that which is calculable is knowable, and so our ability to think with machines beyond our own experience, to imagine other ways of being with and alongside them, is desperately limited. This fundamentalist faith in computability is both violent and destructive: it bullies into little boxes what it can and erases what it can’t. In economics, it attributes value only to what it can count; in the social sciences it recognizes only what it can map and represent; in psychology it gives meaning only to our own experience and denies that of unknowable, incalculable others. It brutalizes the world, while blinding us to what we don’t even realize we don’t know” (177).

“Yet at the very birth of computation,” he adds, “an entirely different kind of thinking was envisaged, and immediately set aside: one in which an unknowable other is always present, waiting to be consulted, outside the boundaries of the established system. Turing’s o-machine, the oracle, is precisely that which allows us to see what we don’t know, to recognize our own ignorance, as Socrates did at Delphi” (177).