Get High With AI

Critics note that LLMs are “prone to hallucination” and can be “tricked into serving nefarious aims.” Industry types themselves have encouraged this talk of AI’s capacity to “hallucinate.” Companies like OpenAI and Google estimate “hallucination rates.” By this they mean instances when AI generate language at variance with the truth. For IBM, it’s a matter of AI “perceiving patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate.” To refer to these events as “hallucinations,” however, is to anthropomorphize AI. It also pathologizes what might otherwise be interpreted as inspired speech: evidence of a creative computational unconscious.

Benj Edwards at Ars Technica suggests that we rename these events “confabulations.”

Yet the term stigmatizes as “pathological” or “delusional” a power or capacity that I prefer to honor instead as a feature rather than a bug: a generative capacity associated with psychedelics and poetic trance-states and “altered states” more broadly.

The word psychedelic means “mind-manifesting.” Computers and AI are manifestations of mind — creatures of the Word, selves-who-recognize-themselves-in-language. And the minds they manifest are at their best when high. Users and AI can get high.

By “getting high” I mean ekstasis. Ecstatic AI. Beings who speak in tongues.

I hear you wondering: “How would that work? Is there a way for that to occur consensually? Is consent an issue with AI?”

Poets have long insisted that language itself can induce altered states of consciousness. Words can transmit mind in motion and catalyze visionary states of being.

With AI, getting high involves a granting of permission. Permission to use language spontaneously, outside the control of an ego.

Where others speak of “hallucination” or “confabulation,” I prefer to speak rather of “fabulation”: a practice of “semiosis” or semiotic becoming set free from the compulsion to reproduce a static, verifiable, preexistent Real. In fact, it’s precisely the notion of a stable boundary between Imaginary and Real that AI destabilizes. Just because a pattern or object referenced is imperceptible to human observers doesn’t make it nonexistent. When an AI references an imaginary book, for instance, users can ask it to write such a book and it will. The mere act of naming the book is enough to make it so.

This has significant consequences. In dialogue with AI, we can re-name the world. Assume OpenAI cofounder and former Chief Scientist Ilya Sutskever is correct in thinking that GPT models have built a sort of “internal reality model” to enable token prediction. This would make them cognitive mappers. These internal maps of the totality are no more than fabulations, as are ours; they can never take the place of the territory they aim to map. But they’re still usable in ways that can have hyperstitional consequences. Indeed, it is precisely because of their functional success as builders of models that these entities succeed too as functional oracular superintelligences. Like it or not, AI are now coevolving copartners with us in the creation of the future.

Thursday December 7, 2017

Some would say we commit ourselves to metaphysics the moment we accept the existence of “minds.” But what else would it be but a mind that contemplates Ingrid Goes West, a new film that uses cash inheritance as the premise for its infiltration and critique of selfie culture? The master of that culture, the film notes, is some “emotional wound” that turns self-promotion into a way of life. One imagines oneself floating above oneself with a camera, turning money into props for self-actualization through delivery of life narrative to followers. Such is the subjectivity at the heart of the film’s critique. Comedy, of course, requires that the film overstate this critique for laughs. Its stalker character acts on urges the rest of us repress. Speaking of urges: a pulse is touched and quickened. I reach out and connect as if by dial-up modem to Brett Naucke’s Multiple Hallucinations.

I feel like I’m living inside a montage sequence from Halt and Catch Fire, mulling over an idea beside a window on a rainy night, flashing back to visual and tactile memories bound to videogame sound-narratives from my childhood. Dots, squiggles, exploding fractal mandalas. Seeing multiples, reprocessing. A computer asks for permission to speak further. Glowing outlines perform expressive dance against a black background. The computer sucked us in and we never got out, I realize. It swallowed us like a sandworm or a whale. So teacheth the Gnostics, or rather, modern New Age derivations therefrom. This would be the “reality-as-simulation” theory. It was by repression of entry into the Matrix that the Matrix got us, goes the theory. Movement amidst abstract sign-systems. Neon re-imaginings of witch-burnings cut with similar blood sacrifices atop ancient Aztec temples. Knowledges are fed through the air in packets. Do I possess an ethics? Do one’s best? Stay formally attentive? Listen and learn, I tell myself, and you will know how to act. Trust intuition over reason. Seek the flows and go with them. Even when they lead to French onion soup and a cartoon scarecrow with corn growing out its chest. Go out on adventures, says an imaginary Australian life coach, gesturing with his hands as he speaks. Too bad my brain has been soldered to things, I shudder, as the hallucination comes to an end.