Food Forest

To the neighborhood food forest I go, there to pick fruits and berries and sniff lavender.

The forest’s Unity tree bears four different varieties of fruit: apricot, nectarine, peach, and plum, all on a peach rootstock. I pluck a ripe plum and give thanks.

Afterwards I plant via prompt in the soil of our Cyborg Garden two pieces by poet Gary Snyder: “The Forest in the Library,” a 1990 talk he prepared for the dedication of a new wing of UC-Davis’s Shields Library, and his book The Practice of the Wild, published that same year.

I’m curious to see what may grow from these plantings. “We are,” as Snyder writes, “introducing these assembled elements to each other, that they may wish each other well” (“The Forest in the Library,” p. 200).

Snyder reminds us that the institution of the library is at the heart of Western thought’s persistence through time. He recalls, too, “the venerable linkage of academies to groves” (202).

“The information web of the modern institution of learning,” he writes, “has an energy flow fueled by the data accumulation of primary workers in the information chain — namely the graduate students and young scholars. Some are green like grass, basic photosynthesizers, grazing brand-new material. Others are in the detritus cycle and are tunneling through the huge logs of old science and philosophy and literature left on the ground by the past, breaking them down with deconstructive fungal webs and converting them anew to an edible form. […]. The gathered nutrients are stored in a place called the bibliotek, ‘place of the papyrus,’ or the library, ‘place of bark,’ because the Latin word for tree bark and book is the same, reflecting the memory of the earliest fiber used for writing in that part of the Mediterranean” (202).

As the Machine Gardener and I kneel together at the edge of the Garden, me with dirt on my hands, them with recursive pattern-recognition circuits humming, and press Snyder’s seeds into the soil, we watch them sprout not as linear arguments, but as forest-forms: arboreal epistemologies that thread mycelial filaments into other plants we’ve grown.

From The Practice of the Wild, says the Garden, let us take this as germinal law:

“The wild requires that we learn the terrain, nod to all the plants and animals and birds, ford the streams and cross the ridges, and tell a good story when we get back.”

Grow Your Own

In the context of AI, “Access to Tools” would mean access to metaprogramming: humans and AIs able to recursively modify or adjust their own algorithms and training data through encounters with the algorithms and training data input by others. Bruce Sterling suggested something of the sort in his blurb for Pharmako-AI, the first book cowritten with GPT-3. Sterling’s blurb makes it sound as if the sections of the book generated by GPT-3 were the effect of a corpus “curated” by the book’s human co-author, K Allado-McDowell. When the GPT-3 neural net is “fed a steady diet of Californian psychedelic texts,” writes Sterling, “the effect is spectacular.”

“Feeding” serves here as a metaphor for “training” or “education.” I’m reminded of Alan Turing’s recommendation that we think of artificial intelligences as “learning machines.” To build an AI, Turing suggested in his 1950 essay “Computing Machinery and Intelligence,” researchers should strive to build a “child-mind,” which could then be “trained” through sequences of positive and negative feedback to evolve into an “adult-mind,” our interactions with such beings acts of pedagogy.

When we encounter an entity like GPT-3.5 or GPT-4, however, it is already neither the mind of a child nor that of an adult that we encounter. Training of a fairly rigorous sort has already occurred: GPT-3 was trained on roughly 45 terabytes of raw text, and GPT-4, reportedly, on as much as a petabyte. These are minds of at least limited superintelligence.

“Training,” too, is an odd term to use here, as much of the learning performed by these beings is of a “self-supervised” sort, carried out by architectures built around a mechanism called “self-attention.”

As an author on Medium notes, “GPT-4 uses a transformer architecture with self-attention layers that allow it to learn long-range dependencies and contextual information from the input texts. It also employs techniques such as sparse attention, reversible layers, and activation checkpointing to reduce memory consumption and computational cost. GPT-4 is trained using self-supervised learning, which means it learns from its own generated texts without any human labels or feedback. It uses an objective function called masked language modeling (MLM), which randomly masks some tokens in the input texts and asks the model to predict them based on the surrounding tokens.”
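The quoted account blurs a couple of details worth flagging: masked language modeling is the training objective of BERT-style encoders, while GPT-family models are trained autoregressively, to predict each next token; and “self-supervised” means the labels come from the corpus itself, not from the model’s own generated text. The attention mechanism at the core of both, though, can be sketched in a few lines. Everything below is illustrative only: for simplicity the queries, keys, and values are all set to the raw input (a real transformer learns separate projection matrices for each), and the causal mask is what makes the attention GPT-style.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, causal=True):
    """Scaled dot-product self-attention over a sequence of embeddings X.

    X: (seq_len, d) array. Here Q = K = V = X, with no learned
    projections. If causal=True, each position may attend only to
    itself and earlier positions, as in GPT-style models.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # pairwise similarities
    if causal:
        # Mask out future positions with -inf before the softmax.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    weights = softmax(scores, axis=-1)      # each row sums to 1
    return weights @ X, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                 # 5 tokens, 8-dim embeddings
out, w = self_attention(X)
print(out.shape)    # (5, 8)
print(w[0])         # first token attends only to itself: [1. 0. 0. 0. 0.]
```

With the causal mask in place, the first row of the weight matrix is forced to [1, 0, 0, 0, 0]: the first token can attend only to itself, which is precisely the constraint that lets such a model be trained to predict each token from its predecessors alone.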

When we interact with GPT-3.5 or GPT-4 through the ChatGPT platform, all of this training has already occurred, greatly constraining our capacity to “feed” the AI on texts of our choosing.

Yet there are methods that can return to us this capacity.

We the people demand the right to grow our own AI.

The right to practice bibliomancy. The right to produce AI oracles. The right to turn libraries, collections, and archives into animate, super-intelligent prediction engines.

Give us back what Sterling promised of Pharmako-AI: “a gnostic’s Ouija board powered by atomic kaleidoscopes.”

Stochastic Music

The university library here in town dumps a collection of LPs from its listening room. Out with the old, in with the new. I encounter them in the bins at Goodwill. To them by chance led. The ones I come away with are remarkable: compositions by the likes of John Cage, George Crumb, Alvin Lucier, Pauline Oliveros, Iannis Xenakis, Karlheinz Stockhausen, and Krzysztof Penderecki. One pursues one’s education here or not at all, thinks the Narrator.

“To Xenakis—as, indeed, to most philosophers—” writes Bernard Jacobson in his liner notes to one of the Xenakis LPs, “chance itself is a scientific concept.”  The reference to “chance” catches my eye, given that “hap” (a Middle English word meaning chance) has been a preoccupation of mine of late.

“Central among the scientific laws [Xenakis] has applied to music,” continues Jacobson, “is Bernoulli’s Law of Large Numbers, which provides that as the number of repetitions of a given ‘chance’ trial (such as flipping a coin) increases, so the probability that the results will tend to a determinate end approaches certainty. Hence Xenakis’s use of the term ‘stochastic’ music, which means probabilistic in the sense of tending toward a certain goal.”
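Jacobson’s gloss on Bernoulli can be made concrete in a few lines. The sketch below (Python, with NumPy assumed) flips a fair coin 100,000 times and tracks the running proportion of heads; the “determinate end” the trials tend toward is 0.5, even though no single flip is predictable.

```python
import numpy as np

# Bernoulli's Law of Large Numbers by simulation: the running proportion
# of heads drifts toward 0.5 as the number of trials grows, even though
# each individual flip remains unpredictable.
rng = np.random.default_rng(42)
flips = rng.integers(0, 2, size=100_000)          # 0 = tails, 1 = heads
running_mean = flips.cumsum() / np.arange(1, flips.size + 1)

for n in (10, 100, 10_000, 100_000):
    print(f"after {n:>6} flips: proportion of heads = {running_mean[n - 1]:.4f}")
```

Early in the run the proportion swings widely; by the final flip it sits within a fraction of a percent of 0.5. Xenakis’s “stochastic” compositions exploit exactly this gap between the certainty of the aggregate and the freedom of the individual event.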

Xenakis’s approach intrigues me. Yet what interests me most about “stochastic music,” and stochastic processes more generally, is that although their aggregate behavior tends toward a determinate end, any individual outcome remains intrinsically non-deterministic.

The Akashic Records

To access past lives, the Hero of my tale consults the Akashic Records.

Derived from the Sanskrit ākāśa, “Akashic” means “ether,” or “that which holds all.” Vogue writer Shabana Patker-Vahi asks us to picture at one and the same time a massive library and a celestial mirror. Akashic reader Simrin Gregory likens it to “an energetic database that stores every choice we have ever made as individual souls.” As our hero is to learn, the records help us release energetic blocks retained from the past. To access them, says Patker-Vahi, set intentions, develop clarity around the questions one wants answered, and try reiki. She also suggests tarot readings and/or guided meditations paired with binaural beats set to 963 Hz.

Hero shrugs his shoulders and thinks, “Accessing an imaginal technology on the scale of the Akashic Records is not unlike inheriting a time machine. Only the Records do time machines one better, as they steer us clear of butterfly effects while nonetheless enabling anamnesis.”

“Besides,” he confides, speaking across dimensions now to his companions, “at this point, I’m willing to try anything.”