A Drift
Artist, filmmaker, and researcher Matthew C. Wilson weaves together the evolutionary throughlines of life, the alphabet, DNA nucleobases, metaphor, and AI. Via 36 drifting but interconnected lines of thought, he traverses ancient prehistory to speculative futures, meditating on the chance events that make up the world as we know it.
1.
A, the first letter of Latinized alphabets including English and German, emerges from millennia of mutations, carrying with it material and social legacies. A’s most recent alphabetic ancestor is the Greek alpha, from which it is indistinguishable in its capital form. The Greeks borrowed alpha from the Phoenicians, who were using a similar letter, aleph, 𐤀. The Phoenicians seem to have picked up the idea for aleph through their Mediterranean maritime trade network, adapting it from the Egyptian hieroglyph depicting an ox’s head: 𓃾. While the Egyptians attached plows to the horns of their oxen, transforming the animal into a source of labor—and in turn an agent of landscape transformation—it was meaning that hung upon the horned form of 𓃾 and its descendants.
Aleph, 𐤀, retains visual traces of the ox’s head. For the Phoenician and other phonetic alphabets, the abstraction was not simply visual but aural: aleph, 𐤀, also abstracted a sound from spoken language, mapping it onto an arbitrary, unrelated figure.1 The word aleph, however, retained a nonarbitrary relation to its source: aleph still meant “ox.”
Besides ancient Greece, aleph traveled into other sociocultural domains—the alphabetic analogue of evolutionary radiation. It almost seems to have an agency of its own. Along other mutational meanders, aleph becomes alif, ا, in the Arabic alphabet, and alef, א, in the Hebrew alphabet; it is the first letter of these alphabets as well. ا … א … A … None of these were goals. A just … happened. For a Sufi, alif is the beginning as well as the unity of all things emanating from Allah. For a Jewish mystic, alef may similarly point to the oneness of all things, while the Hebrew language itself “reflects the inner structure of the divine realm.”2
2.
For the characters in Jorge Luis Borges’ story “The Aleph,” “an Aleph is one of the points in space that contains all other points.”3
3.
For a geneticist, the letter A will likely draw an association with DNA. As a convention, A is used to represent adenine, one of the four nucleobases in the nucleic acid of DN𐤀. A came to represent this particular nucleobase because it was isolated from bovine pancreases. Adenine’s discoverer, the German chemist Albrecht Kossel, named adenine after the Greek word for gland, ἀδήν (aden), which happens to begin with A.
A of course is merely one letter in both phonetic and genetic alphabets. The genetic alphabet is four letters, one for each of the four nucleobases: A G T C. To be more precise, since there are no nanoscale letters strung together in DNA or RNA, the genetic alphabet is not an alphabet but rather four molecules: adenine, guanine, thymine, and cytosine.
A is the first letter of the genetic codon A-T-G (rendered A-U-G in messenger RNA4), which tells the cellular machinery of eukaryotic organisms5—including cattle and humans—to “start reading,” an essential instruction in the conception of DNA as the so-called language of life. Abstracted into letters, it can be easy to forget that the linguistic comparisons that DNA draws are merely analogical tools for approaching an infinitely shufflable molecule of molecules that eludes full understanding.
4.
For tens of thousands of years or more, humans lived in proximity with aurochs, domestic cattle’s wild ancestor, which they hunted and ate, incorporating it metabolically. The animal appears in many of the oldest cave paintings, as well as in both wall paintings and architectural features in the Anatolian proto-city Çatalhöyük, suggesting it held symbolic significance among the inhabitants. At some point, the animal was incorporated even more directly into the human domus, as a cohabitant. The animals, with whom humans had already been living in environmental proximity for millennia, would originally have been morphologically wild even as their behavior was managed; under human selection, the aurochs’s physical form and behavior slowly morphed, often in correlated traits.
There is increasing evidence in the archaeological record that cattle and wheat appear almost simultaneously. Domestication may have been more of an accident than a goal-directed process. Rather than the result of “an active and engaged human process,” it may be that “certain plants gradually increased in prominence around villages, in cultivated fields, or on grazing land.”6 This may have been a co-evolutionary process originally developed between certain plants and (now extinct) megafauna, a package that continued with some grazing animals.7 The agricultural revolution may have been an accident precipitating out of evolutionary contingencies.
5.
Long before their burps were factored into climate models, cattle figured prominently not only in mythologies but in economies of the ancient Mediterranean, with some historians claiming that cattle are the oldest form of money. Their sacrifice merged symbolic value and economic value, as if to transfer value from an earthly account to a supernatural one. Environmental literary scholar Jemma Deer writes:
The word “cattle” comes from the Latin capitāle, “head,” and was used in medieval times to mean “principal sum of money, capital, wealth, property.” The earliest recorded usage is c.1275, in the now obsolete sense of “property, substance; strictly personal property or estate, wealth, goods” (OED). The later sense of “cattle” as “a collective term for live animals held as property, or reared to serve as food, or for the sake of their milk, skin, wool, etc.” is first recorded in 1325, and the modern sense, of specifically bovine animals, in 1555 (OED).8
Cattle are animals par excellence of the Anthropocene/Capitalocene, from the early days of Anatolian domestication (as well as a separate domestication event in the Indian subcontinent), to their spread through Eurasia and Africa as part of the agricultural transformation package, to their role as neobiota in the imperial expansion in the Americas,9 to the scaling up of their populations as part of industrial farming from the nineteenth century onward.
6.
One day an aurochs, another day an A. Incarnations upon incarnations.
7.
The experiment that led, at the end of the nineteenth century, to the discovery of adenine involved 100 kilograms of pancreas from thirty different cows supplied by a slaughterhouse, as well as an industrial quantity of acid—more than 200 liters.
The experimental engagement with these bovine pancreases also yielded another of the nucleobases: guanine.10
8.
Guanine had, however, been discovered forty years earlier in the shit of South American seabirds by another German chemist. But since DNA itself had yet to be discovered, guanine was not yet recognized as a nucleic acid component. What were German chemists doing playing around with bird poop, and why did they find guanine in it? The poo clue is in the nucleobase name. Guanine takes its name from guano, a word derived from the Quechua word for dung, wanu.
The Quechua-speaking Inca and their descendants valued the birds’ wanu for the same basic reason as the German chemists: it makes a great fertilizer. Seabirds eliminate excess nitrogen by converting it to guanine and expelling it with the rest of their excrement. Prior to colonization by Europeans, the Inca protected the seabirds and sustainably sourced guano. Europeans largely ignored guano, seeking shinier sources of wealth like gold and silver to extract, until Alexander von Humboldt witnessed the ongoing practice of using guano as fertilizer and brought some samples back to Germany. As European fields faced exhaustion, guano was seen as a means to revitalize them.
The mode of obtaining guano in European and US efforts followed a version of the standard violent extractivist approach: they used indentured laborers to remove massive quantities that had accumulated over hundreds of years and, in order to make the mining more efficient, removed bird nests and slaughtered the birds to prevent them from getting in the way. The guano grab can be seen as part of larger capitalist “one-way patterns of production, consumption and waste.”11
9.
When the “molecular toolkit” of life came together, the conditions on Earth were quite different—there was much less oxygen. At the time, guanine would have been a suitable “evolutionary choice” for one of the four nucleobases. However, the oxidative potential (i.e., reactivity) of the planet changed 1.5–2 billion years later, causing a significant uptick in atmospheric and oceanic oxygen.12
As one group of scientists explains: “For the last 2 billion years, biological systems have been under intense pressure brought on by chemical instability of guanine. The permanence and integrity of genetic information and of critical energy transduction and signaling molecules are under relentless assault by oxidative processes.”13 Rather than replacing guanine, elaborate systems emerged to repair oxidized guanine. Guanine’s persistence is a case of path dependency: once established, (natural) alterations to the fundamental molecular components of life are “prohibited.”14
10.
Life and language, when understood as emergent processes, do not—even cannot—plan ahead, but rather make use of what’s available.
11.
The capacity to plan ahead is an indicator of intelligence (e.g., in discussions surrounding animal, plant, and machine intelligence).
12.
Contingency seems to be a governing force, cascading across language and life, habitats and histories.
13.
The event that precipitated the shift in the oxidative potential of the planet and an increase in atmospheric oxygen is variously called the Great Oxidation Event, the Great Oxygenation Event, the Oxygen Catastrophe, and even the Oxygen Revolution. The hundreds-of-millions-of-years-long event was the result of the success of cyanobacteria. They became so numerous that some anaerobic organisms began to suffocate in the waste oxygen, triggering a mass extinction. Catastrophe. The prevalence of oxygen opened the path for oxygen-based life (which may have already existed in lesser numbers and diversity). Revolution.
Catastrophe and revolution can be interconnected.
Ultra-long-term outcomes of complex systems are unpredictable.
14.
It is not only from cow organs that the four nucleobases have been isolated, but also from extraterrestrial objects.15 Prelife, the prebiotic chemistry of life—including the nucleobases—may have arrived on Earth via carbonaceous chondrite meteorites. It isn’t only nucleobases that appear in meteorites, though, but also various organic compounds, including ribose, a key component of DNA’s companion molecule, RNA. That all the basic building blocks of life have been discovered within these fragments that once drifted through space gives some credence to theories of panspermia—the idea that life on Earth was seeded from outside. Evidence is, however, still accumulating, and so abiogenesis—the idea that life emerged on Earth—remains the mainstream scientific view.
If panspermia is someday substantiated, a flurry of existential questions will follow, and new cosmological narratives of science may begin to intersect with mythopoetic origin stories. Some of the meteorites containing the nucleobases and organic molecules are older than the solar system itself. If life is not endemic to Earth and was indeed “seeded,” from what world tree did these figurative seeds of life fall? From what alien arboreal entity were they ejected?
Astrobiologists hold varying ideas, including that there is no figurative “tree.” This camp rejects the idea that life and organic material of extraterrestrial origin would have been ejected from some other place in the cosmos. Instead they believe the fundamental materials themselves formed in outer space.
With its extraterrestrial origins and proliferative, almost animistic, zeal, DNA starts to seem like a form of concretized cosmic glossolalia—a physicalized panpsychic poem.
15.
“What’s the difference between a god and an alien?”16
16.
After targeting the bovine pancreas, Albrecht Kossel turned his chemist’s crosshairs on the animal’s thymus, from which he successfully isolated another nucleobase: thymine. The thymus itself may take its name from the thyme plant, with which it shares some semblance of form. The thymus is an important organ for the immune system. Thyme is beneficial for the immune system, as if in accordance with the doctrine of signatures—the belief that an herbalist can look for visual similarities between a plant and a body part to find treatments for issues related to that part of the body.
The thymus is involved in the training of the immune system’s T cells, which recognize foreign invaders. In the thymus, as T cells mature, they pass through a network in which those cells that can distinguish foreign particles from the body’s own cells are “selected,” while the rest are “eliminated.” The organ was known to the ancient Greeks but its function in the body was not apprehended until the 1960s. The thymus is akin to a training environment within the body, a sort of interior, immunological environment in which selection occurs.
Vertebrates are the only animals that have such an adaptive immune response, in which ongoing “learning” takes place.17 Recently, researchers have proposed that the general state of the body is “computed” by the immune system, and “that immune experience,” like that of the T cells in the thymus, “requires preliminary training reminiscent of supervised machine learning.”18
17.
As an organism ages, the thymus produces fewer and fewer T cells, making the organism more susceptible to infection.
An infection killed computing pioneer Charles Babbage.
Some of Babbage’s most impactful work was his analysis of the factory system of production. Here Babbage was influenced by Adam Smith and in turn influenced Karl Marx. Babbage’s conception of the computer was very much connected to his analysis of the factory, which was organized around the division of labor. In his 1832 book On the Economy of Machinery and Manufactures, Babbage showed that the “principles of the division of labor could be applied ‘both in mechanical and mental operations.’”19
18.
The brain has also been described as operating through the division of labor. In the largest sense, this is known as “brain lateralization”: the specialization of each of the brain’s hemispheres in certain tasks.
The division of labor between the two hemispheres of the brain may have also shaped the letter A. Notably, “domestic cattle possess lateralized cognitive processing of human handlers. This has been recently demonstrated in the preference for large groups of cattle to view a human closely within the predominantly left visual field.”20 The ox head depictions (𓃾) that gave rise to the letter alpha (α) strongly preserve this view of the ox head from the left side. “This is evident today in the minuscule letter ‘a’,” which retains this same directionality. Together, “these examples collectively suggest a long history of lateralized cattle-human interactions.”21
19.
After his death, Charles Babbage’s brain was split in half and is currently in two different locations in London.
20.
“Division of labor” is used to describe a wide range of biological functions, such as the relationship between genes and enzymes. Some evolutionary biologists suggest that division of labor—sometimes also called “functional specialization”—is an evolved characteristic of living systems.22
Division of labor was a crucial concept for the development of the ideas underpinning one of the most important books of the Guano Age: Charles Darwin’s On the Origin of Species. Darwin would write: “all organic beings are striving to seize on each place in the economy of nature.”23
This understanding grew in part from Darwin’s encounter with the tropical rainforest, where he observed how each organism found its place, its niche. He saw this as a sort of ecological division of labor. Speaking of the physiological division of labor, in a letter to his wife, Darwin wrote: “An organism becomes more perfect and more fitted to survive when by division of labour the different functions of life are performed by different organs.”24
Having read Adam Smith (the so-called father of modern capitalism), Thomas Malthus, and other political economists and social theorists, Darwin was well acquainted with the political economy and social theory of his day. These fields, as well as an understanding of the factory, had a significant influence on the scientific articulation of life. The description of specialization as “division of labor” continues in discourses on living systems today.
21.
Darwin articulates a structuring process that appears in multiple materially and historically distinct domains, analogizing across ontological boundaries, such as between a factory and an ecosystem or body. In a letter, Friedrich Engels described how Darwin’s ideas translated into other realms: “When this conjurer’s trick has been performed (and I questioned its absolute permissibility … ), the same theories are transferred back again from organic nature into history and it is now claimed that their validity as eternal laws of human society has been proved.”25
In other words, a contingent social order offers metaphors according to which Darwin conceives an account of biological natural law. This natural law is then reapplied as an explanation of human social order. Darwin’s ideas were immediately used for unscientific purposes such as claiming the supremacy of white Europeans by proponents of “Social Darwinism.” As bioengineering develops, the specter of eugenics still haunts. Capitalism too becomes expressed in terms of natural laws often linked to evolutionary explanations (see especially the work of the economist Friedrich August von Hayek).
22.
Fish repurpose guanine molecules, incorporating them into their scales in order to disappear; the crystalline forms of the repeating molecules create a shimmer, mimicking the effect of light in water and producing a camouflage. It’s a trick, an optical illusion, to help evade predators.
These same fish scales were in turn used in some of the first car paints to give the industrial, assembly-line-manufactured objects an appealing luster for consumers.26
23.
Babbage’s computer research also springs from the depths of the psyche and the mathematical study of light arriving from the depths of space. The first computer emerges not only from an analysis of the factory but just as much from the realm beyond the oneiric gates; in 1812, Babbage had a dream of tables of logarithms calculated by machinery.27 These are the same kinds of tables used in astronomy. Babbage went on to found the Royal Astronomical Society in London. He was also fascinated by the supernatural world and conceived of God as a sort of programmer.
24.
Building on philosophers Gilles Deleuze and Félix Guattari’s notion of “abstract machines,” philosopher and artist Manuel DeLanda’s A Thousand Years of Nonlinear History explores “different concrete processes embodying the same abstract machine.”28 DeLanda’s central thesis is that “all structures that surround us and form our reality (mountains, animals and plants, human languages, social institutions) are the products of specific historical processes” driven by those abstract machines.29
Geological, biological, and informational processes, according to the proposal, can be explained by the same kinds of abstract machines that carry out operations such as “sorting” and “stratifying.” In this way they are conceived of as systems subject to similar phenomena.
To be clear, this is a different form of “abstraction” than the visual and auditory abstraction that occurs on A’s journey from ox head to phonetic notation. The abstraction here is related to mathematics and computation. To abstract is to ignore details in favor of developing a model, such as a model of a particular kind of operation.
The details ignored by the “abstract machines” are the materials they process.
25.
What’s the difference between an abstract machine and an abstract being?
26.
For Marx, individuals living under capitalism “are now ruled by abstractions, whereas earlier they depended on one another.”30
Cooperation offers an alternative to the Darwinian competitive history of the emergence of life. The notion of interdependence is key for the theory of symbiogenesis, promoted and proven by evolutionary biologist Lynn Margulis. She states (with Dorion Sagan), “Life did not take over the globe by combat, but by networking.”31
Can the means sometimes become the ends? Perhaps connecting, collaborating, cohabitating, combining bodies is an end in itself that may help to undermine the ideological underpinnings of modernity—the fragmentation and decontextualization that at once shaped the way resources were sourced and distributed as well as the epistemic priorities of Enlightenment-style knowledge production.
27.
Is it possible to understand the neolithic human, the ox, the plow, and wheat as a single, composite, quasi-symbiotic entity of living and nonliving components?
28.
“Something very old, very powerful and very special has been unleashed on Earth.”32
As the astronomer Caleb Scharf puts it, humans found “a new trick for the restructuring of matter in service of a phenomenon with very deep roots in the statistical arrangement of atoms and molecules, in their order and disorder or dispersal: in entropy and its cousin, information.”33
To Scharf, the nongenetic material that we carry both externally (tools) and internally (knowledge) appears to be a “distinct, although entirely symbiotic (even endosymbiotic), phenomenon.” He sees “Homo sapiens … as a truly unique species because of our coevolution with a wealth of externalized information.” Scharf calls this the “dataome”34 (which unfortunately shares its name with a platform for biomedical information).
The dataome is akin to geochemist Vladimir Vernadsky’s “noosphere.”35 The letter A is part of the dataome/noosphere, as is the knowledge of how to write it, and the knowledge of how to manipulate DNA’s A’s, G’s, C’s, and T’s.
Applying the “gene-centered view of biology,” most notably promoted by Richard Dawkins, to tools and knowledge means that “exactly how information survives is less important than the fact that it can do so. Once that information and its algorithmic underpinnings are in place in the world, it will keep going forever if it can.”36
29.
For all the benefits that such “symbiotic” relationships with the domain of externalized information may have, this capacity to materialize “algorithmic underpinnings” can lead to terrifying results. Inequality and oppression become literally built into the environment; architecture and urban planning can propagate systemic racisms across generations. Here we enter the epigenetic domain. The environment itself influences which genes are expressed and which are not. This is true of not only the spatial logics but also the institutional logics—from justice to education—of societies. As we enter an age in which social decisions are increasingly automated, seemingly neutral or arbitrary systems may be inherently and implicitly racist or misogynistic.37
30.
Albrecht Kossel’s calf thymus experiment yielded not only thymine but also the last of the nucleobases, cytosine, which takes its name from the Greek kytos, simply meaning “hollow” or “vessel” and used today to refer to cells.
Cytosine was integral to the first successful physical implementation of a quantum algorithm, whereby “two hydrogen atoms [in cytosine molecules] had been replaced with deuterium atoms—hydrogen with a neutron. [… Researchers] prepared the qubits into initial states, performed a computation by applying a specially crafted radio-frequency pulse, and measured the final states.”38 While cytosine-based quantum computing has given way to other material substrates, it was a crucial demonstration that “humans had the technology to control quantum states and use them for computations.”39
Computing has come a long way from crank-turned and steam-engine-powered mechanical-component machines, like those of Babbage. The matter does not matter; the material itself does not seem to be of particular importance with regard to abstract processes of computational logic. Nonetheless, the abstraction here (computation) depends totally on a contingent feature of matter, a very specific molecule with an eccentric (exploitable) property.
This quantum computing proof of concept involving the changing and measuring of molecules of just one nucleobase differs from repurposing actual (evolved) genetic processes for computation. Doing so is typically known as DNA computing: “the performing of computations using biological molecules, rather than traditional silicon chips.”40
31.
The case of DNA computing, then, displays a deepening of the merger of materiality and metaphor. Operations of abstraction render a thing called “DNA” intelligible. Still further abstractions deliver DNA as a literal computer, displaying and concretizing presuppositions of the initial abstraction.
Metaphors bring entire epistemes and ideologies with them. As discussed above with regard to Darwin, the episteme of evolutionary theory is entangled with industrial production and capitalism. Darwin’s “conjuror’s trick” is part of a feedback loop. Evolutionary theory, in turn, becomes a metaphor for understanding the nonliving systems of engineering and computing.
This tendency to analogize to the machinic domain recurs across centuries with regard to organisms in general. For example, René Descartes conceived of animals as “mechanisms,” a comparison whose legacy persists in all present-day applications of engineering principles to biology. Even the high-throughput biofoundry in Liverpool is called the “GeneMill,” in reference to the city’s history as a manufacturing site where cotton from plantations was once processed. Such a facility uses some of the same organizational logics as the original mill (including the division of labor).
The study of both evolution and life tends to make use of the available metaphors, which in turn influence the very perception of life. The rise of the computer provided a new kind of metaphor, as in the case of understanding the activity of the thymus and the immune system, also discussed above. DNA is, of course, often conceived of as “code.” Yet, this analogy carries dangers with it. For example, “unlike computer software, there’s no way so far to ‘patch’ [engineered] biological systems once released to the wild, although researchers are trying to develop one.”41
Comparisons use simile operators (“like” or “as”), while conflations drop these in favor of metaphor operators (“is” or “are”). Such is the case in the claim “organisms are algorithms” by the historian Yuval Noah Harari. Harari sticks to his ontological conflation that they are, in fact, the same. What might have started as poetic conceit or rhetorical strategy has a different effect: that of reductionism. The problem with this argument is that it is a reduction of the concept of the organism; it ignores, for example, that organisms are flows not only of information but also of energy.
Although certain processes can be carried out across different substrates, the pairings matter. The directionality matters. If the organisms and algorithms are equivalent then it should be possible to invert Harari’s statement—yet “algorithms are organisms” has very different implications. This example underscores the way fundamental biases can appear through metaphorical logic.
Metaphors and analogies can open understanding to unfamiliar or unknown phenomena, yet a more complete comprehension can be blocked by the very metaphors used in attempts to grasp those phenomena. And, worse, metaphors invite a perversion or abuse of the described object or phenomenon by treating it as if it “is” the thing to which it was analogized.
Studying an object or phenomenon brings it into human awareness through certain (historical) frameworks. A naive preconception of abstraction would permit simply digging down to claims that are not explicitly universal but often operate as such—scientific “facts,” for example. To “discover DNA” as the thing-in-itself of biology risks forgetting the contingencies and similes and metaphors and particular-provincial ends that produce abstract (supposed) universals, and will therefore serve to expand the dominance of exactly the most ideological or biased or parochial of in-built assumptions.
As philosopher Donna Haraway famously and more simply puts it:
It matters what matters we use to think other matters with; it matters what stories we tell to tell other stories with; it matters what knots knot knots, what thoughts think thoughts, what descriptions describe descriptions, what ties tie ties. It matters what stories make worlds, what worlds make stories.42
32.
The “alphabet” of DNA has already been expanded from four to eight letters through the synthesis of four new nucleobases.43 These new nucleobases can function within the existing system.
There are as many as 1 million molecules that could potentially store hereditary information.44 This suggests other alien “alphabets” are possible. Other alphabets might exist elsewhere in the cosmos. They may, however, not be discovered but invented. On Earth, analyzing and developing complex systems is precisely the kind of task that artificial intelligence has proven so useful and powerful at tackling. It is not difficult to imagine a feedback loop forming between AI and DNA computing in a very short period of time.
33.
Could those systems merge? If so, the final “cyborg” would not be the mechanical system of the cinema screen, but rather a synthetic biotic system. Would it be designed from scratch? Or could an organism’s immune system be trained to accept a DNA computing entity (containing the organism’s own DNA) back into its body, to co-evolve with it? The body could become the new domus for the computational entity, a system with a shared metabolism and even a shared heredity. The system could be opened up and even allowed to evolve within. When these vectors are drawn out and intersect, one vision of the future begins to appear: a great merging of systems that would constitute a new form of symbiogenesis, a machinic symbiogenesis. This would constitute a new domain of life, a sort of Minotaur of systems, one grafted onto the next.
Or is it the other way around? Will lifelike heritable transfers of information find a new material base, like the mismatches of material and form in a Giuseppe Arcimboldo painting, an oscillation of ontological category in which the notion of the code of life becomes at last material and not metaphorical? The code can just as well take care of itself, evolving onward in another substrate entirely, more mineralogical than biological—as if returning to a meteoric body.
In either case, what would the “survival” strategies of such an entity be? Competitive or cooperative? Something else?
34.
The holobiont spoke softly to themselves: yesterday a new god started to grow in our silicon spleen.
35.
Ornithologist Richard O. Prum makes the case that birds’ aesthetic displays are not necessarily tied to fitness, and can even reduce it. He writes: “Individual organisms wield the potential to evolve arbitrary and useless beauty completely independent of (and sometimes in opposition to) the forces of natural selection.”45 Prum goes on to explain that the patriarchal Victorians, while accepting the idea of male-versus-male competition, dismissed sexual selection, rejecting the idea of “female sexual autonomy—the taste for the beautiful,” as a mechanism “responsible for the evolution of natural beauty.”46 Sexual selection is, however, now supported by masses of evidence and is accepted by evolutionary biologists (though many still consider it a form of natural selection).
This case underscores once again the ways in which societies’ ideologies inform their narratives of evolutionary processes. How many other ways might evolutionary theory be reconfigured outside of not only its patriarchal roots but those of capitalism, industrialism, engineering, and more recently computing? The idea of cooperation/interdependence proposed by Lynn Margulis has been explored above. How many other drivers, impulses, and urges—besides survival and reproduction—might there be orienting creatures towards life? Where would such agencies take living systems on their physical and abstract paths?
36.
Genetic drift is change in the frequency of genetic variants within a population based on chance. Some biological forms can be attributed to genetic drift in whole or in part. No advantages are accrued by their drifting, nor disadvantages so great that the organism ceases to collect adequate calories or reproduce. So the chance mutations persist, and are inherited, drifting further for no reason—seemingly without a path or goal.
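The chance dynamics of drift can be sketched as a toy simulation—a minimal, hypothetical Wright–Fisher-style model, offered only as an illustration (the function and parameter names are invented here, not drawn from the text):

```python
import random

def drift(freq, pop_size, generations, seed=1):
    """Neutral drift: each generation, a variant's frequency is
    resampled by chance alone -- no advantage, no disadvantage,
    no path or goal. Returns the frequency trajectory."""
    random.seed(seed)
    freqs = [freq]
    for _ in range(generations):
        # Each of pop_size offspring inherits the variant with
        # probability equal to its current frequency.
        carriers = sum(random.random() < freqs[-1] for _ in range(pop_size))
        freqs.append(carriers / pop_size)
    return freqs

trajectory = drift(freq=0.5, pop_size=100, generations=200)
# The frequency wanders; over enough generations it will, by chance
# alone, either fix (1.0) or be lost (0.0).
```

In such a model nothing is selected for: the wandering itself is the whole story, which is why smaller populations drift faster than larger ones.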
Drifts upon drifts.
Might there be some entities among us whose “attraction” to drift becomes almost a raison d’être, a sort of radical acceptance of the contingency of embodiment in space and time? What might we learn from them, from self-organizing processes wandering the latent space of the cosmos—agency without a goal, succumbing only to the occasional emergence of strange attractors within shifting fields?
Drafts upon drafts.
Here adenine, guanine, thymine, cytosine; there *]pmn$vv, [n$ki*x, ]j..b*{w$z, *]pmn$vv.