Scientomancy, or Divination by Science

By

Olga Tokarczuk

I.


I would like to focus on a special way of predicting the future, and to invite you to take a trip down paths that are rarely trodden by humanists. As we are living at a time of unusual uncertainty and anxiety about what will happen next, this topic is on many people’s minds, prompting us to ask: “What is our future going to be like?” And also: “Can we foresee it to any degree at all?” When I say “our future” I’m not thinking of the next few years, but rather of the gradual processes whose effects will be visible several centuries from now.
  In ages past, various ways of predicting the future developed dynamically – dating back to oracles and predictions based on observations of nature, or on apparent communication with supernatural forces, corresponding to the ongoing demands of society. I shall provide some examples from the ancient world, where alongside astrology, with its centuries-old tradition, various schools of “something-mancy” co-existed, some more and some less obvious. The more obvious ones include oneiromancy, as in divination by dreams; arithmancy, which uses numbers for predictive purposes; cheiromancy, for reading the future from the palm lines; or ceromancy, which bases its predictions on wax poured into water, and which in Poland is also a familiar folk custom associated with St. Andrew’s Eve. The less obvious ones bear certain hallmarks of eccentricity, at least from the modern point of view, though since they were practiced, they must have had some sort of effect: ololygmancy divined according to the way dogs barked; tyromancy drew conclusions about the future from the way cheese was cut; and tasseography was divination by coffee grounds or tea leaves.
  Yet the language of fortune-telling was always very special: imprecise and highly ambiguous, if not obscure, as if intended to be given substance by human interpretation. Talented fortune-tellers seem to have done their best to predict their clients’ expectations, telling them what they wanted to hear. As a result, reviewing the predictions today not only tells us a great deal about individuals and their lives, but also about a society and the deep collective needs, fears, and desires, conscious or unconscious, that were prevalent within it. What about whether or not the predictions and prophecies came true? Well, there was often a psychological mechanism at work, which relied on the fact that on the whole people are inclined to remember the things that matched the predictions, and to disown the things that didn’t come true.
  From the start, considerable forces were harnessed for predicting the future. Their significance grew in particular at moments of historical crisis, in times of uncertainty or terror. From the turn of the third and fourth centuries A.D., when the ancient world was shaken to its foundations and nothing was going to remain as in the past, there were so many astrologers, clairvoyants, and soothsayers of every kind that the popes took fright. In the fourth century A.D. the Church banned every form of divination or practice involving the future in any way, because only God can know it. These prohibitions weren’t very effective; at any rate, a thousand years later astrologers were active at the papal and episcopal courts, and within the entourage of numerous Christian rulers and magnates.
  In the past few centuries, when the influence of religious prohibitions or objections has decidedly weakened, literature too has taken to predicting the future, for the most part by creating terrifying, dystopian visions. From some point in its development the modern genre of science fiction started to rely on the science known as futurology, which came into being in the twentieth century. In their works Stanisław Lem, and especially Philip K. Dick, were incredibly good at predicting all sorts of human inventions. Reading their work often sends a shiver down my spine.
  Actually, literature has proved a better fortune-teller than scientifically-backed futurology, which is really only known today for the fact that it doesn’t work. Most of its famous forecasts have failed to come true. Many futurological projects that were launched after the war proved completely wide of the mark for a reason that Stanisław Lem understood well. “Even minor progress in any field reveals to us the vast, previously invisible foreground of our ignorance,” he wrote in the 1970s, as it were anticipating the famous metaphor of the Black Swan created by Nassim Nicholas Taleb. The Black Swan can serve as an illustration of Lem’s thesis: predictions do not come true because something unpredictable always occurs, even though it had seemed quite impossible.
  As one of the countless examples of a Black Swan of this kind, let us take Malthus’s gloomy visions of overpopulation, which totally failed to foresee methods for regulating human reproduction in the form of effective and widely available contraceptives. Similarly, it never entered the minds of most futurologists in the 1950s that the Soviet Union could simply collapse like a rotten tree eaten away by woodworm, leading to a fundamental change in the global distribution of political power.
  I think that among the thousands of scientific discoveries there are some to be found that have great potential for changing the paradigms in which we think about the world; an approach of this kind can undermine even the most intuitive of overt certainties. For a paradigm rests on a cognitive consensus: certain things are simply never questioned, unless we’re suicidally trying to get ourselves excluded from the ranks of thinking, intellectually trustworthy people.
  We have only to remember the problems experienced by Nicolaus Copernicus before he succeeded in entirely changing our way of thinking about the universe. The Ptolemaic geocentric system seemed obvious, and was accepted on the basis of consensus, despite being erroneous. Yet at the same time it was logically and notionally cohesive, and allowed one to explain, though often in a rather complicated way, many of the movements of the celestial bodies.
  But this consensus is not given once and for all. A scientific revolution can occur that transforms the paradigm, and with it the discipline in question, and in many cases plenty of other areas of knowledge and of life as well. The results of the Copernican revolution affected not just the work of astronomers: they had psychological, philosophical, and religious consequences, transformed art, and brought about profound changes in how human beings perceived themselves and their place in the world.
  It’s highly possible that even the most ground-breaking discoveries of the modern era, those made in recent years and in slightly earlier times too, will have a vast impact on how we view and understand the world in the future.
  But let me now offer you a sort of intellectual amusement, which involves regarding the most thrilling and inspiring of these discoveries as excuses to do some fortune-telling, to fire up our imagination and sense of synthesis, without avoiding a touch of eccentricity along the way. And so here is a version of “scientomancy,” a species of fortune-telling that involves predicting the future on the basis of scientific discoveries.

II.
The quantum world and polyvalent logic:
NO MORE EITHER-OR


  Recent discoveries in quantum physics have affected our way of thinking on several fronts. The idea of “quantum” is so fascinating that it has become fashionable, even if we don’t really understand much about these discoveries. We live in times when everything is “quantum,” from diet to medicine to computers. I think this enigmatic word, from the vocabulary of a field that, apart from the specialists, few people know anything about, has simply become a symbol for something new that will entirely change the order of things, that will offer totally different views of reality, just as in the recent past “digital” pushed “analog” into the shade, and just as “virtual” supplied “real” with new, previously unsuspected dimensions.
  And what is this novelty all about? I’d say it’s about going beyond the classic formula of “either-or.” Earlier, the principles of logical reasoning as described by Aristotle were standard. One of the foundations of this classical logic was the claim that something can only be “truth” or “falsity” – tertium non datur, as the Romans would have said. This sort of system is called bivalent logic. Hamlet’s “to be or not to be” is a literary example of this form of logic. The Danish prince didn’t consider a third possibility, though he might well have done so. That’s what Jan Łukasiewicz did, though on different terrain. The Polish scientist is thought to have been motivated by a resistance to the determinism and fatalism that were implied by bivalency. Therefore he claimed that for propositions concerning the future a third value also has a role to play: what we might simply call possibility. In other words, even if it seems inevitable today, what’s going to happen doesn’t have to happen. Referring to the famous example of the sea battle in Chapter Nine of Aristotle’s De Interpretatione, Łukasiewicz states that it’s hard to make true pronouncements about tomorrow: “an opinion about […] tomorrow’s sea battle cannot be true or false today.” Once the monopoly of bivalency had been broken, further values and quantifiers could be introduced, multiplying the alternatives.
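  For readers who enjoy seeing the idea in a more formal dress, here is a minimal sketch, assuming the standard numeric presentation of Łukasiewicz’s three-valued system (0 for false, ½ for possible, 1 for true) rather than anything stated in this essay, of how the familiar connectives behave once a third value is admitted:

```python
# A minimal sketch of Łukasiewicz's three-valued logic (Ł3).
# Truth values: 0.0 = false, 0.5 = possible, 1.0 = true.

FALSE, POSSIBLE, TRUE = 0.0, 0.5, 1.0

def neg(a):
    """Negation: not-a = 1 - a."""
    return 1.0 - a

def conj(a, b):
    """Conjunction: the weaker of the two claims."""
    return min(a, b)

def disj(a, b):
    """Disjunction: the stronger of the two claims."""
    return max(a, b)

def impl(a, b):
    """Łukasiewicz implication: min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

# "There will be a sea battle tomorrow" is, today, neither true nor false:
sea_battle = POSSIBLE

# In classical logic "A or not-A" is always true; here it comes out as
# merely possible, so the law of the excluded middle loses its monopoly:
print(disj(sea_battle, neg(sea_battle)))   # prints 0.5
```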
  Of course, bivalent logic has become so obvious to us that it’s virtually intuitive. By and large, conclusions based on it never let us down. Except that when this way of thinking was applied to the quantum phenomena first discovered in the early twentieth century, it became apparent that this sort of logic does not work at the subatomic level. By the 1920s it was known that in the quantum world other laws are in force. Bivalency needed to be reviewed, its status as a dominant logic reassessed.
  In fact, physics has proved that two previously opposing possibilities are capable of turning into each other and that “truth” and “falsity” cannot define absolutely the alternatives we consider. In keeping with the uncertainty principle formulated by Werner Heisenberg in 1927, we accept that we are unable to measure certain pairs of physical parameters (position and momentum) simultaneously with arbitrary precision, on account of the wave-particle duality of quantum entities, in other words because matter is not at base either just a particle or just a wave, but is in fact both of them at once. Among the implications of this theory was a claim about the influence of measurement on the behavior of the entities being measured. A famous example is the thought experiment known as Schrödinger’s cat, which shows that until an entity of this world is measured, it finds itself in all of its possible states simultaneously: a cat shut in a container with poison is both alive and dead. Returning to Łukasiewicz’s logic, this superposition can be regarded as the tertium datur in between “yes” and “no.”
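  In the standard quantum-mechanical notation, which is borrowed here purely as an illustration, the unobserved cat’s predicament can be sketched as a superposition of its two classical possibilities:

\[
\lvert \psi \rangle \;=\; \alpha\,\lvert \text{alive} \rangle \;+\; \beta\,\lvert \text{dead} \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1,
\]

where measurement alone decides the outcome, yielding “alive” with probability |α|² and “dead” with probability |β|²; until that moment, neither value excludes the other.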
  Even if we were to watch a visual presentation of these apparently simple findings a number of times on YouTube, and to caution ourselves that they principally concern the subatomic world, they would seem to us so counterintuitive, so bizarre, that we would hesitate to extrapolate them to our ordinary, everyday life. And yet, as a writer with a slightly fevered and rather overactive imagination, I think that, if taken seriously, such discoveries are capable of leading us to a real breakthrough, to the end of the “either-or” era. Between zero and one there is also half. Between “yes” and “no” there is also “possible.” Between black and white there’s a vast ocean of grays, and between night and day there are dawn and dusk. In this light, many of our binary orders start to look suspiciously simplified. These binaries and dualities should be stretched out like an accordion and changed into a continuum, so that a number of intermediate qualities appear in between oppositions. Basically this means allowing the reality that we perceive to be more complicated. This will completely change our field of operations, and not just in those common situations where we’re forced to declare ourselves in favor of a single choice, A or B, under the pretext that this exhausts the entire range. All the “believe it or nots,” “can do it or nots,” “support it or nots” without any “but” will have to be reconsidered. The change in perception is sure to have an effect on how we define our identities, not just our sexual identity, but our social roles and our affiliations. Anachronistic concepts such as nation will have to be redefined (within this logic a person can be at the same time a Pole, an Inuit, and a Tasmanian). Faced with a sea of possibilities, people will need other means of navigation within reality, which is sure to prove more demanding intellectually. There will be increasingly dangerous temptations to simplify everything to the most straightforward dimensions. Some new advisory professions are sure to appear, and new words will emerge to fill the gap between the poles of “either-or.” Ethics and our understanding of morality will change and so will the penal codes. But this transformation will need hundreds of years to take full effect.

III.
The holobiont and various forms of symbiotic dependency within the bodies of living creatures:
NO MORE SINGLE ONE


  An outwardly unimportant, innocent discovery from the borderland of biology and medicine (though anticipated by visionary scientists such as Lynn Margulis) leads us inside the human (and not just the human) body, where it turns out more organisms appear to be living than we had thought before now. The human body is inhabited by about 39 trillion micro-organism cells that form a so-called microbiome. We inherit the kernel of it from our mothers as soon as we are born, by drinking mother’s milk. Without it we would be at risk of attack by pathogens. But as it has a relationship with our nervous system, the microbiome also has an effect on our emotions. Scientists are ever more frequently proposing that we should treat the human organism and the symbiotic microbes that inhabit it as a holobiont, in other words as a single living being composed of many living beings. This applies to other animal and plant holobionts too. Fascinatingly, our microbiome changes over time and is connected with our way of life, dependent on our diet, work, leisure activities and so on, and thus it is susceptible to human culture!
  Thus we can say that the sort of holobiont we are is a multispecies creature, consisting, apart from its host, of bacteria, archaea, fungi, and viruses. Let’s try to imagine our organism as something like a complicated, multilevel organization, like an ecosystem, a coral reef for example, housing various organisms that cooperate or compete with each other, and that are more or less interdependent, more or less interconnected. From this viewpoint the issue of identity becomes a curious one: What is me, and what is not me? How am I to define or perceive myself with these other organisms, or maybe without them? To what extent is my identity the sum of their identities as well? Since I cannot live without them, is it the case that they too are me? Is it legitimate to talk of “I” (or “me”) and “they” (or “them”)? And what in fact is this famous “I” (or “me”)?
  In the era before the holobiont we had a completely different vision of ourselves: the world, as Leibniz saw it, consisted of separate, homogeneous, individual, monadic beings that did in fact interrelate, yet their integrity and impenetrability seemed obvious. Their individuality and separate identity were beyond doubt.
  This perception of living beings as exclusive entities that are separate from one another formatted our image of the world and of ourselves. The concept of the self as a separate, monistic, self-enclosed “I” first came into being during the Reformation and the Renaissance. This was also when the philosopher Goclenius first coined the term “psychology,” though the context was theological. Easily combined with other words, the prefix self- expressed a new way of thinking about the world, where the person, the individual, was acquiring great significance. The self always related persons to their own identity: self-regard, self-destruction, self-love… From then on, the individual “I,” separated from the rest of the world, became the chief perspective from which we view reality.
  Thus a huge role is played in our religions, mythologies, and fantasies by the isolated, separate, self-specific, solitary individual. We move around in the world like perfectly spherical monads with smooth surfaces, bumping into each other from time to time, but essentially remaining separate, indivisible unities; every inner division is perceived as an illness. This sort of “I” (or “self”), isolated from the world, is the object of many philosophical and religious parables. In all this we are backed up by God, equally individual, equally isolated and remote, suspended in a void, by definition not relational and not dependent on anyone else. To this figure monotheism added a sense of the homogeneity of societies and groups. Monotheism is the foundation stone of the hierarchical system, because only individual beings separated from one another can be arranged in some kind of order.
  It was on the basis of these theses and assumptions that the whole of Western psychology, and our common perception of ourselves as strictly defined monolithic units, co-existing with but separate from others, were formed.
  As a result, the seemingly innocent discovery that the human being is composed of a multitude of other beings, which influence what we feel, what illnesses affect us, and perhaps also how we think, is a fundamental discovery. Above all it changes our relationship with the rest of nature, because we turn out to be more a part of it than a life form that’s separated from it by a special status and special rights, as religion constantly assures us. This relationship is a tangle of extremely complicated dependencies, which we do not understand.
  Perhaps in the future we shall review our belief in our own integrity and we will start to regard ourselves as “syncretic beings.” Thinking this way would oblige us to reconsider the concept of identity and reconstruct models of personality in psychology. For this particular field it would be a genuine revolution, bringing a new paradigm. The tendency to diversify the individual, integral “I” has also been appearing in other realms, and seems to have been greatly accelerated by the pandemic. Remote activity by people in the media has been prompting others to summon up various incarnations of themselves, to create avatars that represent certain aspects or modes of the base personality. Nowadays a person can exist, so to speak, in many forms within various spaces. The concept of the persona (the mask) in psychology is ceasing to be a metaphor, and is taking on a disturbingly literal aspect.
  What’s more, an important consequence of these processes would be the thought that if God created us in his image and likeness, then the figure of God too would have to become more complicated, literally a composite. Perhaps somewhere in the distant future this will lead to religion becoming polytheistic. With the break-up of the figure of God into his component parts we would no longer have the Trinity, but the Polygony, the Holy Multiplicity.

IV.
Epigenetics
NO MORE FATE


  In my lifetime a great deal has changed about the way we observe and understand the world.
  When as a teenager I took an interest in biology, one of the absolutely binding paradigms was the conviction that it was impossible to inherit acquired characteristics. The belief was that evolutionary changes occur as a result of chance genetic mutations, which are inherited by successive generations. But it turns out we should not necessarily be limited in our thinking to this notion alone.
  The term epigenetics was first coined in 1942. The originally Greek prefix epi-, by and large meaning “above” or “on,” in this instance signifies that something is being added onto classical genetics. At the same time we shouldn’t forget that despite the sensational headlines, accepting the existence of epigenetic processes does not mean breaking with the entire scientific paradigm of genetics. Genetic information is inscribed in our DNA, while epigenetics studies external determinants for interpreting this information. The famous research on mice shows that the association of a traumatic incident with an innocent smell becomes deeply encoded epigenetically, on top of the mouse’s DNA. The next generation of mice recognizes this smell as hostile, carrying a threat. This and similar experiments have confirmed that the environment cannot in fact change the actual structure of DNA. In other words this is not an irreversible genetic mutation, though the environment does have an active, dynamic effect on turning gene expression on and off. And this process can be inherited.
  If the experience of psychological trauma is capable of acting on gene expression, presumably something similar may occur in connection with other factors: diet, place of residence, fondness for certain comestibles, the boom of disco music, sitting all day long at a computer, tourism, and much else besides.
  Once it has been researched in more depth, inter-generational epigenetic inheritance is sure to change our view of evolution. It is already stirring plenty of emotion and arousing the imagination.
  The effect of real events on the genome shows how quickly animate matter can learn, and how various phenomena are capable of exerting an influence on something that we have always regarded as more or less eternal, invariable, inert. As we talk about it, at the back of our minds we have the idea that the world has its own special solidity, and that it carries on, despite or in defiance of the fleeting nature of our feelings and thoughts, or of our individual existences. Meanwhile evolution never stops running its course, here and now, within our homes and on our tables, carrying out changes in us that we don’t even notice. To give just one example, it’s not impossible that the children of today’s teenagers will be better at using computer keyboards and smartphone touch screens than we are. Epigenetics helps us to understand the mechanisms that control these phenomena, but it is not, let us repeat, a revolution that would invalidate the earlier achievements of genetics. It is more like an evolution, the next stage in the research that brings us closer to understanding life’s mysteries.
  There’s something else that I find important. A possible consequence of the new thinking about who we actually are, and how we become who we are, is a very broad reflection on the topic of our human identity. Above all it turns out that the biological and the cultural do not have to be so sharply opposed to each other, and that we can regard them as nothing more than the poles of a spectrum on which they both appear. If we display phenotypic plasticity (meaning that the same genome can be the basis for various phenotypes), and if we are able to record our experience within our genome, this fact completely changes the weight of each individual moment in our lives. Our life ceases to be something transient and unique, and becomes baggage that we send into the future, a message, as it were, to future generations. Everything that happens to us takes on an importance it never had before. Applying poetic license, we can say that nature has something like a memory, and that in a way this memory may be self-reflective. If this is the case, then evolution is not merely a blind sequence of changes based on coincidence and chance, but also includes a far more complicated process of recording and passing on individual and group experiences in a way that remains difficult to explain.
  I can understand why the fascination with epigenetics may, for some minds, offer hope for the deconstruction of the concept of genetic, inevitable, irreversible fate, the irrevocable will of the gods that’s inscribed in our DNA, and which humankind is unable to change. Here is perhaps the promise that the boundaries of freedom are greater than we had expected. Perhaps we’re capable of shaping our own destiny and the destiny of future generations beyond what we have assumed.

* * *

  I have great faith in science, and I believe it’s not just a list of facts and a set of theoretical explanations for them, but that it offers us a powerful, supple vision of our ever-changing world.
  The eminent physicist Leopold Infeld defined it with perfect precision, as follows: “Scientific theory is an attempt, an effort to create an image of the reality that surrounds us. It covers a narrower or broader range of facts and experimental laws, over which this theory extends, introducing order. For science is not a jumble of laws and a junk room of facts. Theory connects them by means of a common idea, and creates an image of reality from which those facts emerge by means of logical reasoning.”

I would add: and with the help of the imagination.




Translated by Antonia Lloyd-Jones