One of the least helpful habits of modern thinking is the common but really rather weird insistence that if something doesn’t belong all the way to one side of a distinction, it must go all the way to the other side. Think about the way that the popular imagination flattens out the futures open to industrial society into a forced choice between progress and catastrophe, or the way that American political rhetoric can’t get past the notion that the United States must be either the best nation on earth or the worst: in both these cases and many more like them, a broad and fluid continuum of actual possibilities has been replaced by an imaginary opposition between two equally imaginary extremes.
The discussion of the different kinds of thinking in last week’s Archdruid Report post brushed up against another subject that attracts this sort of obsessive binary thinking, because it touched on the limitations of the human mind. Modern industrial culture has a hard time dealing with limits of any kind, but the limitations that most reliably give it hiccups are the ones hardwired into the ramshackle combination of neurology and mentality we use to think with. How often, dear reader, have you heard someone claim that there are no limits to the power of human thought—or, alternatively, that human thought can be dismissed as blind stumbling through some set of biologically preprogrammed routines?
A more balanced and less binary approach allows human intelligence to be seen as a remarkable but fragile capacity, recently acquired in the evolutionary history of our species, and still full of bugs that the remorseless beta testing of natural selection hasn’t yet had time to find and fix. The three kinds of thinking I discussed in last week’s post—figuration, abstraction, and reflection—are at different stages in that Darwinian process, and a good many of the challenges of being human unfold from the complex interactions of older and more reliable kinds of thinking with newer and less reliable ones.
Figuration is the oldest kind of thinking, as well as the most basic. Students of animal behavior have shown conclusively that animals assemble the fragmentary data received by their senses into a coherent world in much the same way that human beings do, and assemble their figurations into sequences that allow them to make predictions and plan their actions. Abstraction seems to have come into the picture with spoken language; the process by which a set of similar figurations (this poodle, that beagle, the spaniel over there) get assigned to a common category with a verbal label, “dog,” is a core example of abstract thinking as well as the foundation for all further abstraction.
It’s interesting to note, though, that figuration doesn’t seem to produce its most distinctive human product, narrative, until the basic verbal tools of abstraction show up to help it out. In the same way, abstraction doesn’t seem to get to work crafting theories about the cosmos until reflection comes into play. That seems to happen historically about the same time that writing becomes common, though it’s an open question whether one of these causes the other or whether both emerge from deeper sources. While human beings in all societies and ages are capable of reflection, the habit of sustained reflection on the figurations and abstractions handed down from the past seems to be limited to complex societies in which literacy isn’t restricted to a small religious or political elite.
An earlier post in this sequence talked about what happens next, though I used a different terminology there—specifically, the terms introduced by Oswald Spengler in his exploration of the cycles of human history. The same historical phenomena, though, lend themselves at least as well to understanding in terms of the modes of thinking I’ve discussed here, and I want to go back over the cycle now, with a close eye on the way that figuration, abstraction, and reflection shape the historical process.
Complex literate societies aren’t born complex and literate, even if history has given them the raw materials to reach that status in time. They normally take shape in the smoking ruins of some dead civilization, and their first centuries are devoted to the hard work of clearing away the intellectual and material wreckage left behind. Over time, barbarian warlords and warbands settle down and become the seeds of a nascent feudalism, religious institutions take shape, myths and epics are told and retold: all these are tasks of the figurative stage of thinking, in which telling stories that organize the world of human experience into meaningful shapes is the most important task, and no one worries too much about whether the stories are consistent with each other or make any kind of logical sense.
It’s after the hard work of the figurative stage has been accomplished, and the cosmos has been given an order that fits comfortably within the religious sensibility and cultural habits of the age, that people have the leisure to take a second look at the political institutions, the religious practices, and the stories that explain their world to them, and start to wonder whether they actually make sense. That isn’t a fast process, and it usually takes some centuries either to create a set of logical tools or to adapt one from some older civilization so that the work can be done. The inevitable result is that the figurations of traditional culture are weighed in the new balances of rational abstraction and found wanting.
Thus the thinkers of the newborn age of reason examine the Bible, the poems of Homer, or whatever other collection of mythic narratives has been handed down to them from the figurative stage, and discover that it doesn’t make a good textbook of geology, morality, or whatever other subject comes first in the rationalist agenda of the time. Now of course nobody in the figurative period thought that their sacred text was anything of the kind, but the collective shift from figuration to abstraction involves a shift in the meaning of basic concepts such as truth. To a figurative thinker, a narrative is true because it works—it makes the world make intuitive sense and fosters human values in hard times; to an abstractive thinker, a theory is true because it’s consistent with some set of rules that have been developed to sort out true claims from false ones.
That’s not necessarily as obvious an improvement as it seems at first glance. To begin with, of course, it’s by no means certain that knowing the truth about the universe is a good thing in terms of any other human value; for all we know, H.P. Lovecraft may have been quite correct to suggest that if we actually understood the nature of the cosmos in which we live, we would all imitate the hapless Arthur Jermyn, douse ourselves with oil, and reach for the matches. Still, there’s another difficulty with rationalism: it leads people to believe that abstract concepts are more real than the figurations and raw sensations on which they’re based, and that belief doesn’t happen to be true.
Abstract concepts are simply mental models that more or less sum up certain characteristics of certain figurations in the universe of our experience. They aren’t the objective realities they seek to explain. The laws of nature so eagerly pursued by scientists, for example, are generalizations that explain how certain quantifiable measurements are likely to change when something happens in a certain context, and that’s all they are. It seems to be an inevitable habit of rationalists, though, to lose track of this crucial point, and convince themselves that their abstractions are more real than the raw sensory data on which they’re based—that the abstractions are the truth, in fact, behind the world of appearances we experience around us. It’s wholly reasonable to suppose that there is a reality behind the world of appearances, to be sure, but the problem comes in with the assumption that a favored set of abstract concepts is that reality, rather than merely a second- or thirdhand reflection of it in the less than flawless mirror of the human mind.
The laws of nature make a good example of this mistake in practice. To begin with, of course, the entire concept of “laws of nature” is a medieval Christian religious metaphor with the serial numbers filed off, ultimately derived from the notion of God as a feudal monarch promulgating laws for all his subjects to follow. We don’t actually know that nature has laws in any meaningful sense of the word—she could simply have habits or tendencies—but the concept of natural law is hardwired into the structure of contemporary science and forms a core presupposition that few ever think to question.
Treated purely as a heuristic, a mental tool that fosters exploration, the concept of natural law has proven to be very valuable. The difficulty creeps in when natural laws are treated, not as useful summaries of regularities in the world of experience, but as the realities of which the world of experience is a confused and imprecise reflection. It’s this latter sort of thinking that drives the insistence, very common in some branches of science, that a repeatedly observed and documented phenomenon can’t possibly have taken place, because the existing body of theory provides no known mechanism capable of causing it. The ongoing squabbles over acupuncture are one example out of many: Western medical scientists don’t yet have an explanation for how it functions, and for this reason many physicians dismiss the reams of experimental evidence and centuries of clinical experience supporting acupuncture and insist that it can’t possibly work.
That’s the kind of difficulty that lands rationalists in the trap discussed in last week’s post, and turns ages of reason into ages of unreason: eras in which all collective action is based on some set of universally accepted, mutually supporting, logically arranged beliefs about the cosmos that somehow fail to make room for the most crucial issues of the time. It’s among history’s richer ironies that the beliefs central to ages of unreason so consistently end up clustering around a civil religion: a figurative narrative that’s no more subject to logical analysis than the theist religions it replaced, that’s treated as self-evidently rational and true precisely because it can’t stand up to any sort of rational test, and that nonetheless provides a foundation for collective thought and action that can’t be supplied in any other way. The further irony is that it’s usually the core beliefs of the civil religion that turn out to be the Achilles’ heel of the entire system.
All of the abstract conceptions of classical Roman culture thus came to cluster around the civil religion of the Empire, a narrative that defined the cosmos in terms of a benevolent despot’s transformation of primal chaos into a well-ordered community of hierarchically ranked powers. Jove’s role in the cosmos, the Emperor’s role in the community, the father’s role in the family, reason’s role in the individual—all these mirrored one another, and provided the core narrative around which all the cultural achievements of classical society assembled themselves. The difficulty, of course, was that in crucial ways, the cosmos refused to behave according to the model, and the failure of the model cast everything else into confusion. In the same way, the abstract conceptions of contemporary industrial culture have become dependent on the civil religion of progress, and are at least as vulnerable to the spreading failure of that secular faith to deal with a world in which progress is rapidly becoming a thing of the past.
It’s here that reflection, the third mode of thinking discussed in last week’s post, takes over the historical process. Reflection, thinking about thinking, is the most recent of the modes and the least thoroughly debugged. During most phases of the historical cycle, it plays only a modest part, because its vagaries are held in check either by traditional religious figurations or by rational abstractions. Many religious traditions, in fact, teach their followers to practice reflection using formal meditation exercises; most rationalist traditions do the same thing in a somewhat less formalized way; both are wise to do so, since reflection limited by some set of firmly accepted beliefs is an extraordinarily powerful way of educating and maturing the mind and personality.
The trouble with reflection is that thinking about thinking, without the limits just named, quickly shows up the sharp limitations on the human mind mentioned earlier in this essay. It takes only a modest amount of sustained reflection to demonstrate that it’s not actually possible to be sure of anything, and that way lies nihilism, the conviction that nothing means anything at all. Abstractions subjected to sustained reflection promptly dissolve into an assortment of unrelated figurations; figurations subjected to the same process dissolve just as promptly into an assortment of unrelated sense data, given what unity they apparently possess by nothing more solid than the habits of the human nervous system and the individual mind. Jean-Paul Sartre’s fiction expressed the resulting dilemma memorably: given that it’s impossible to be certain of anything, how can you find a reason to do anything at all?
It’s not a minor point, nor one restricted to twentieth-century French intellectuals. Shatter the shared figurations and abstractions that provide a complex literate society with its basis for collective thought and action, and you’re left with a society in fragments, where biological drives and idiosyncratic personal agendas are the only motives left, and communication between divergent subcultures becomes impossible because there aren’t enough common meanings left to make that an option. The plunge into nihilism becomes almost impossible to avoid once abstraction runs into trouble on a collective scale, furthermore, because reflection is the automatic response to the failure of a society’s abstract representations of the cosmos. As it becomes painfully clear that the beliefs of the civil religion central to a society’s age of reason no longer correspond to the world of everyday experience, the obvious next step is to reflect on what went wrong and why, and away you go.
It’s probably necessary here to return to the first point raised in this week’s essay, and remind my readers that the fact that human thinking has certain predictable bugs in the programming, and tends to go haywire in certain standard ways, does not make human thinking useless or evil. We aren’t gods, disembodied bubbles of pure intellect, or anything else other than what we are: organic, biological, animal beings with a remarkable but not unlimited capacity for representing the universe around us in symbolic form and doing interesting things with the resulting symbols. Being what we are, we tend to run up against certain repetitive problems when we try to use our thinking to do things for which evolution did little to prepare it. It’s only the bizarre collective egotism of contemporary industrial culture that convinces so many people that we ought to be exempt from limits to our intelligence—a notion just as mistaken and unproductive as the claim that we’re exempt from limits in any other way.
Fortunately, there are also reliable patches for some of the more familiar bugs. It so happens, for example, that there’s one consistently effective way to short-circuit the plunge into nihilism and the psychological and social chaos that results from it. There may be more than one, but so far as I know, there’s only one that has a track record behind it, and it’s the same one that provides the core around which societies come together in the first place: the raw figurative narratives of religion. What Spengler called the Second Religiosity—the renewal of religion in the aftermath of an age of reason—thus emerges in every civilization’s late history as the answer to nihilism; what drives it is the failure of rationalism either to deal with the multiplying crises of a society in decline or to provide some alternative to the infinite regress of reflection run amok.
Religion can accomplish this because it has an answer to the nihilist’s claim that it’s impossible to prove the truth of any statement whatsoever. That answer is faith: the recognition, discussed in a previous post in this sequence, that some choices have to be made on the basis of values rather than facts, because the facts can’t be known for certain but a choice must be made anyway—and choosing not to choose is still a choice. Nihilism becomes self-canceling, after all, once reflection goes far enough to show that a belief in nihilism is just as arbitrary and unprovable as any other belief; that being the case, the figurations of a religious tradition are no more absurd than anything else, and provide a more reliable and proven basis for commitment and action than any other option.
The Second Religiosity may or may not involve a return to the beliefs central to the older age of faith. In recent millennia, far more often than not, it hasn’t. As the Roman world came apart and its civil religion and abstract philosophies failed to provide any effective resistance to the corrosive skepticism and nihilism of the age, it wasn’t the old cults of the Roman gods that became the nucleus of a new religious vision, but new faiths imported from the Middle East, of which Christianity and Islam turned out to be the most enduring. Similarly, the implosion of Han dynasty China led not to a renewal of the traditional Chinese religion, but to the explosive spread of Buddhism and of the newly invented religious Taoism of Zhang Daoling and his successors. On the other side of the balance is the role played by Shinto, the oldest surviving stratum of Japanese religion, as a source of cultural stability all through Japan’s chaotic medieval era.
It’s still an open question whether the religious forms that will be central to the Second Religiosity of industrial civilization’s twilight years will be drawn from the existing religious mainstream of today’s Western societies, or whether they’re more likely to come either from the bumper crop of religious imports currently in circulation or from an even larger range of new religious movements contending for places in today’s spiritual marketplace. History strongly suggests, though, that whatever tradition or traditions break free from the pack to become the common currency of thought in the postrationalist West will have two significant factors in common with the core religious movements of equivalent stages in previous historical cycles. The first is the capacity to make the transition from the religious sensibility of the previous age of faith to the emerging religious sensibility of the time; the second is a willingness to abandon the institutional support of the existing order of society and stand apart from the economic benefits as well as the values of a dying civilization. Next week’s post will offer some personal reflections on how that might play out.