Wednesday, December 31, 2008

The end of one year and the beginning of another has been a time for celebration and reflection since around the time calendars were invented, and even though the date has been kicked around the yearly cycle pretty comprehensively by history’s boot – it hasn’t been that long, all things considered, since the civil year in the English-speaking world began in late March – there’s a point to the custom. Our individual lives have their turning points, and so does the collective life of communities and cultures; the hinge of time when one year changes to another provides a useful reminder of such things. It’s in this spirit that I want to wrap up one of the threads of discussion that’s shaped my posts on The Archdruid Report for several weeks now.
Several times now in these essays, I’ve brought up the names of some of the major theorists of cyclic history – Giambattista Vico, Oswald Spengler, Arnold Toynbee – and talked a little about how their ideas illuminate the current crisis of industrial civilization. For the last three centuries, the tradition these authors and their works embody has challenged the historicist faith discussed in last week’s post: the belief that history has an arrow with the words “this way only” painted on it somewhere; that, in other words, it has a direction, a purpose and a goal. If a meaningful sense of history is a tool worth having as we face the predicament of our time, and historicism does not provide such a sense – and to my mind, at least, both these assertions are far more true than not – the vision of cyclic history is one place where something more useful might be found.
Mind you, cyclic and historicist views of history are both out of fashion these days; there is no shortage of scholars who lump them together as “metanarratives,” and insist that they should be banned from serious history. The problem with this insistence is that human beings think in stories as inevitably as they walk with feet. Attempting to chase metanarratives out of history simply results in assaults on those metanarratives unpopular enough to be noticed, while those that are accepted unthinkingly slip past the sentries with ease. The statement “history follows no pattern,” after all, is itself a metanarrative: a narrative about historical narratives that embodies a particular approach to historical knowledge. Thus attempts to talk about the shape of history should not be dismissed out of hand; the question that needs to be asked of them is simply whether they help to make sense of the course of historical events.
Yet this question itself can be read in more than one way. Historicist and cyclic theories of history both try to make sense of history, but they try to make different kinds of sense; they get different answers because they ask fundamentally different questions. At the core of historicism is the intuition that history has a meaning, while at the core of the cyclic vision is the intuition that history has a pattern – and “meaning” and “pattern” are by no means interchangeable terms. Most historicist theories, mind you, find pattern as well as meaning in history. Most cyclic theories, by contrast, leave questions of the meaning of history entirely open, and some – Oswald Spengler was particularly outspoken in this regard – reject the idea that history as a whole has any meaning or purpose with as much vehemence as any positivist.
Spengler’s reasons for this rejection are worth examining, because his rejection of historicism went deeper than just about any other thinker I can name. He argued that history can have no overall meaning, because it’s impossible to talk of meaning at all except within the worldview of a given culture; each culture evolves its own distinct way of experiencing human life in the universe, and the only meaning humans can know is embodied in these distinctive worldviews. No culture’s worldview is more or less true than any other, nor are the worldviews of cultures that arise later on in history an improvement in any sense on the ones that came before; each culture defines reality uniquely through its own dialogue with the inscrutable patterns of nature and the human experience. Interestingly, Spengler applied this logic to his own work as well; he offered his theory not as an objective truth about historical cycles, but simply as the best account of historical cycles that could be given from within the perspective of modern Western – in his terms, Faustian – humanity.
When it gets past superficialities, much of the criticism directed at Spengler’s work over the last nine decades has taken aim squarely at his insistence that every culture’s worldview is equally valid, and that humanity therefore does not progress. What makes his resolute rejection of our culture’s superiority unacceptable to so many people, though, is precisely that it offends against the pervasive historicism of our age. Only the belief that history is headed somewhere in particular, with our civilization presumably in the lead, makes his thesis in any way problematic.
For what it’s worth, I think that Spengler was right in principle but wrong on a minor but important detail. He was certainly right to point out that trying to rank worldviews of different cultures according to some scheme of progress or other yields self-serving nonsense. Ancient Egyptians understood the universe in one way, and modern Americans understand it in another, not because Americans are right and Egyptians were wrong – or vice versa! – but because the two cultures were not talking about the same things, nor were they using the same symbolic language for the discussion. A worldview based on explorations of the metaphysics of human life in the language of myth cannot meaningfully be judged by the standards of a worldview that takes analysis of the physical world in the language of mathematics as its starting point.
To say that the industrial world’s technological progress proves the superiority of its worldview merely begs the question, since the Egyptians did not value technological progress. They valued cultural stability and they achieved it, maintaining cultural continuity for well over 3000 years – a feat our own civilization is not likely to equal. By their standards, for that matter, our society’s ephemeral fashions, ceaseless cultural turmoil, and incoherent metaphysics would have branded it as an abject failure at the most basic tasks of human social life.
As I see it, though, Spengler undervalued the process by which certain kinds of technique invented by one culture can enrich later cultures. A very relevant example is classical logic, among the supreme achievements of the Apollonian culture, which was inherited in turn by the Indian, Syrian-Byzantine-Arabic (in Spengler’s language, Magian), and Faustian cultures. No two of these cultures did the same thing with that inheritance; a toolkit the Greeks devised to pick apart spoken language was used in India to analyze the structures of consciousness, in the Levant to contemplate the glories of God, and in Europe and the European diaspora to unravel the mysteries of matter. Without Greek logic, though, some of the greatest creations of all three inheritor cultures – the rich philosophical dimensions of Hinduism and Buddhism, the great theological syntheses of Islam and Christianity, or the fusion of logic with experience that gave rise to the modern scientific method – certainly could not have been achieved as easily, and quite possibly might not have happened at all.
What this implies is that, while history is not directional, it can be cumulative. Nothing in the history of cultures older than Greece suggests that the emergence of logic was inevitable, just as nothing in the subsequent history of logic justifies the claim that logic is developing toward some goal or other. Still, the toolkit of logic, absent before the Greeks, enriched a series of cultures that flourished after them. There are countless examples, and they span the full range of human cultural creations; for a small but telling instance, consider how the practice of counting prayers on a string of beads, which originated in India, has spread through most of the world’s religions. For another, consider the way that forty centuries of East Asian intensive agriculture inspired the emergence of organic growing methods that are probably our best bet for tomorrow’s food supply. Every person who finds spiritual solace in prayer or meditation with a rosary, or is planning a backyard organic garden to help put food on the table next year, has good reasons to be grateful for the slow accumulation of technique over time.
Thus there’s a fine irony in the insistence by so many people these days that evolution will shortly relieve us of the necessity to deal with the consequences of our own mistakes, and get history back on track to their imagined goal. They’re right that the historical changes under way now are evolutionary in nature; their mistake lies in thinking, to put the matter perhaps a bit too harshly, that evolution is some sort of cosmic tooth fairy who can be counted on to leave a shiny new future under the modern world’s pillow to replace one rotted away by three centuries of extravagant living. Instead, the historical development of cultures parallels the way that evolution actually works in nature. Cultures, like species, tend to collect those adaptations that meet their needs, and discard the ones that don’t. Thus those techniques that happen to meet the needs of more than one culture tend to survive more often than those that don’t, just as those cultures that are able to make use of a suitable range of inherited techniques are more likely to thrive than those that do not.
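For readers who like to see that selection logic spelled out, here is a minimal toy model; it is my own illustrative sketch in Python, not anything drawn from Spengler or from evolutionary theory proper, and every name and number in it is an arbitrary assumption. Techniques invented in each cultural era pass through a choke point, and the odds that any one of them survives rise with the number of cultures that find it useful, so the surviving toolkit tends to pile up even though nothing in the model points in any particular direction.

```python
import random

# Toy model of technique survival across civilizational "choke points."
# Every number here is an arbitrary assumption, chosen only for illustration.

random.seed(42)

NUM_TECHNIQUES = 200      # techniques invented in each cultural era
NUM_ERAS = 5              # successive eras, each ending in a choke point
BASE_SURVIVAL = 0.05      # survival odds for a technique no other culture wants
PER_CULTURE_BONUS = 0.25  # added survival odds per culture that finds it useful

surviving = []            # each entry is a technique's "usefulness count"
for era in range(1, NUM_ERAS + 1):
    # Each era invents new techniques, useful to anywhere from 0 to 3 other cultures.
    new_techniques = [random.randint(0, 3) for _ in range(NUM_TECHNIQUES)]
    pool = surviving + new_techniques
    # The choke point: techniques useful to more cultures survive more often.
    surviving = [u for u in pool
                 if random.random() < BASE_SURVIVAL + PER_CULTURE_BONUS * u]
    print(f"after era {era}: {len(surviving)} techniques survive")
```

Run it and the count of surviving techniques tends to grow era by era, which is all that cumulation without direction amounts to.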
I trust none of my readers are drowsy enough by this point to think that I am suggesting that the accumulation of useful techniques is the meaning, purpose, or goal of history. From my point of view, for whatever that may be worth, meanings, purposes, and goals are not to be found in any objective sense in the brute facts of existence; they are always and only attributes applied creatively to existence by conscious persons, and the emergence of meanings, purposes and goals common to more than one person depends on the relation between the person proposing these things and those who choose to accept or reject them. (Atheists may read this statement in one sense, and religious people in quite another; interestingly enough, the logic works either way.)
Like biological evolution, though, the cultural evolution I am proposing here is in no way inevitable. The crises that surround the decline and fall of civilizations, in particular, very often become massive choke points at which many valuable things are lost. One reasoned response to the approach of such a choke point in our own time thus might well be a deliberate effort to help the legacy of the present reach the waiting hands of the future. The same logic that leads the ecologically literate to do what they can to keep threatened species alive through the twilight of the industrial age, so that biological evolution has as wide a palette of raw materials as possible in the age that follows, applies just as well to cultural evolution.
Thus it may not be out of place to imagine a list of endangered knowledge to go along with today’s list of endangered species, and to take broadly equivalent steps to preserve both. There are certainly other meanings, purposes and goals that can be found in, or more precisely applied to, either the inkblot patterns of history as a whole or the specific challenges we face right now, in the early stages of industrial civilization’s decline and fall. We can decide as individuals whether to build on the heritage of our culture, to explore the legacies that have been handed down to us from other cultures, or to scrap the lot and try to break new ground, knowing all the while that other individuals will make their own choices, and that the relative success of the results, rather than any preference of ours, will determine which of them plays the largest role in shaping the future.
My own choice centers on the preservation of those parts of the modern world’s heritage that I find most valuable, and most promising, as tools for the futures that seem most likely to me. If that way of putting things seems uncomfortably subjective, personal, and even arbitrary, dear reader, you’re beginning to get the point of the last month or two of Archdruid Report posts. Our own subjective, personal, and arbitrary perceptions are the only things we have to go on, and the results tend to be much less problematic when we accept this fact, rather than trying to cast the shadows of our desires onto history’s arc and stare at them in the fond delusion that we’re staring destiny in the face.
One way or another, we all have choices to make as the new year dawns. Some of us will face the harsh decisions that come with unemployment, foreclosure, and bankruptcy; others will encounter the moral challenges that face those who have wealth while others go hungry; still others will have other choices. Not everyone will be at liberty to take the deindustrial future into account as they make their choices, but I hope some will do so, and whatever you choose in this regard – whether or not it corresponds to any of the things I’ve discussed here – it might be wise to take action on the basis of your decisions sooner rather than later. A year, after all, is not the only thing that’s ending around us just now.
Wednesday, December 24, 2008
History's Arrow
One of the advantages of being a Druid is that you get to open your holiday presents four days early. Last Sunday’s winter solstice was pleasant, with a scattering of snow on the ground outside and candles burning indoors as we celebrated the rebirth of the sun. As one hinge of the year’s cycle, the solstice is a good time to ponder the shape of time: on the small scale, with hopes for the year to come and memories of the one now passing; the middle scale, as I think back on past holidays and the uncertain number that still lie ahead; and the large scale, with which this blog is mostly concerned. In keeping with that seasonal theme, I want to talk a bit about history on the large scale, and the ideas our culture uses to frame the idea of history.
One of the things that has interested me most about the reactions to the ideas about the shape of the future I’ve presented here on The Archdruid Report is the extent to which so many of them presuppose one particular way of thinking about history. Like the character in one of Molière’s plays who was astonished to find that he had been speaking prose all his life, a great many people these days have embraced a distinctive philosophy of history, but seem never quite to have noticed that fact.
This is hardly a new thing. One of the ironies of the history of ideas is the way that so many cultural themes, surfacing first in avant-garde intellectual circles, are dismissed out of hand by the grandparents of those who will one day treat them as obvious facts. Modern nationalism, to cite one example out of many, began with the romantic visions of a few European poets, spilled out into the world largely through music and the arts, and turned into a massive political force that shredded the political maps of four continents. To some extent, this is the intellectuals’ revenge on an unreflective society: the men of affairs who treat the arts as amenities and dismiss philosophy as worthless abstraction spend their workdays unknowingly mouthing the words of dead philosophers and acting out the poems they never read on the stage of current events.
The way of thinking about history I have in mind today has followed the same trajectory. Karl Popper, who devoted much of his career to critiquing it, called it historicism. This is the belief that history as a whole moves inevitably in a single direction that can be known in advance by human beings. Exactly what that single direction is supposed to be varies from one historicist to another; choose any point along the spectrum of cultural politics, and you can find a version of historicism that treats the popular ideals and moral concerns common to that viewpoint as the linchpin of the historical process. The details differ; the basic assumption remains the same.
That same assumption has also spread to infect nearly every contemporary discussion of change over time. After my post “Taking Evolution Seriously” appeared a few weeks back, for example, one of my longtime readers forwarded me comments from a discussion on an email list, whose members took me to task in no uncertain terms for my discussion of the evolutionary process. When I said that no organism is “more evolved” than any other and that evolution has no particular direction or goal, they insisted, I was simply wrong; evolution progresses in the direction of increased complexity over time, one person claimed, and another suggested that I would be better informed if I read more of the writings of the late Stephen Jay Gould.
Now I have no objection to reading more of Gould’s work, as I’ve already enjoyed many of his books. For that matter, I’ve read a fair amount of evolutionary theory, beginning with Darwin and continuing through some of the most recent theorists, and also took college courses in evolutionary ecology and several related branches of environmental science. One thing this taught me is that attempts are always being made to stuff evolution into a historicist straitjacket. Another thing I learned is that these attempts are rejected by the great majority of evolutionary biologists, because the evidence simply doesn’t fit.
Some evolutionary lineages have moved from more simple to more complex forms over time, but others have gone in the other direction, and the vast majority of living things on Earth today belong to phyla that have not added any noticeable complexity since the Paleozoic. Nor has the Earth’s biosphere as a whole become more complex; the entire Cenozoic era – the 65 million years between the last dinosaurs and us – has been less biologically rich than the Mesozoic era that preceded it, and the global cooling of the last fifteen million years or so has seen a decrease in the world’s biological complexity, as ecosystems have adapted to the more rigorous conditions that have spread over much of the world.
The facts on the ground, then, simply don’t support any claim that evolution moves toward greater complexity. No other version of historicism fares any better when applied to evolution, either. Yet ninety-nine times out of a hundred, when you hear people outside of a university biology department talking about evolution, what they have in mind is a linear process leading in a particular direction. They are, in other words, talking historicism.
Trace these ideas back along their own evolutionary lineage and a fascinating history emerges. The founder of the current of thought that gave rise to today’s historicism was an Italian monk named Joachim of Flores, who lived from 1145 to 1202 and spent most of the latter half of his life writing abstruse books on theology. Most Christian theologians before his time accepted Augustine of Hippo’s famous distinction between the City of God and the City of Man, and assigned all secular history to the latter category, one more transitory irrelevance to be set aside by the soul in search of salvation. Joachim’s innovation was the claim that the plan of salvation works through secular history. He argued that all human history, secular as well as sacred, was divided into three ages, the age of Law under the Old Testament, the age of Love under the New, and the age of Liberty that was about to begin.
Some of his theories were formally condemned by church councils, but his core theory proved unstoppable. Every generation of church reformers from the thirteenth century to the eighteenth seized on his ideas and claimed that their own arrival marked the coming of the age of Liberty; every generation of church conservatives stood Joachim on his head, insisted that the three ages marked the progressive loss of divine guidance, and portrayed the arrival of the latest crop of reformers as Satan’s final offensive. As secular thought elbowed theology aside, in turn, Joachim’s notion of history as the working out of a divine plan got reworked into secular theories of humanity’s grand destiny.
Notable among these was the theory argued by the Marquis de Condorcet in Sketch for a Historical Picture of the Progress of the Human Spirit in 1794. A rich historical irony surrounds this work; Condorcet had been a strong supporter of the French Revolution, and hoped that the end of the monarchy would usher in a republic of reason; instead, he was condemned to death by the new government and wrote his Sketch while he was on the run from the guillotine. He nonetheless described human history as an inevitable rise from barbarism to a future of reason and progress in which all of human life would undergo endless improvement.
Condorcet’s faith in perpetual progress found many listeners, but a more influential voice was already waiting in the wings: Georg Wilhelm Friedrich Hegel, who managed the rare feat of becoming both the most influential and the most unreadable philosopher of modern times. In his Philosophy of History, which was published shortly after his death in 1831, he argued that history was the process by which human freedom (which, for him, was not quite the freedom of the individual; he idolized Napoleon and the government of Prussia) was maximized in time. In Hegel’s mind, Joachim’s threefold rhythm of history was reworked into the three phases of thesis, antithesis, and synthesis, by which every opposition was resolved into a higher unity.
Hegel’s view of history became enormously influential, less through his own work – I challenge any of my readers to plow through the Philosophy of History and come out the other side with anything but a headache – than through the writings of those influenced by him. Political radicals at both ends of the spectrum jumped on Hegel’s ideas; on the left, Karl Marx used Hegelian ideas as the foundation for his philosophy of class warfare and Communist revolution; on the right, Giovanni Gentile, the pet philosopher of Mussolini’s Fascist regime, was a rigorous Hegelian. For that matter, Francis Fukuyama, who played a role much like Gentile’s for the neoconservative movement, drew his theory of an end to history from Hegel.
Still, the spread of Hegel’s ideas isn’t limited to the radical fringes, or even to those who know who Hegel was. I think most people who have been following the issue of peak oil for more than a few months have noticed that when the subject comes up for discussion in public, one of the most common responses is “Oh, they’ll think of something.” Ask the person who says this to explain, and odds are you’ll be told that every time the world runs out of some resource, “they” find something new, and the result is more progress. This is Hegel reframed in terms of economics; shortage is the thesis, ingenuity the antithesis, and progress the synthesis; the insistence that the process is inevitable puts the icing on the Hegelian cake. More generally, the logic of historicism governs the entire narrative: history’s arrow points in the direction of progress, and so whatever happens, the result will be more progress.
Examples could be added by the page, but I hope the point has been made. Still, it’s crucial to realize just how deeply historicism has become entrenched in all modern thinking. If, dear reader, you think yourself untouched by it, I encourage you to try a thought experiment. The average species, paleontologists tell us, lasts around ten million years. Imagine that by some means – a visit from a time machine, say, that leaves you holding a history of humanity written by an intelligent species descended from chipmunks – you find out that this is how long we have. We won’t achieve godhood, or reach the stars, or destroy the planet, or enter Utopia; instead, the nine million years we’ve got left will be like recorded history so far. Civilizations will rise and fall; our species will create great art and literature, interpret the universe in various ways, explore many modes of living on the Earth; finally, millions of years from now, it will slowly lose the struggle for survival, dwindle to small populations in isolated areas, and go extinct.
If that turns out to be humanity’s future, would you be satisfied with it? Or would you feel that some goal has been missed, some destiny betrayed? If the latter, what makes you think that?
Now of course it may be a waste of breath to contend with ideas as pervasive and deeply rooted as historicism, but the effort has to be made, if only because historicism has a dismally bad track record as a basis for prophecy. Name a historicist belief system that’s been around more than a few years, right back to Joachim of Flores himself, and you’ll find a trail of failed predictions of the imminent arrival of the goal of history. (Joachim himself apparently believed that the age of Liberty would arrive in 1260; no such luck.) If we are to have any useful sense of the future ahead of us, historicist belief systems are among the worst sources of guidance available to us.
Fortunately there are other choices. In next week’s post, I plan on talking about some of those. In the meantime, best holiday wishes to all my readers – whatever holidays you celebrate at this time of year.
Wednesday, December 17, 2008
Why Dissensus Matters
During the last month or so, these essays have tried to present an extended critique of the very common notion that we can collectively plan for, and achieve, the future that we decide we want. By now, though, that point has been pushed about as far as it will go. Those of my readers who are going to get my point have gotten it, and are likely more than ready to go on to something else, while those who continue to believe they can order up a future to go will continue to believe that no matter what gets said here.
Still, that discussion leads on to a further question, one that can’t reasonably be avoided here. Given that I don’t think much of the prospects of planning a desirable future and then making it happen, what am I suggesting instead? That’s a harder question to answer than to ask, because the only answer I have to offer presupposes certain things about what we can know about the future, and those have to be clarified first. The same thing is true, to be sure, about the attempts to plan the future I’ve critiqued. If we could know where history is headed and what influence our actions could have on it, making firm plans for the future would be a safe bet. Equally, if it’s impossible for us to know anything at all about the future, then all bets are off, no course of action is more likely to succeed than any other, and the only option left would be moment-by-moment improvisation.
Still, it seems to me that neither of these extremes fits our situation. There are certainly things ahead that we will never expect until they show up on the doorstep, but not everything about the future falls into this category. An interesting distinction also lies between many of the things we can know about and many of the things we can’t: very often, we have no way of knowing what will happen, but we can predict very accurately that certain things won’t happen, and we can also accurately predict the kinds of things that will happen.
A specific example may help show how this works. Some years ago, when the late housing bubble was shifting into overdrive, quite a number of people – I was among them, though I didn’t have a public platform for my predictions in those days – noted the acceleration in housing prices and drew two conclusions. The first was that those who insisted real estate could increase in value forever were wrong: not just a little bit wrong, but utterly, catastrophically wrong. The second was that if real estate kept zooming up, there was going to be a massive crash. (Those who see this as 20/20 hindsight may visit the Housing Panic blog, one place where these predictions appeared.)
Both predictions, it’s worth noting, were based on the evidence of history. Ever since market economies evolved the capacity to support speculative bubbles, people have lost their senses at intervals over some investment or other: tulips, stocks, real estate, precious metals, commodities, you name it. The infallible sign that this has happened is the claim that the investment in question is exempt from supply and demand and will just keep increasing in value. I think most of us remember when exactly these things were said about tech stocks, and it’s been rather less than a year since many people insisted that the same things were true about petroleum: wrong in both cases, and in every other case in human history. Thus we can know something about the future: we can accurately predict that no speculative investment will rise in value indefinitely.
The second prediction followed on the heels of the first. Because millions of people were climbing aboard the real estate bandwagon, and prices were zooming upwards, it was a safe bet that eventually prices would slump, people would sell off their investments, and the result would be a crash; that’s the way every speculative bubble in history has ended. Thus the bloggers on Housing Panic knew the kind of thing the future would hold: a collapse in real estate prices in which a great many people would lose a huge amount of money. Those who noticed that banks were loaning money recklessly to speculators also knew that many banks would go under; again, that’s the kind of thing that happens when greed trumps caution and banks forget that money should only be loaned to people who can pay it back.
What nobody knew was when the crash would come, what would trigger it, and how it would play out. This is the difference between knowing what kind of thing will happen and knowing what will happen. Nothing is more difficult than timing a bubble. Isaac Newton, arguably one of the brightest human beings who ever lived, tried to time the market during the South Sea Bubble and lost most of his money. (Any of my readers who consider themselves smarter than Newton are invited to try to predict the turning point of the current bubble in US treasury bills. Since the bursting of that bubble will probably put what’s left of the global economy into cardiac arrest, this is by no means a purely academic exercise.)
These same considerations apply to any attempt to predict the future, and in particular to the central theme of this blog, the twilight of industrial civilization and the long descent into a new dark age. Civilizations, like speculative bubbles, have promoters who insist they can keep on going forever; just as with bubbles, announcements of that sort have historically been a clear sign that serious trouble is not too far off. It’s a safe bet, in any case, that every bubble will pop and every civilization will decline and fall. Those who are heavily invested in a particular bubble or civilization will of course insist that it’s different this time, just as their predecessors did; those claims have been wrong so far, and the evidence isn’t favoring them this time, either.
It’s quite possible, in turn, to predict the kind of things that will happen as industrial civilization lurches down the uneven slope of decline. Plenty of civilizations have done that before, and the common features stand out clearly from history; some of these features are already visible in the present case – it’s educational to page through Spengler or Toynbee and note how many features of a declining civilization had not yet appeared in their time, but have shown up on schedule in ours. What nobody can know in advance is just how these trends will work out in detail.
This is the perspective on the future that frames the proposals I’ve made here and elsewhere for coping with the long descent ahead of us. It’s certainly possible to know in advance some of the things that won’t happen. For example, declining civilizations always seem to get prophets who insist that some vast and improbable transformation will suddenly replace their civilization with the kind of society they would rather inhabit. They are always wrong, and such prophecies should be seen as signs of the times rather than knowledge about what will actually happen.
Set such fantasies aside, and it’s not that difficult to predict the kind of things that will happen as our civilization runs down. Mass migrations, for example, usually take place when civilizations collapse; the tidal force of migrant workers and refugees streaming across today’s borders is already making headlines, so it’s a safe bet this process will shift into high gear in the future. On the other hand, it’s anyone’s guess how those migrations will affect individuals and communities in any given corner of the world. I’ve suggested in these essays, for instance, that the western shores of North America may end up receiving some millions of refugees by sea from Japan. The Japanese islands can only support a small fraction of their current population on local resources; the northern Pacific currents go the right direction, and Japan has a huge and capable merchant marine and fishing fleet, so means, motive, and opportunity are present.
None of this makes the arrival of the first rusting container ship full of refugees on an Oregon beach a certainty. For all we know, Japan might purchase eastern Siberia from a disintegrating Russia thirty years from now, and settle its extra population there; it might go to war with China and suffer losses so drastic that the point becomes moot; or some other unexpected turn of events might set history in motion down a different path. What we do know is that as fossil fuels run short and importing food becomes a strategy without a future, a large fraction of the population of Japan will either relocate or die; what they do about that bitter choice is less predictable.
Thus the knowledge we can get provides no basis for making a future to order, but potentially allows room for something beyond improvisation. Since a miracle is not going to bail us out from the results of our collective mistakes, we need not waste time waiting for one, and can get to work in more practical ways. Since the kind of things that happen early in a civilization’s decline are tolerably well documented, we can assess trends already at work in the areas where we live, and guess at the near-term challenges we are likely to face. Since the endpoint of the process of decline is also fairly well documented, we can try to anticipate what things, readily available now, will be scarce and useful to our descendants, and do what we can to see that those things get passed down the chain of years to the waiting hands of the future.
Now it’s true, of course, that none of these options are foolproof. Even with the guidance of history, it’s possible to misjudge the shape of the future disastrously, and even those who guess the future in advance may not be able to avoid its dangers. This is why the concept of dissensus, introduced last week, is vital just now: as any ecologist can tell you, in the face of unpredictable change, the wider the range of variation in a species, the more likely that some of them will have what it takes to adapt. A monoculture of ideas, organizational styles, or paradigms is just as vulnerable as a monoculture of living things, and so our best option just now is to encourage disagreement, so as to foster as many different approaches to the future as possible.
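Since the ecological point here is a statistical one, a small Monte Carlo sketch can make it concrete. This is purely my own illustration, written in Python with made-up parameters rather than anything taken from the ecological literature: populations whose members vary more widely in some trait are far more likely to include at least a few individuals who can cope with whatever shift in conditions happens to arrive.

```python
import random

# Illustrative sketch: wider trait variation improves the odds that at least a
# few members of a population tolerate an unpredictable environmental shift.
# All parameters are made-up assumptions, not field data.

random.seed(1)

def persistence_odds(trait_spread, population=500, trials=1000):
    """Fraction of trials in which at least one individual tolerates the shift."""
    persisted = 0
    for _ in range(trials):
        shift = random.gauss(0.0, 3.0)  # the unpredictable change in conditions
        traits = (random.gauss(0.0, trait_spread) for _ in range(population))
        # An individual copes if its trait falls within one unit of the new optimum.
        if any(abs(trait - shift) < 1.0 for trait in traits):
            persisted += 1
    return persisted / trials

for spread in (0.5, 1.0, 2.0, 4.0):
    print(f"trait spread {spread}: population persists in "
          f"{persistence_odds(spread):.0%} of trials")
```

The populations with the widest spread of traits come through the shift far more often than the narrow ones do, which is the case for dissensus in miniature.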
The need for dissensus, it should be stressed, does not simply cover the technologies different individuals and groups might decide to pursue, the organizations they might choose to make or support, or the survival strategies that might seem most promising to them. It also reaches into the realm of ends. I have said this several times in recent posts, but it bears repeating: we have no idea what kind of society is best suited to a world after industrialism. It’s far more likely than not, in fact, that such a society will have little in common with the notions that middle-class intellectuals in the industrial world today might have of it. This doesn’t mean that we shouldn’t try to imagine such a society; it does mean that attempts to push diverse visions into a single consensus are as unproductive as they are futile.
Diversity in the realm of ends, finally, also applies to the most basic decisions about the way the predicament of our time is framed. For some people, the most meaningful challenge focuses on rebuilding communities to help them and their residents get through the end of the age of abundance. For others, it focuses on building new societies they hope will replace the one we have now. For still others, it focuses on developing new technologies, or rescuing old ones, to replace those that will stop working when today’s lavish energy supplies run out. There are those for whom raw survival is the most important thing, and there are those who have come to terms with the inevitability of death and are pursuing other goals.
Which of these choices is the best? Wrong question. All of them, and more, are necessary parts of a dissensus-based approach to the crisis of industrial civilization. As you read these words, members of a city council in a Midwestern college town may be mulling over a project that will pull their community through hard times, while activists one town over, with the best intentions in the world, devise a similar program that will fail and take their town’s future with it. One ecovillage in Ohio may be inventing social forms that will evolve into the neotribal societies of the 22nd century, while another attempt on similar lines sparks quarrels that tear a community to shreds. One hobbyist in Montana, staring at pictures of a 19th century solar steam engine, may start making the prototype of a machine that will become the prime energy source of the ecotechnic age, while others miss the necessary insight and waste their lives on dead ends.
What adds spice to the irony is that we have no way of knowing in advance which is which. All any of us can do is pursue the work that calls to us individually, cooperate with others who share the same commitment, take the measures to weather the crisis that seem to make sense from where we are, and remember that those who disagree with us most heartily may be assembling their own piece of a puzzle that is, ultimately, bigger than any of us.
Still, that discussion leads on to a further question, one that can’t reasonably be avoided here. Given that I don’t think much of the prospects of planning a desirable future and then making it happen, what am I suggesting instead? That’s a harder question to answer than to ask, because the only answer I have to offer presupposes certain things about what we can know about the future, and those have to be clarified first. The same thing is true, to be sure, about the attempts to plan the future I’ve critiqued. If we could know where history is headed and what influence our actions could have on it, making firm plans for the future would be a safe bet. Equally, if it’s impossible for us to know anything at all about the future, then all bets are off, no course of action is more likely to succeed than any other, and the only option left would be moment-by-moment improvisation.
Still, it seems to me that neither of these extremes fits our situation. There are certainly things ahead that we will never expect until they show up on the doorstep, but not everything about the future falls into this category. An interesting distinction also lies between many of the things we can know about and many of the things we can’t: very often, we have no way of knowing what will happen, but we can predict very accurately that certain things won’t happen, and we can also accurately predict the kinds of things that will happen.
A specific example may help show how this works. Some years ago, when the late housing bubble was shifting into overdrive, quite a number of people – I was among them, though I didn’t have a public platform for my predictions in those days – noted the acceleration in housing prices and drew two conclusions. The first was that those who insisted real estate could increase in value forever were wrong: not just a little bit wrong, but utterly, catastrophically wrong. The second was that if real estate kept zooming up, there was going to be a massive crash. (Those who see this as 20/20 hindsight may visit the Housing Panic blog, one place where these predictions appeared.)
Both predictions, it’s worth noting, were based on the evidence of history. Ever since market economies evolved the capacity to support speculative bubbles, people have lost their senses at intervals over some investment or other: tulips, stock, real estate, precious metals, commodities, you name it. The infallible sign that this has happened is the claim that the investment in question is exempt from supply and demand and will just keep increasing in value. I think most of us remember when exactly these things were said about tech stocks, and it’s been rather les than a year since many people insisted that the same things were true about petroleum: wrong in both cases, and in every other case in human history. Thus we can know something about the future: we can accurately predict that no speculative investment will rise in value indefinitely.
The second prediction followed on the heels of the first. Because millions of people were climbing aboard the real estate bandwagon, and prices were zooming upwards, it was a safe bet that eventually prices would slump, people would sell off their investments, and the result would be a crash; that’s the way every speculative bubble in history has ended. Thus the bloggers on Housing Panic knew the kind of thing the future would hold: a collapse in real estate prices in which a great many people would lose a huge amount of money. Those who noticed that banks were loaning money recklessly to speculators also knew that many banks would go under; again, that’s the kind of thing that happens when greed trumps caution and banks forget that money should only be loaned to people who can pay it back.
What nobody knew was when the crash would come, what would trigger it, and how it would play out. This is the difference between knowing what kind of thing will happen and knowing what will happen. Nothing is more difficult than timing a bubble. Isaac Newton, arguably one of the brightest human beings who ever lived, tried to time the market during the South Sea Bubble and lost most of his money. (Any of my readers who consider themselves smarter than Newton are invited to try to predict the turning point of the current bubble in US treasury bills. Since the bursting of that bubble will probably put what’s left of the global economy into cardiac arrest, this is by no means a purely academic exercise.)
These same considerations apply to any attempt to predict the future, and in particular to the central theme of this blog, the twilight of industrial civilization and the long descent into a new dark age. Civilizations, like speculative bubbles, have promoters who insist they can keep on going forever; just as with bubbles, announcements of that sort have historically been a clear sign that serious trouble is not too far off. It’s a safe bet, in any case, that every bubble will pop and every civilization will decline and fall. Those who are heavily invested in a particular bubble or civilization will of course insist that it’s different this time, just as their predecessors did; those claims have been wrong so far, and the evidence isn’t favoring them this time, either.
It’s quite possible, in turn, to predict the kind of things that will happen as industrial civilization lurches down the uneven slope of decline. Plenty of civilizations have done that before, and the common features stand out clearly from history; some of these features are already visible in the present case – it’s educational to page through Spengler or Toynbee and note how many features of a declining civilization had not yet appeared in their time, but have shown up on schedule in ours. What nobody can know in advance is just how these trends will work out in detail.
This is the perspective on the future that frames the proposals I’ve made here and elsewhere for coping with the long descent ahead of us. It’s certainly possible to know in advance some of the things that won’t happen. For example, declining civilizations always seem to get prophets who insist that some vast and improbable transformation will suddenly replace their civilization with the kind of society they would rather inhabit. They are always wrong, and such prophecies should be seen as signs of the times rather than knowledge about what will actually happen.
Set such fantasies aside, and it’s not that difficult to predict the kind of things that will happen as our civilization runs down. Mass migrations, for example, usually take place when civilizations collapse; the tidal force of migrant workers and refugees streaming across today’s borders is already making headlines, so it’s a safe bet this process will shift into high gear in the future. On the other hand, it’s anyone’s guess how those migrations will affect individuals and communities in any given corner of the world. I’ve suggested in these essays, for instance, that the western shores of North America may end up receiving some millions of refugees by sea from Japan. The Japanese islands can only support a small fraction of their current population on local resources; the northern Pacific currents go the right direction, and Japan has a huge and capable merchant marine and fishing fleet, so means, motive, and opportunity are present.
None of this makes the arrival of the first rusting container ship full of refugees on an Oregon beach a certainty. For all we know, Japan might purchase eastern Siberia from a disintegrating Russia thirty years from now, and settle its extra population there; it might go to war with China and suffer losses so drastic that the point becomes moot; or some other unexpected turn of events might set history in motion down a different path. What we do know is that as fossil fuels run short and importing food becomes a strategy without a future, a large fraction of the population of Japan will either relocate or die; what they do about that bitter choice is less predictable.
Thus the knowledge we can get provides no basis for making a future to order, but potentially allows room for something beyond improvisation. Since a miracle is not going to bail us out from the results of our collective mistakes, we need not waste time waiting for one, and can get to work in more practical ways. Since the kind of things that happen early in a civilization’s decline are tolerably well documented, we can assess trends already at work in the areas where we live, and guess at the near-term challenges we are likely to face. Since the endpoint of the process of decline is also fairly well documented, we can try to anticipate what things, readily available now, will be scarce and useful to our descendants, and do what we can to see that those things get passed down the chain of years to the waiting hands of the future.
Now it’s true, of course, that none of these options are foolproof. Even with the guidance of history, it’s possible to misjudge the shape of the future disastrously, and even those who guess the future in advance may not be able to avoid its dangers. This is why the concept of dissensus, introduced last week, is vital just now: as any ecologist can tell you, in the face of unpredictable change, the wider the range of variation in a species, the more likely that some of them will have what it takes to adapt. A monoculture of ideas, organizational styles, or paradigms is just as vulnerable as a monoculture of living things, and so our best option just now is to encourage disagreement, so as to foster as many different approaches to the future as possible.
The need for dissensus, it should be stressed, does not simply cover the technologies different individuals and groups might decide to pursue, the organizations they might choose to make or support, or the survival strategies that might seem most promising to them. It also reaches into the realm of ends. I have said this several times in recent posts, but it bears repeating: we have no idea what kind of society is best suited to a world after industrialism. It’s far more likely than not, in fact, that such a society will have little in common with the notions that middle-class intellectuals in the industrial world today might have of it. This doesn’t mean that we shouldn’t try to imagine such a society; it does mean that attempts to push diverse visions into a single consensus are as counterproductive as they are futile.
Diversity in the realm of ends, finally, also applies to the most basic decisions about the way the predicament of our time is framed. For some people, the most meaningful challenge focuses on rebuilding communities to help them and their residents get through the end of the age of abundance. For others, it focuses on building new societies they hope will replace the one we have now. For still others, it focuses on developing new technologies, or rescuing old ones, to replace those that will stop working when today’s lavish energy supplies run out. There are those for whom raw survival is the most important thing, and there are those who have come to terms with the inevitability of death and are pursuing other goals.
Which of these choices is the best? Wrong question. All of them, and more, are necessary parts of a dissensus-based approach to the crisis of industrial civilization. As you read these words, members of a city council in a Midwestern college town may be mulling over a project that will pull their community through hard times, while activists one town over, with the best intentions in the world, devise a similar program that will fail and take their town’s future with it. One ecovillage in Ohio may be inventing social forms that will evolve into the neotribal societies of the 22nd century, while another attempt on similar lines sparks quarrels that tear a community to shreds. One hobbyist in Montana, staring at pictures of a 19th century solar steam engine, may start making the prototype of a machine that will become the prime energy source of the ecotechnic age, while others miss the necessary insight and waste their lives on dead ends.
What adds spice to the irony is that we have no way of knowing in advance which is which. All any of us can do is pursue the work that calls to us individually, cooperate with others who share the same commitment, take the measures to weather the crisis that seem to make sense from where we are, and remember that those who disagree with us most heartily may be assembling their own piece of a puzzle that is, ultimately, bigger than any of us.
Wednesday, December 10, 2008
Dissensus and Organic Process
In bringing up the vexed relationship between evolution as it happens in nature, on the one hand, and the ways the concept of evolution has been redefined in current ideologies on the other, last week’s Archdruid Report post dipped a tentative toe into some very deep and murky waters. For a century or more, ideas and metaphors from the natural sciences have become potent factors in the public life of the western world; terms such as “natural,” “organic,” and, yes, “evolution” have been caught up by any number of players in the scrimmage of contemporary culture, and more often than not have come out much the worse for wear.
There’s no shortage of ingenious ways to misuse concepts such as these, but one in particular has had a pervasive presence in our collective dialogue. Perhaps the best way to show it at work is to track the use of natural concepts in one of the towering creative minds of the twentieth century, American architect Frank Lloyd Wright.
Full disclosure probably requires me to admit up front that I’m a fan of Wright’s work, and not only because he was one of the handful of first-rate creative talents to have been influenced by the modern Druid tradition. In his quest for an organic architecture – notice the concept lifted from the life sciences – he reshaped the vocabulary of space and form in ways that are still being explored by architects today, and he also produced rather more than his share of stunningly beautiful buildings.
Still, few geniuses produce work without flaws, and Wright was no exception. Stewart Brand of Whole Earth Catalog fame has set out the case for the prosecution in his useful book How Buildings Learn (1994). To begin with, Brand points out, all Wright’s roofs leak. This may seem like a small thing, but since the basic purpose of shelter is to keep weather out, and it’s not actually that difficult to design a watertight roof, Wright’s failure to accomplish this fundamental requirement is not a good sign.
More generally, Wright paid close attention to the esthetic qualities of building materials, but not always to their structural strength; the results included a fair number of splendid buildings that could not hold up to normal wear and tear, or in some cases, the simple force of gravity. Thus a great many Wright buildings have had to be torn down since his time, and others linger on as museums, struggling to raise the money to meet their huge maintenance costs. Similar concerns run through every aspect of his work; the chairs he designed were beautiful, for instance, but many of them are acutely uncomfortable to sit in.
The problem with Wright’s work, essentially, is that he applied his core concept of organic architecture in too one-sided a way. The way he structured space resonates intensely with the nature of the site, the purpose of the building, and the esthetic of the materials he used; so far, so good. The difficulties arose because he handled at least two other aspects of the building process in a profoundly inorganic way. The first of these, as mentioned already, was his cavalier attitude toward the structural qualities of materials, and more generally to the “substance” side of Aristotle’s famous form/substance dichotomy. The rain that leaked through Wright’s roofs, and the dampness that pervaded his famous house Fallingwater – it had a stream running through the middle of it, complete with waterfall – and made its first owner refer to it as “Rising Mildew,” are substances as relevant to the architect as the material forming the beams that support the floors. An architecture that embraced substance in an organic way would arguably shape form according to the physical potentials and weaknesses of the relevant substances, just as Wright’s forms were shaped by the esthetics of the substances he used.
The second aspect is subtler, and the book by Stewart Brand mentioned above is perhaps the best guide to it. A building is a pattern in space and in substance, but it is also a pattern in time, following its own trajectory from the first work on the site to the last swing of the wrecking ball. Successful buildings adapt to the people who live in them or use them, just as the people adapt to the buildings; Brand argues that in this sense, buildings “learn.” Many of Wright’s buildings – though there were important exceptions – were distinctly slow learners, and some proved to be wholly unteachable. Admittedly, in Wright’s day as now, the architect’s job mostly ended when the blueprints were handed over to the builder; additionally, of course, creative minds in his milieu were expected to be prima donnas, and his income and reputation depended at least in part on playing that role. Most of today’s fashionable architects suffer from the same fixation on form over substance and process, without the benefit of Wright’s sure esthetic touch.
All this may seem far removed from the questions that have become central to this blog – the twilight of the industrial age and the birthing of constructive responses to its end – but the same three dimensions just considered – form, substance, process – apply to design in any context, from a mud hut to an alternative currency. Mud huts aside, most modern design that tries to be organic focuses, as Wright did, on organic form, and much of it neglects substance and process. Thus, for example, you get plans for “renewable” energy systems that may use sun or wind, but can’t be made or maintained without petroleum products and massive energy inputs, and power equally unsustainable machines or lifestyles.
These same concerns apply even more stringently to plans for social change. Plenty of proposals for allegedly “natural” or “ecological” societies, communities, and institutions have been floated over the last three decades or so, and most of them are natural in the same sense that Wright’s architecture is organic: they represent one person’s best shot at grasping the natural potentials of a situation. Very often, though, these proposals fail to address issues of substance or process. Substance in a social context refers, among other things, to the people who will presumably take up the new social system, but who inevitably bring to it attitudes and behavior patterns from other social contexts and the evolution of our species; it’s notorious, and also true, that most Utopian schemes would work wondrously well if human beings could just stop behaving like human beings.
Process in a social context, in turn, refers to the way that the new system is to be designed, set in motion, and adapted to meet changing needs, but there is another dimension as well: how the new system is to deal with competition from other social systems. When this has been addressed at all, it has too often been phrased in simplistic and stereotyped terms, as by insisting that lifeboat communities have plenty of guns so they can fight off the marauding hordes that feature so largely in contemporary survivalist fantasy. The history of Utopian communities in North America offers a useful corrective; most of the successful communes of the nineteenth century, for example, went under once the founding generation died off and the younger generations found communal life less appealing than the seductions of mainstream culture. The same thing could easily happen in a generation or so to any number of the communities being planned so eagerly today, since a future in which the inhabitants of such communities have no other options is probably the least likely of all the possibilities before us.
I’ve critiqued the Transition Town movement in these essays, but the value of organic process is one thing that this movement has grasped at least as well as anybody in the peak oil movement just now. Those who are still trying to impose plans based on some ideology or other on the fluid potentials of the future might learn a few things from this source. Still, it’s possible, and I think useful, to go further in the same direction. One potentially valuable way of doing so is the process of dissensus.
I’ve borrowed that term from postmodern theorist Ewa Ziarek, who introduced it in a book on ethical theory in 2001. As most of my readers likely guessed at first glance, dissensus is the opposite of consensus, and it comes into play when consensus, for one reason or another, is either impossible or a bad idea: when, that is, irreducible differences make it impossible to find any common ground for agreement on the points that matter, or when settling on any common decision would be premature.
This latter, I suggest, is a fair description of where we stand as we face the future that will follow the end of the industrial age. There’s an interesting dichotomy in our knowledge of the future: history can give us a fair idea of the type of events that we will encounter, but neither it nor anything else can give us the details. When housing prices started zooming upwards a few years back, quite a number of people compared that to other speculative bubbles and correctly predicted that an enormous crash would shake the world economy when the bubble popped – but neither they nor anyone else could have known in advance when the crash would come or what the details of its downward course would be.
The twilight of the industrial age puts us in a similar place. Looking at what’s happened to previous civilizations that overshot the limits of their resource base, it’s not hard to recognize the parallels and predict the onset of the familiar process of decline and fall. That process has some constant features, and it’s probably safe to predict that those will occur this time too: for example, mass migration is a very common consequence of the fall of civilizations, and recent warnings about tidal flows of environmental refugees in the not too distant future suggest that it may be a safe bet to assume that the same thing will happen in our future. What nobody can anticipate are the details: what will set any particular migration in motion, what its scale, route, and final destination will be, and above all what the timing will be.
Lacking those details, a consensus plan is not a good idea. If you knew today, let’s say, that the region containing your ecovillage was going to have much less rain in the future, you would make one set of choices; if you knew that the same region was going to have much more rain in the future, you would make another, and so on. If you knew that a million refugees from climate change will be coming through your town, your plans would be very different from the ones you would make if you knew that your town would be far from the migration routes. Since these things can’t be known in advance, though, whatever consensus you reach has a very real chance of being exactly the wrong choice. This is where dissensus comes to the rescue. In a situation of uncertainty, encouraging people to pursue different and even opposed options increases the likelihood that somebody will happen on the right answer.
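To put a rough number on that point, here is a minimal sketch in Python, a toy model of my own rather than anything drawn from the ecological or planning literature. It assumes, purely for illustration, that each independently chosen plan has some fixed chance p of happening to fit whatever future actually arrives; under that crude assumption, the odds that at least one of n different plans fits are 1 - (1 - p)^n, which climb rapidly as the plans multiply.

import random

# Toy model only: it assumes each plan independently "fits the future" with a
# fixed probability p, which real plans and real futures do not guarantee.
def chance_any_plan_fits(n_plans, p_fit, trials=100_000):
    """Estimate the probability that at least one of n_plans independently
    chosen plans happens to fit the future, each with probability p_fit."""
    hits = sum(
        1 for _ in range(trials)
        if any(random.random() < p_fit for _ in range(n_plans))
    )
    return hits / trials

if __name__ == "__main__":
    p = 0.1  # illustrative, assumed chance that any single plan fits
    for n in (1, 5, 20):
        simulated = chance_any_plan_fits(n, p)
        analytic = 1 - (1 - p) ** n
        print(f"{n:2d} plan(s): simulated ~{simulated:.2f}, analytic {analytic:.2f}")

In these terms a consensus is the single-plan row of the table; dissensus is the wager that twenty differing plans are easier to come by than certainty about which one of them is right.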
Dissensus, it deserves to be said, is not simply a lack of consensus. Like consensus itself, it has its own methods and process, its own values and style; the Thelonious Monk CD playing in my study as I type these words might also serve as a reminder that where dissensus is encouraged, and individuals pursue their own visions rather than submitting to a socially based consensus, the results can include dazzling creativity. Frank Lloyd Wright, with whom I began this essay, was a master of dissensus; great artists usually are. Yet the greatest master of dissensus is arguably Nature itself.
Those first inch-long vertebrates who darted about in shallow seas half a billion years ago, after all, did not come to some sort of genetic consensus about where evolution was going to take them, nor did the evolutionary process itself push them in one direction. Some of their offspring became fish, some amphibians, some reptiles, some birds, and some mammals, and a few of the latter are either typing this essay or reading it. Evolution is dissensus in action, the outward pressure of genetic diversification running up against the limits of environment and, now and then, pushing through to some new adaptation: the wings of bats, the opposable thumbs of primates, the cultural evolution of human beings. As we enter a future of new limits and unpredictable opportunities, this is arguably the kind of organic process we need most.
Wednesday, December 3, 2008
Taking Evolution Seriously
Back in 1904, sociologist Max Weber proposed that the modern period was witnessing “the disenchantment of the world” – a process in which traditional mythic ideas that wove meaning into human experience were being replaced by the alienating and dehumanizing worldview of materialist science. There’s some truth to Weber’s thesis, but I’m not sure he anticipated the inevitable backlash: the Procrustean stretching and lopping of scientific ideas in the popular imagination that has turned many of them into substitute myths.
One example that has been much on my mind of late is the way the theory of evolution has been manhandled into a surrogate mythology. The reason it’s been on my mind is simple enough: whenever I discuss peak oil at a lecture, book signing, or some other public setting, it’s a safe bet that someone will raise a hand and ask what I think about the possibility that the approaching crisis is part of our transition to a new evolutionary level. I am always left wondering what to say in response, because this sort of question is almost always rooted in the notion that evolution is a linear movement that leads onward and upward through a series of distinct stages or levels – and this notion is a pretty fair misstatement of the way evolution takes place in nature.
Few things in the history of ideas are quite so interesting as the way that new discoveries get harnessed in the service of old obsessions. When X-rays were first detected in 1895, for example, one of the first results was panic over the possibility that the new rays might make it possible to see through clothing; the New Jersey state legislature actually debated a bill to ban the use of X-rays in opera glasses. Wildly inaccurate as it was, this notion was rooted in profound fears about sexuality, and so it took many decades to dispel – when I was a child, ads in comic books still claimed to sell “X-ray glasses” that would let you see people naked.
Something not that different happened to the theory of evolution, and thus nearly all of today’s popular notions about evolution are shrapnel from the head-on collision between Darwin’s theory and the obsessions of the era in which that theory emerged. Social class rather than sex was the driving force here; as religious justifications for the English caste system faltered, the manufacture of scientific justifications for social hierarchy became a growth industry, and by the time the ink was dry on the first copies of The Origin of Species, evolution was already being drafted into service in this dubious cause. The resulting belief system was very nearly a parody of George Orwell’s Animal Farm in advance – all living things evolve, but some are more evolved than others.
Now of course this is nonsense. A human being, a gecko, a dandelion, and a single-celled blue-green alga are all equally evolved – that is, they have all been shaped to the same degree by the pressures of their environment, and their ancestors have all undergone an equal amount of natural selection. We think of humans as “more evolved” than blue-green algae because Victorian Social Darwinists such as Herbert Spencer engaged in conceptual sleight of hand, transforming the amorphous outward surge of life toward available niches into a ladder of social status, with English gentlemen at the top level and everybody and everything else slotted into place further down. The concept of evolutionary stages or levels was essential to this conjurer’s act, since it allowed social barriers between classes to be mapped onto the biological world.
In nature, though, evolution has no levels; it just has adaptations. There is no straight line of progress along which living things can be ranked. Instead, evolutionary lineages splay outward like the branches of an unruly shrub. Sometimes those branches take unexpected turns, but these evolutionary breakthroughs can no more be ranked in an ascending hierarchy than organisms can. They move outward into new niches, rather than upward to some imagined goal. There are any number of examples from nature; the one I want to use here is the evolution of bats.
The ancestors of the first bats were shrewlike, insect-eating nocturnal mammals, related to early primates, who scampered through the forest canopies of the Paleocene, around 60 million years ago. For animals that live in trees, the risk of falling is a constant source of evolutionary pressure, and adaptations that will help manage that danger will likely spread through a population; that’s how sloths got their claws, New World monkeys got prehensile tails, and many animals of past and present got extra skin that functions as a parachute. If the extra skin bridges the gap between the forelegs and the hindlegs, the most common adaptation, you get the ability to glide, like flying squirrels, colugos, and the like; you’ve got a viable adaptation, and there you stop.
If the extra skin is mostly on and around the forelimbs, though, you’ve just jumped through the door into a new world, because you can control your glide much more precisely, and you can put muscle into the movements – in other words, you can begin to fly. Once you can do better than a controlled fall, furthermore, the trillions of tasty insects flitting through the forest air are on your menu, and the better you can fly, the more you can catch. The result is ferocious evolutionary pressure toward improved flight skills, and in a few hundred thousand generations, you’ve got agile fliers. That’s what happened to bats, as it happened some 200 million years earlier to the ancestors of the pterodactyls.
By 55 million years ago, bats almost identical to today’s insect-eating bats were darting through the Eocene skies. Sonar seems to have taken a while to evolve, and some offshoots of the family – the big fruit bats and flying foxes, for example – took even longer, but the basic adaptations were set and, to the discomfiture of countless generations of mosquitoes and moths, have remained ever since. As evolutionary breakthroughs go, the leap into flight was a massive success; bats are the second most numerous of mammal orders, exceeded only by the rodents, but it’s impossible to fit the breakthrough that created them into any linear scheme.
Applying an ecological concept to human social systems always takes tinkering, but there are good reasons to accept the idea that societies are capable of evolution; like populations of other living things, human communities face pressures from their environments, and adapt or perish in response. Here again, though, the evolutionary process moves outward in all directions rather than ascending an imaginary hierarchy of levels. Hunter-gatherer systems seem to have been the original form of human society, but other forms branched off as adaptations opened doors to possibilities that were likely as appealing at the time as the bug-filled night sky must have been to the first clumsily flapping proto-bats.
Where large herbivores could be tamed, therefore, nomadic herding societies came into being; where many food plants could be raised in intensive gardens, tribal horticultural societies were born; where extensive fields of seed-bearing grasses offered the best option for survival, agrarian societies took shape. As it turned out, grains could be bred to yield large surpluses that could be transported and stored, and so the agrarian system opened the door to large-scale divisions of labor and the rise of cities. These in turn made complex material culture possible, and ultimately drove the creation of the machines that broke into the Earth’s stockpiles of fossil carbon and gave the modern world its three centuries of exuberance.
Thus industrial society is not “more evolved” than other societies, nor, for that matter, “less evolved.” It was simply the most successful adaptation to the evolutionary pressures that opened up once fossil fuel energy had been tapped, and it outcompeted other systems in something of the same way that an invasive exotic outcompetes less robust native organisms. As fossil fuels deplete and climate change unfolds, the balance of evolutionary pressures is shifting, and as the new reality of limits takes hold, selection will favor those systems that are better adapted to the new ecological constraints of global climate instability, energy scarcity, and resource shortage.
The fact that those new systems are better adapted to new realities, however, does not free them from the human condition. This is where the rubber meets the road, because the people who ask me about the prospects of a new evolutionary level are rarely asking whether the societies of the future will be better adapted to an environment of resource scarcity. They are generally asking whether societies on the other side of an imagined evolutionary leap will be free from problems such as poverty, war, and environmental destruction.
The best way to assess this, it seems to me, is to consider what happened the last time human social evolution yielded a breakthrough to a new way of living in the world: that is, the rise of industrial societies beginning around 1750. Agrarian societies suffered from poverty, war, and environmental destruction, and so did all the other “evolutionary levels” or, rather, adaptations, right back to the hunter-gatherers. Many hunter-gatherers among the First Nations in North America, for example, had sharp social inequalities, a busy slave trade, and a long history of fierce tribal wars. Their ecological relationships were less problematic, since those native societies that failed to find a balance with nature, such as the Mound Builders and the people of Chaco Canyon, collapsed long before 1492.
Just as bats faced the same experiences of hunger, social squabbles, and the unfriendly attentions of predators as their ancestors, the societies that took up industrialism experienced poverty, war, and environmental destruction just like earlier societies, and it’s hard to think of a good reason why the new societies that emerge in response to the evolutionary pressures of the deindustrial age should be exempt from the same troubles. Evolutionary adaptations can make things easier for living things – plenty of predators in the Eocene must have been discomfited when bats evolved the ability to flutter away to safety – but no living thing is exempt from the balances of the natural world. It’s a mistake, in other words, to see evolution as a movement toward Utopia.
When I’ve tried to explain any of the above in public, though, someone – and it’s not always the same someone who asked the original question – usually insists that this may be how biological evolution works, but spiritual evolution is different. Some of my readers just now may have come up with the same objection. All I can say in response is that I know of no great spiritual tradition that would approve the claim that people living extravagant lifestyles of wealth and privilege – this is, after all, a fair description of life in modern industrial societies from the standpoint of the rest of human experience – can expect a sudden leap to an even more comfortable and convenient life, just because they happen to want it, and would find it a useful way to avoid dealing with the consequences of their own shortsighted choices.
This may seem unduly harsh. Still, the notion that an evolutionary leap will extract us from the mess we’ve made for ourselves is as much a distortion of the realities of the evolutionary process as any Social Darwinist screed. If people want to believe that a miracle will rescue them from the predicament of industrial society, they have every right to their faith, but it would confuse communication a little less to call it a miracle, instead of trying to wrap it in the borrowed prestige of Darwin’s theory. Perhaps it’s the bias instilled by my own Druid faith, furthermore, but it seems to me that if we are going to use evolution as a metaphor, we need to start by taking evolution seriously, rather than imposing our own fantasies on the very different stories that nature is telling us.